Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
jeroenvandera
Contributor

So, another week gone. This week the course was set up quite differently from the first: where the first week was very theory-oriented, explaining all the architectures, this week had a large hands-on part. I spent two nights doing all the practical exercises.

Basically, in the first week you get an explanation of where all the tools in the shed are and what they are for.

I’m more the kind of guy who just stumbles into the shed, doesn’t even turn the light on, and starts rummaging and trying things until I get a grasp of them. And what happened to Richie? Oh well, I guess he will do other parts.


(Very) roughly, you could say that the course this week was divided into two parts. The first was working with the .hdbdd file, where you insert all the definitions of your database artifacts. You first define your types and structures, and using those types you can create tables and views.

In your .hdbdd file you use CDS syntax, something you can also use in HANA/River and ABAP. That’s nice, as the things I learn now are also usable in a wider area. The thing is, though, that as an acronym .hdbdd leaves something to be desired. I was impressed that Thomas was able to say it without stumbling over 95% of the time. Behind my desk at home I tried it, but my percentage was way lower.
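To give a rough idea of what such a file looks like, here is a minimal sketch of CDS in a .hdbdd file, with a reusable type, a structure, and a table built from them. The namespace, schema, and all names are my own placeholders, not from the course:

```
// Hypothetical .hdbdd sketch; namespace, schema and names are illustrative
namespace training.week2;

@Schema: 'COURSE'
context demo {

    // A reusable scalar type, e.g. for business keys
    type BusinessKey : String(10);

    // A reusable structure
    type Address {
        street  : String(60);
        city    : String(40);
        country : String(2);
    };

    // A table (entity) built from the types above
    entity Customer {
        key id  : BusinessKey;
        name    : String(80);
        address : Address;
    };
};
```

Changing `BusinessKey` in one place would then propagate to every entity that uses it, which is exactly where the activation issue described further down comes in.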


As someone with a lot of BW experience, I was thinking about those types and structures and how you might be able to use them as a sort of limited InfoObjects. First, though, I would like the export/import functionality for tables to be implemented.

Maybe I should explain that some more. When you update types or structures, all tables that use that type/structure will also be updated. This can lead to activation errors if the table is holding data. Thomas said that some automatic export/import functionality is being worked on: when you activate your table, the data will be exported, the update takes place, and then the data gets imported again.
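Until that automation arrives, a manual workaround along these lines should be possible with plain HANA SQL. The schema, table name, and path are placeholders of mine, and you would need the appropriate export/import privileges on your system:

```sql
-- Hypothetical manual workaround, not the automated feature mentioned above.
-- 1. Export the data before changing the type/structure
EXPORT "COURSE"."demo.Customer" AS CSV INTO '/tmp/customer_backup' WITH REPLACE;

-- 2. Activate the changed .hdbdd file (the table structure is updated)

-- 3. Re-import the data afterwards
IMPORT "COURSE"."demo.Customer" AS CSV FROM '/tmp/customer_backup' WITH REPLACE;
```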


That sounds really handy, although you might want to be careful with changing your types and structures. A much-used type (business key, anyone?) that resides in large tables could create an avalanche of runtime actions.

In general, though, I like CDS, but I would like it to be SQL-complete.

For the models, we had an impressive example with lots of calculation views (graphical and scripted), a decision table, and analytical/attribute views. In the scripted calculation view we used the procedure that was generated through the creation of the decision table.
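As a sketch of how that combination can look: when a decision table is activated, a procedure is generated for it, and a scripted calculation view can call that procedure in its script body. The package, procedure, and column names below are my own illustrative placeholders, not the course's actual model:

```sql
-- Hypothetical scripted calculation view body calling a
-- decision-table-generated procedure; all names are illustrative.
BEGIN
    -- Apply the decision table's rules (generated procedure)
    CALL "_SYS_BIC"."training.week2/DT_DISCOUNT"();

    -- Expose the resulting data as the view's output
    var_out = SELECT "ID", "NAME", "DISCOUNT"
              FROM "COURSE"."demo.Customer";
END
```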


I was wondering, though, whether you could insert overlapping conditions in the decision table and whether you could generate multiple outcomes for one record. Does anybody know, or should I just try it out? I suppose it is the latter.

In my notes I started this segment bravely by stating I would type everything and not cut and paste, but due to the size of the exercises I had to tone that down a bit.

Additionally, I had to revisit earlier parts I had skipped, as I had to create some roles and authorize the system user so the record create & test app could run.
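The role setup was roughly along these lines; the role, schema, and user names are placeholders of mine, not what the course actually used:

```sql
-- Hypothetical role setup; role, schema and user names are placeholders.
CREATE ROLE "COURSE_APP_ROLE";

-- Allow the app to read and write the course schema
GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA "COURSE" TO "COURSE_APP_ROLE";

-- Authorize the user that runs the record create & test app
GRANT "COURSE_APP_ROLE" TO "SYSTEM";
```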

All in all a good week, a lot more time was spent actually working in the HANA system.

Next week we will go into more advanced SQLScript. Let’s see how that continues from last year’s course.
