SAP CAP Recent Enhancements the HANA Dev Should Know About
If you keep up with the Release Notes section of the SAP Cloud Application Programming Model (or watch the SAP Developer News weekly), you’ve seen just how frequently new features are delivered for CAP. Lately I’ve noticed a trend of long-desired features for database developers (and HANA developers in particular) being delivered. If you, like me, came to know CAP development as the evolution of the HANA design-time modeling approach, then you will find this little list of recent highlights interesting. This is by no means a complete look at all recently added features, just the ones that really caught my eye from a database developer’s standpoint.
Note: For all the examples I’m showing here, the source code is available in this sample repository: cloud-cap-hana-swapi/cap at main · SAP-samples/cloud-cap-hana-swapi (github.com)
If you work with the tables and views generated by CAP modeling directly in database tools (like HANA Database Explorer), then you might wish that comments in the CAP CDS model would be propagated into the underlying metadata. Now we have that option. Doc comments in the cds models will be placed into the database objects.
But first, an especially important note for those who want to use this new feature: it’s not enabled by default. You need to configure the CDS compiler to turn this optional feature on. This can be done from the package.json file in your project. In the cds section of package.json, add a “cdsc” section; this is where parameters for the CDS Compiler belong. Then add the docs: true option to this section.
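In my project the relevant part of package.json looks roughly like this (a minimal sketch; your cds section will typically contain other settings alongside it):

```json
{
  "cds": {
    "cdsc": {
      "docs": true
    }
  }
}
```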
Then simply add your Doc Comments to the model definition. Upon the next build/deploy, you will see these transformed into COMMENTs in the target object definitions.
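Doc Comments use the /** … */ syntax placed directly above the definition they describe. A small sketch in the spirit of the Star Wars sample model (entity and element names here are illustrative, not copied from the repository):

```cds
using { cuid } from '@sap/cds/common';

/** All the People/Characters of the Star Wars universe */
entity People : cuid {
  /** The name of this person or character */
  name : String(128);
}
```

With docs enabled, the entity-level comment lands on the generated table and the element-level comment on the corresponding column.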
Schema Evolution is another exciting feature, finally made generally available for HANA database targets in the November 2021 release of SAP CAP. Schema Evolution uses hdbmigrationtable artifacts to avoid costly drop and shadow-table operations when adjusting target tables. Instead, CAP tracks your changes and automatically records a migration script in the hdbmigrationtable artifact. This allows fast, non-disruptive, incremental changes to the target artifact.
This help documentation details all the features of Schema Evolution, but let’s look at a small example in the Star Wars code sample.
First, another stop in the package.json file, where you can control the overall settings for Schema Evolution via the cds.hana.journal section of the configuration. I’ve set my project to not enable dropping; this allows manual resolution of any conflicts that could result in data loss. I also have change-mode set to alter, which likewise ensures no data loss.
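The configuration described above looks roughly like this in package.json (a minimal sketch showing just the journal settings):

```json
{
  "cds": {
    "hana": {
      "journal": {
        "enable-drop": false,
        "change-mode": "alter"
      }
    }
  }
}
```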
For any entity that we want this Schema Migration activated, we need only add the @cds.persistence.journal annotation. Any compile/build after adding this annotation will automatically track changes and journal these changes into migration steps in the target hdbmigrationtable artifact.
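Applying the annotation is a one-liner on the entity. A sketch with an illustrative entity (names are placeholders, not from the sample repository):

```cds
using { cuid } from '@sap/cds/common';

/* opt this entity in to Schema Evolution via hdbmigrationtable */
@cds.persistence.journal
entity Planets : cuid {
  name       : String(128);
  population : Integer64;
}
```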
At deploy time HDI will compare the current version to the target version defined in the hdbmigrationtable and then only apply the migration steps necessary for the delta between the two versions. Yet the definition in the hdbmigrationtable isn’t version specific. It contains all possible changes over the lifetime of the target artifact.
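To make that concrete, a sketch of what such an hdbmigrationtable artifact can look like after one change (table and column names are illustrative): the header carries the full definition at the current version, and each migration block holds the delta from the previous version.

```sql
== version=2
COLUMN TABLE MY_PLANETS (
  ID NVARCHAR(36) NOT NULL,
  NAME NVARCHAR(128),
  POPULATION BIGINT,
  PRIMARY KEY(ID)
)

== migration=2
ALTER TABLE MY_PLANETS ADD (POPULATION BIGINT);
```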
Another new, powerful database feature added to CAP in the December 2021 release was the support for Window Functions. From the HANA online help: “Window functions allow you to perform analytic operations over a set of input rows.” This can assist with complex aggregations and rankings across multiple groups of data in a single entity.
Window Functions and the Window Specification – SAP Help Portal
No special configuration is needed to access this new feature; it’s simply newly supported syntax. We can add the over/partition by syntax to a view column definition.
I don’t have a lot of numeric-based data in this sample model, but with this view I can easily rank which home world has the most Star Wars characters.
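A sketch of such a view (assuming a People entity with a homeworld association, in the spirit of the sample model; names and exact projections are illustrative, and as noted below the compiler won’t semantically check the window expression):

```cds
entity HomeWorldRanking as select from People {
  homeworld.name as homeworld,
  count(*)       as characters : Integer,
  rank() over (order by count(*) desc) as charRank : Integer
} group by homeworld.name;
```

The over clause is passed through to the database, so the same pattern works with partition by as well, e.g. ranking characters within each home world.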
One particularly important note directly from the online help that I think is worth repeating here:
No semantic checks are performed for the window functions, they are simply translated to SQL. Consult the documentation of your database for more information about the supported window functions.
I experienced this myself while testing. A few times I was able to produce Window Function syntax that resulted in a successful cds build but then failed when deployed to the HANA database.
This last feature is still very much in beta, but it’s the one I find most interesting. Within the change log for CAP in the December 2021 notes, you will find a reference to a beta feature named assert_integrity.
During my time in SAP HANA Product Management, this was probably the most requested feature in regard to CAP (and HDBCDS before it). Instead of only enforcing referential integrity at the application logic level, the framework could generate database constraints and foreign key checks for any associations. And that’s exactly what this beta feature is beginning to deliver in SAP CAP. As this is a beta feature, it’s subject to change and should not be used in production. But let’s look at the current state of the implementation as described in the change log.
Again, we need some configuration in package.json to activate this feature, this time in the cds.features section, where the assert_integrity setting controls where the integrity checks will be performed. Switching it to db is all that is needed for database constraints to be generated from our model.
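The switch looks roughly like this in package.json (remember this is a beta feature and the setting is subject to change):

```json
{
  "cds": {
    "features": {
      "assert_integrity": "db"
    }
  }
}
```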
With just that configuration change, any 1:1 Association or Composition will now automatically generate hdbconstraint database artifacts as well when performing a compile/build.
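A sketch of the kind of .hdbconstraint artifact that gets generated for an association (constraint, table, and column names here are illustrative; the exact output depends on your model):

```sql
CONSTRAINT "C_PEOPLE_HOMEWORLD" ON "MY_PEOPLE"
  FOREIGN KEY ("HOMEWORLD_ID")
  REFERENCES "MY_PLANETS" ("ID")
  ON UPDATE RESTRICT
  ON DELETE RESTRICT
  VALIDATED ENFORCED
```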
One more cool thing about the database constraints is the cascading delete rule for Compositions. Because the existential dependency of a composed entity on its parent is detected, the ON DELETE rule of the generated constraint changes accordingly. This will significantly boost the performance of delete cascades. Of course, this also works out of the box with compositions of aspects.
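For the composition case, the generated constraint can look roughly like this sketch, with the delete rule switched to cascade (names are illustrative):

```sql
CONSTRAINT "C_ORDERITEMS_PARENT" ON "MY_ORDERITEMS"
  FOREIGN KEY ("PARENT_ID")
  REFERENCES "MY_ORDERS" ("ID")
  ON UPDATE RESTRICT
  ON DELETE CASCADE
  VALIDATED ENFORCED
```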