VolkerSaggau
Product and Topic Expert

Problem:

Using CAP CDS (Cloud Application Programming - Core Data Services), one would expect everything to work with HANA Cloud out of the box. Unfortunately, that is not the case, but a small switch makes things better. In the beginning, CAP CDS for HANA generated HDBCDS artifacts. At the time, before HANA Cloud existed, HDBCDS was also seen as the future format. Today we know better: HANA Cloud does not support HDBCDS, and this leads to compatibility issues. On HANA 2 and the HANA Service this was not a problem, but now customers want to migrate to HANA Cloud.

Self-Service Migration helps you migrate the database content from the source systems to HANA Cloud. But you also want to migrate your development artifacts (also called design-time objects) into the new world, so you can continue and extend work you started several years ago.

The Self-Service Migration assistant indicates errors in HDI containers that use HDBCDS artifacts. If you try to migrate them to HANA Cloud you will run into problems, since HANA Cloud does not have a build plug-in for HDBCDS; it is included in the full list of unsupported plug-ins.

Solutions:


You have to convert your CAP project from HDBCDS generation to HDBTABLE generation. This is "just" a setting for CDS in the project: the CDS compiler is smart enough to generate the same outcome in the database either way. HDBCDS is more of a logical data model than a pure data model, but the final result as a so-called run-time object in the database is the same for CAP applications.
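To illustrate what that means, here is a small sketch (the entity, namespace, and generated file names are purely illustrative, not taken from a real project):

// db/schema.cds - a minimal CDS model
namespace my.bookshop;

entity Books {
  key ID    : Integer;
      title : String(100);
}

With deploy-format "hdbcds" the build generates an .hdbcds design-time file for this model; with "hdbtable" it generates roughly a my.bookshop.Books.hdbtable file instead. In both cases the run-time object in the HDI container is the same column table.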

Changes in .cdsrc.json (the Java-style solution)

"requires": {
"db": {
"kind": "hana"
}
}


"hana": {
"deploy-format": "hdbtable" 

}


Alternatively:


Changes in package.json at project level (the Node.js-style solution)

 

"cds": {
"hana":{
"deploy-format": "hdbtable"
},
"requires": {
"db": {
"kind": "hana",
}
}
}
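For orientation, this is roughly where that section sits in a complete package.json (the project name and dependency version below are placeholders, not part of the original example):

{
  "name": "my-cap-project",
  "version": "1.0.0",
  "dependencies": {
    "@sap/cds": "^6"
  },
  "cds": {
    "hana": {
      "deploy-format": "hdbtable"
    },
    "requires": {
      "db": {
        "kind": "hana"
      }
    }
  }
}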


And the winner is? The package.json is the stronger definition. In CAP CDS it does not really matter where you put the directive, but it has to be unambiguous, and package.json overrides the other configuration files (if it exists). My personal recommendation is to use package.json; that way you always know what you get. The CAP CDS generator will now produce the proper artifacts that also run in HANA Cloud.
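If you want to double-check which value actually wins and what the build produces, a quick check could look like this (assuming the cds CLI from @sap/cds-dk; the gen/db/src/gen path is the usual default for Node.js projects and may differ in your setup):

# Show the effective, merged configuration value
cds env get hana.deploy-format
# expected output: hdbtable

# Rebuild and inspect the generated database artifacts
cds build --production
ls gen/db/src/gen
# should now list *.hdbtable / *.hdbview files instead of *.hdbcds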

Once you have switched the CAP generator from HDBCDS format to HDBTABLE format, one thing remains: clean up!

HDI deploys all the artifacts into the database and typically calculates the difference between the previous and the current deployment.

Since you previously deployed the HDBCDS artifacts, they are still present in your design-time storage.

Within Web IDE/BAS this is not a problem, since they automatically attach the option

--auto-undeploy

(see the @sap/hdi-deploy documentation on npmjs and search for "Delta Deployment and Undeploy Allowlist")

to the deploy command. But xs/cf deploy does not do that.
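For reference, if you invoke the HDI deployer yourself, the same flag could be set explicitly in the db module's package.json. This is only a sketch following the standard @sap/hdi-deploy setup; the version number is a placeholder:

{
  "name": "deploy",
  "dependencies": {
    "@sap/hdi-deploy": "^4"
  },
  "scripts": {
    "start": "node node_modules/@sap/hdi-deploy/deploy.js --auto-undeploy"
  }
}

Keep in mind that --auto-undeploy drops everything that disappeared from the design-time model, which is why the allowlist approach below is the safer choice.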

It is therefore strongly recommended to maintain an undeploy.json file in your db module:

[ "src/**/*.hdbcds" ]


 

Why should you do this?

1.) You may get errors due to double definitions of objects (HDBCDS and HDBTABLE), even if you do not migrate.

2.) The Self-Service Migration will warn you in the pre-migration checker. Cleaning up eliminates these warnings and ensures a safe migration.

 

Summary:


Changing the CAP CDS generation directives gives you artifacts that are also deployable to HANA Cloud.

Do not forget to clean up your code with the "undeploy.json" file. This prevents errors in unattended environments such as batch jobs, CTS+, or CI/CD pipelines.
