In this blog, we will collect the various smaller blogs that detail all the new developer-related features in SAP HANA SPS 11.  This will be a “living” document, updated as new blogs are released.


Overview

In this blog series we are going to describe a large number of new features in both the underlying HANA infrastructure and, in particular, in the custom development aspects of HANA native development.  But before we get into describing all the new features, it’s important to note that these innovations are delivered alongside the existing functionality.

We haven’t removed or disabled any of the current architecture.  All of your custom development objects remain exactly where they are today and will continue to function as they do already. The current XS Engine remains a part of the HANA infrastructure, although it is now renamed XS Classic to distinguish it from the new capabilities delivered as part of XS Advanced. Likewise, the HANA Repository remains in place even as we move to Git/GitHub as the future design-time/source-code repository. Eventually these older capabilities will be removed from HANA, but that point hasn’t been decided yet. SAP won’t remove them until we see a critical mass of customers moving their development objects to the new capabilities we are about to describe here.

Therefore customers can upgrade with confidence to SPS 11 without fear that the new innovations will somehow disrupt their existing applications. Customers can decide how and when they want to begin moving applications to the new capabilities, and only do so once they are comfortable with everything involved.  In the meantime, everything they have continues to run exactly as it does today.

One final note: in addition to this blog series, we also have recordings of the lectures on this topic from SAP TechEd 2015.  Much of the information and announcements are the same between those lectures and this blog series.  So for those of you who prefer videos to reading, we have you covered as well.


Application Server

In this blog we will look in detail at the addition of SAP HANA extended application services, advanced model (XS Advanced, or XSA for short).

SAP HANA SPS 11: New Developer Features; XS Advanced


In this blog we explore Node.js as the new JavaScript programming environment for XS Advanced.

SAP HANA SPS 11: New Developer Features; Node.js


Development Tools

Although the SAP Web IDE for SAP HANA isn’t shipped initially with SAP HANA SPS 11, developers can already start working with XS Advanced using third-party editors and command-line tools.  This blog will look at the various options for starting development on XS Advanced now.

SAP HANA SPS 11: New Developer Features; Tooling – Getting Started


In this blog we introduce SAP Web IDE for SAP HANA. This is additional tooling for SPS 11, shipping as of March 29, 2016.

SAP HANA SPS 11: New Developer Features; SAP Web IDE for SAP HANA


Application Lifecycle

This blog introduces the new HANA Deployment Infrastructure (HDI) and the changes it brings to doing database level development.

SAP HANA SPS 11: New Developer Features; HDI


Database

In this blog we will explore new syntax and features for using CDS at the HANA database level.

SAP HANA SPS 11: New Developer Features; HANA Core Data Services


This blog demonstrates the new features in SQLScript in HANA SPS 11.

New SQLScript Features in SAP HANA 1.0 SPS 11


Source Code

I’ve published the source code of a few sample projects on GitHub, which you can access here:

  • https://github.com/I809764/DEV162A
    • This project is an adapted version of the DEV162 TechEd workshop.  The original workshop was given on standalone Node.js on a local developer’s laptop. This new version is adapted to run as microservices on XSA itself. The console output had to be adapted to use WebSockets and run behind the App Router.
  • https://github.com/I809764/DEV602
    • An attempt to create a simple example project, yet one that contains many parts (database, node/xsjs services, and SAPUI5) as well as realistic security. It includes security scopes, role collections, and database roles with structured privileges.
  • https://github.com/I809764/xsa-workshop
    • These are the workshop utilities we use within SAP HANA PM to teach HANA development workshops. If you’ve ever taken one of our TechEd sessions or our openSAP courses, you’ve used the Code Template Website. This is a new version running completely on XSA. It no longer uses the HANA Repository to store templates. The template service is now written in Node.js and stores the templates on the file system as part of the deployed application content.


Closing

In this blog we will look at when the functionality described here will be available and what the SAP recommendations are for adopting it.

SAP HANA SPS 11: New Developer Features; Closing


18 Comments


  1. Anindita Bhowmik

    Hi Thomas,

    It is recommended that SAP HANA be located in a secure network zone with minimal connections to other zones. But the “HANA XS” component on the same HANA box exposes HANA objects over a REST-based mechanism.

    Hence, is it possible to deploy/install the “HANA XS” component on a separate server, i.e. other than the HANA database server?

    Or is using decoupled application servers that connect to SAP HANA via data sources the only way to add a custom authorization layer for secure access to HANA XS services?

    Regards,

    Anindita Bhowmik

    1. Thomas Jung Post author

      >Hence, is it possible to deploy/install the “HANA XS” component on a separate server, i.e. other than the HANA database server?

      Yes, this is exactly one of the new architectural features of XS Advanced in SPS 11.

      1. Anindita Bhowmik

           Hi Thomas,

        We are on SAP HANA version SPS 09, revision 97. In this version of SAP HANA, can we introduce a web server tier to intercept the REST calls and, if possible, add customized security logic, which would eventually “redirect” the request to the HANA XS layer residing in the DB tier?  Can you comment on whether such a 2/3-tier setup would be optimal with SAP HANA SPS 09?

        Regards,

        Anindita Bhowmik

  2. Nehal Fonseca

    Hi Thomas,

    Just curious. With the launch of XS Advanced, will XSODATA services be considered part of the classic XSJS lot? Will additional innovation happen in this area? Or will the new NodeJS/Java/C++ approach be preferred in building applications?

    Nehal.

    1. Thomas Jung Post author

      We have a new XSODATA implementation in node.js as part of the XSJS compatibility module. We also have an OData implementation in the Java runtime based upon Apache Olingo.

      >Will additional innovation happen in this area?

      Yes but it is highly likely that only the XSA version of XSODATA will receive enhancements. Don’t expect any further development to be done to the XSC version of XSODATA.

      >Or will the new NodeJS/Java/C++ approach be preferred in building applications?

      Yes, although we are at a point where you might not be able to start with XSA fully yet. There are still feature gaps.  But as we close these gaps, customers and partners should begin doing new development only in XSA and start converting their XSC applications to XSA. We aren’t going to take XSC away overnight, but it will go away at some point and be completely replaced by XSA.

  3. J. Jansen

    Hi Thomas,

    Excellent presentation as always. Is there any way that SP11 will be available on a trial basis, like in the Cloud Appliance Library any time soon? At our company, there is a clan of Node developers who would love to get their hands dirty on the new capabilities of XSA in combination with the power of HANA.

    Kind regards,

    Jeroen

    1. Thomas Jung Post author

      We are working to get the HANA Developer Edition updated to SPS 11. I was just testing the internal version the other day. I can’t say what the timeline is for the release, but we are getting close.

  4. Lochner Louw

    Hi Thomas,

    I’m curious whether deep inserts will be supported in SPS 11 via XSODATA. I know from previous replies in the forum that a lot of people posted sub-items via link write operations, and some, like me, tried and failed to POST on the navigation links for 1:n associations.

    Thanks for the excellent overview of SPS 11 Features

    Kind Regards,

    Lochner

    1. Thomas Jung Post author

      Deep insert is already supported via Link operations as you described.  What additional feature are you wanting here?  Or is it just that you had some problem using Link operations?  Did you enter a ticket for your problem?

      1. Lochner Louw

        I haven’t opened a ticket yet or a thread on the forum as it is something that I have been struggling to do since last week Thursday. I’ll open a thread with the full details today when I get to work.

        What I would like to do is to post everything in one go instead of doing a batch call or several consecutive calls.

        From what I understand, with the link operation the principal and dependent need to exist before you can post the link write operation (correct me if I am wrong).

        My example is where I want to create an entity called Work with associations called Characteristics and Dates. I also want the whole transaction to fail if one of the entries in “Characteristics” is invalid. The JSON payload is something similar to:

        {
          "id": "1",
          "type": "CLM01",
          "description": "Claim",
          "Characteristics": [
            { "type": "ActionId", "value": "ID123" },
            { "type": "Keyword", "value": "OVER21" }
          ],
          "Dates": [ { "dateType": "CREATION", "dateValue": "\/Date(123445678)\/" } ]
        }

        1. Thomas Jung Post author

          >From what I understand, with the link operation the principal and dependent need to exist before you can post the link write operation (correct me if I am wrong).

          No, not really. That’s the whole point of the batch operation. The creation of the principal, the creation of the dependent, and the link operation are all sent to the server in one batch. You use placeholders for the keys of the principal and dependent since they won’t be generated until the server side. However, the server side knows it’s in a batch operation and will pass those keys into the link operation, replacing the placeholders. Have you had a look at the sample implementation odataDeep in the SHINE content?

          >The JSON payload is something similar to:

          We do not support nested entities in a single operation.  You must use the batch concept.
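As a rough sketch of the batch pattern described above, the following Node.js snippet builds an OData V2 $batch body in which a single changeset creates the Work entity (Content-ID 1) and posts each Characteristic against the $1 placeholder for the not-yet-generated key. The entity set names come from the payload above, but the exact URL shapes and headers an XSODATA service expects are assumptions for illustration only.

```javascript
// Sketch only: builds a multipart $batch body with one changeset. The $1
// Content-ID reference stands in for the server-generated Work key; entity
// set names and request paths are illustrative assumptions.
function buildDeepInsertBatch(work, characteristics) {
  const batchBoundary = 'batch_1';
  const changesetBoundary = 'changeset_1';
  const lines = [];
  let contentId = 1;

  // Principal entity: gets Content-ID 1 so later requests can reference it.
  lines.push(
    `--${changesetBoundary}`,
    'Content-Type: application/http',
    'Content-Transfer-Encoding: binary',
    `Content-ID: ${contentId}`,
    '',
    'POST Work HTTP/1.1',
    'Content-Type: application/json',
    '',
    JSON.stringify(work)
  );

  // Dependents: posted against the $1 placeholder, which the server resolves
  // to the generated Work key while processing the changeset.
  for (const characteristic of characteristics) {
    contentId += 1;
    lines.push(
      `--${changesetBoundary}`,
      'Content-Type: application/http',
      'Content-Transfer-Encoding: binary',
      `Content-ID: ${contentId}`,
      '',
      'POST $1/Characteristics HTTP/1.1',
      'Content-Type: application/json',
      '',
      JSON.stringify(characteristic)
    );
  }
  lines.push(`--${changesetBoundary}--`);

  return [
    `--${batchBoundary}`,
    `Content-Type: multipart/mixed; boundary=${changesetBoundary}`,
    '',
    lines.join('\r\n'),
    `--${batchBoundary}--`,
    ''
  ].join('\r\n');
}

const body = buildDeepInsertBatch(
  { id: '1', type: 'CLM01', description: 'Claim' },
  [{ type: 'ActionId', value: 'ID123' }, { type: 'Keyword', value: 'OVER21' }]
);
```

Because everything travels in one changeset, the server can roll the whole creation back if any Characteristic is invalid, which matches the all-or-nothing behavior asked about above.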

  5. Denica Baeva

    Hi Thomas,

    I am trying to find out more about code migration on SPS 11, but so far I am not able to answer the following questions:

    — Is the Git integration available with more enterprise flavors like Git Stash?

    — Is it still recommended to move Graphical Calculation Views with the existing Native Hana Transport or using Git?

    — Does Git handle the activation of the views or will developers need to manually activate them in Hana Web IDE one by one in a similar fashion as Developer Mode?

    — Similarly for HDI objects, since they won’t have version control, how does Git handle the activation (that creates the actual database objects)?

    1. Thomas Jung Post author

      > Is the Git integration available with more enterprise flavors like Git Stash?

      We use the basic Git APIs. These work with Git, and I know internally we also use GitHub. I can’t comment on Git Stash, as that isn’t something SAP has tested.

      > Is it still recommended to move Graphical Calculation Views with the existing Native Hana Transport or using Git?

      Once they are migrated to HDI you can’t use the Native HANA Transport in HALM. HALM only works for repository content.  You must use Git and/or manually deploy using MTARs.

      > Does Git handle the activation of the views or will developers need to manually activate them in Hana Web IDE one by one in a similar fashion as Developer Mode?

      Git has nothing to do with activation or deploy. It’s only a source code repository.  When moving from one system to another, you should deploy the MTAR file. It performs the necessary HDI deploy step (there is no activation any longer).

      >Similarly for HDI objects, since they won’t have version control, how does Git handle the activation (that creates the actual database objects)?

      Same as above – Git has nothing to do with this.  Activation of HDI objects happens during deploy of the MTAR which for an HDB module will call the HDI deploy service via Node.js.
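As a hedged illustration of that flow, a minimal MTA descriptor for a project containing an HDB module might look like the sketch below. All IDs, names, and paths here are invented for illustration, not taken from a real project:

```yaml
# Illustrative sketch only: module/resource names are invented.
_schema-version: "2.0"
ID: sample-app
version: 0.0.1

modules:
  # The hdb module holds the HDI design-time artifacts (tables, views,
  # procedures). Deploying the MTAR triggers the HDI deploy service for it.
  - name: db
    type: hdb
    path: db
    requires:
      - name: hdi-container

resources:
  # The HDI container that receives the deployed database objects.
  - name: hdi-container
    type: com.sap.xs.hdi-container
```

When the MTAR built from such a descriptor is deployed, the hdb module is handed to the HDI deploy service, which creates the actual database objects in the bound HDI container.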

      1. Nehal Fonseca

        Hi Thomas,

        Picking up on the question and answer I’ve copied below, will you be able to clear this doubt?

        Is it still recommended to move Graphical Calculation Views with the existing Native Hana Transport or using Git?

        Once they are migrated to HDI you can’t use the Native HANA Transport in HALM. HALM only works for repository content.  You must use Git and/or manually deploy using MTARs.


        Will modeled views not continue to be repository content? I assumed they had nothing to do with XS, and therefore with SPS 11 XS Advanced. Are these also to be managed with Git? Same with HDB procedures?


        Additionally, what about the granular control that HALM provided with change IDs? Will MTAR files provide this?


        Nehal.




        1. Thomas Jung Post author

          >Will modeled views not continue to be repository content?

          No because there is no such thing as a repository in the new XSA/HDI world.

          > I assumed they had nothing to do with XS, and therefore with SPS 11 XS Advanced.

          Incorrect. Calculation Views are HDI objects like all other database level development.

          >Are these also to be managed with Git?

          Yes.

          >Same with HDB procedures?

          Yes.

          >Additionally, what about the granular control that HALM provided with change IDs? Will MTAR files provide this?

          No. An MTAR is always a complete archive of all development objects within a project. 

