LucaToldo
Advisor
HANA 1.0: from professionals to professionals

HANA is a powerful integrated platform for high-performance, scalable, and high-quality services. Implemented to the highest standards of quality, it is engineered not to break and can genuinely stand up to the zero-downtime requirements of mission-critical businesses.

HANA 2.0 democratization: from data centers to the cloud

While initially targeting customer data centers, HANA has since been extended to integrate natively with the cloud. This lets businesses decrease their IT CAPEX investments by shifting to subscription models, and focus their human resources on business tasks rather than on managing local data centers. The first deployments of HANA in the cloud ran on SAP's own cloud, providing the shortest path between SAP operations and SAP developers.

HANA Service: leveraging hyperscalers to facilitate intelligent enterprises

Having proven its stability and performance in SAP data centers, HANA was then enabled on hyperscalers such as AWS, Google, and Microsoft. This gives customers the freedom to stay with their existing cloud provider and increases the benefits of using HANA in the cloud.

HANA Cloud: lower costs and higher flexibility

In a relentless effort to lower the barriers toward the intelligent enterprise for customers and partners, SAP now also offers SAP HANA Cloud. It comes at much lower cost and with higher flexibility, although with different capabilities in some areas compared to the SAP HANA Service.

SAP Cloud Application Programming Model (CAP)
is an open and opinionated framework of languages, libraries, and tools for building enterprise-grade services and applications. It guides developers through proven best practices and a great wealth of out-of-the-box solutions for recurring tasks. CAP-based projects benefit from a primary focus on the domain.

Core Data Services (CDS)
is the backbone of the SAP Cloud Application Programming Model. It provides the means to declaratively capture service definitions and data models, queries, and expressions in plain (JavaScript) object notations. CDS features parsers for a variety of source languages and compilers into various target languages.
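
For illustration only, a minimal CDS model and service definition might look like the sketch below; the namespace, entity, and service names are invented for this example, not taken from any project:

namespace demo;

// A plain data model entity, which CDS compiles into a database table
entity Books {
  key ID    : Integer;
      title : String;
}

// A service exposing the entity as a projection
service CatalogService {
  entity ListOfBooks as projection on Books;
}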

The approach of going from business model to service is very sound; however, there are cases in which it cannot be applied, for example a tech-refresh-induced replatforming.

Use Case - replatforming the back end
This is what happened to me recently: a UI5/Python/PostgreSQL internal POC prototype from 2018 needed to be rapidly industrialised. That implied moving to UI5/Java/HANA without major changes to the user interface.

A full replatforming was needed, and the HANA platform with a Java microservice was considered the safest way to achieve the goal.

CAP was assessed first; however, given the constraints we had, a CAP implementation would have been difficult to achieve. We therefore had to follow a Java-native, yet cloud-savvy, implementation.

HANA on Cloud... which one?

When one logs into the service marketplace of SAP Cloud Platform and looks for ways of using HANA, one is suddenly confronted with a difficult choice: which one to use? In fact, there are (at the time of this writing) three ways to consume HANA on SAP Cloud Platform:

  1. SAP HANA Cloud (hana-cloud) is "HANA Cloud"

  2. SAP HANA Schemas & HDI Containers (hana) are HDI containers or schemas on either the SAP HANA Service or HANA Cloud, depending on the application's target database

  3. SAP HANA Service (hana-db) is "HANA as a Service"
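
If in doubt, the available offerings and their plans can also be inspected from the command line (a sketch assuming the CF CLI v6; the plans on offer vary by landscape):

cf marketplace                  # list all service offerings in the org
cf marketplace -s hana          # show the plans of the "hana" service
cf marketplace -s hana-cloud    # show the plans of the "hana-cloud" service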


The technical differences between hana-db and hana-cloud are described here.


Our HANA-specific code (both DDL and DML) consisted mainly of tables, views, and stored procedures and did not need any special features; therefore hana-cloud was the best choice for the project, given that it also has the lowest operating costs.

  1. HANA Data Definition Language operations (e.g. how to create objects in HANA)
    The service we had to reimplement needed to be fully compliant with the cloud qualities, so it also had to be deployed automatically through a pipeline; no manual operations permitted. For this purpose I could have written Java code with many "SQL CREATE" statements, but HANA provides a better way: using Node.js and @sap/hdi-deploy, one can easily and automatically inject objects (tables, views, stored procedures, synonyms, ...) into the newly created schema, as sketched below. The same mechanism also allows loading the tables from files containing comma-separated values.
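
    For illustration, the design-time artifacts consumed by @sap/hdi-deploy might look like the sketch below; the file, table, and column names are invented for this example, not taken from the actual project. A db/src/Products.hdbtable file defines a table (note that HDI artifacts omit the CREATE keyword):

    COLUMN TABLE PRODUCTS (
      ID   INTEGER NOT NULL,
      NAME NVARCHAR(100),
      PRIMARY KEY (ID)
    )

    and a companion db/src/Products.hdbtabledata file tells the deployer to fill the table from a CSV file shipped with the project:

    {
      "format_version": 1,
      "imports": [{
        "target_table": "PRODUCTS",
        "source_data": {
          "data_type": "CSV",
          "file_name": "products.csv",
          "has_header": true
        }
      }]
    }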

  2. HANA Data Modification Language operations (e.g. how to access HANA from Java Spring Boot)
    The microservice we wanted to deliver had to be implemented in Java, so the use of Spring Boot was a natural choice, as was JDBC for connecting it to the HANA database component. However, how does one do this properly? In fact, initial efforts led to the famous error message
    com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [4321]: only secure connections are allowed

    The problem was easily resolved by a bean that uses the HANAServiceInfo to obtain the required connection parameters and then creates a DataSource object with a URL pointing, in a secure way, to the appropriate schema.


    import javax.sql.DataSource;

    import org.springframework.boot.jdbc.DataSourceBuilder;
    import org.springframework.cloud.config.java.AbstractCloudConfig;
    import org.springframework.cloud.util.UriInfo;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    import com.zaxxer.hikari.HikariDataSource;
    // HANAServiceInfo comes from the SAP HANA service connector for Spring Cloud Connectors

    @Configuration
    public class DataSourceConfig extends AbstractCloudConfig {

        @Bean
        public DataSource getDataSource() {
            // Read the bound HDI container's credentials from the CF environment
            HANAServiceInfo serviceInfo = (HANAServiceInfo) cloud().getServiceInfo("my-hdi-container");
            String host = serviceInfo.getHost();
            int port = serviceInfo.getPort();
            String username = serviceInfo.getUserName();
            String password = serviceInfo.getPassword();

            // The generated HDI user name starts with the schema name,
            // separated by an underscore
            String[] parts = username.split("_", 2);
            String schema = parts[0];

            // encrypt=true and validateCertificate=true satisfy the
            // "only secure connections are allowed" requirement
            String url = new UriInfo("jdbc:sap", host, port, null, null, null,
                    "currentschema=" + schema + "&encrypt=true&validateCertificate=true").toString();

            return DataSourceBuilder.create().type(HikariDataSource.class)
                    .driverClassName(com.sap.db.jdbc.Driver.class.getName())
                    .url(url)
                    .username(username)
                    .password(password)
                    .build();
        }
    }

    That bean was then autowired into the @Repository bean to provide it with the required connection object.
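
    For illustration, the repository side might look like the following sketch (the table name and query are invented for this example); Spring Boot's auto-configuration builds a JdbcTemplate on top of the DataSource bean defined above:

    import java.util.List;

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Repository;

    @Repository
    public class ProductRepository {

        private final JdbcTemplate jdbcTemplate;

        // Constructor injection: the JdbcTemplate wraps the secure HANA DataSource
        public ProductRepository(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        public List<String> findAllProductNames() {
            // The unqualified table name resolves to the HDI container schema,
            // thanks to currentschema in the JDBC URL
            return jdbcTemplate.queryForList("SELECT NAME FROM PRODUCTS", String.class);
        }
    }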

  3. MBT and deployment to both hana and hana-cloud services
    The possibility that one day the application would need capabilities specific to hana-db had to be addressed, while minimizing the coding effort and avoiding divergent code lines. This too was easily achievable, through appropriate configuration of the service component connecting to the HDI containers. In fact, both hana-cloud and hana-db rely on HDI containers to handle the specific schema objects, and all HDI objects are created by the @sap/hdi-deploy Node.js module. In the example below I had to specify the database_id, since I had more than one database registered in my CF space. In general, a database_id must be specified whenever more than one database is registered in the CF space (no matter whether it is hana-cloud or hana-db), or when deploying into a database that is not the default for the space; with exactly one database instance in the space, the database_id can be omitted.


    For hana-cloud:

     - name: my-service-hdi-container
       type: com.sap.xs.hdi-container
       parameters:
         shared: true
         config:
           database_id: database_id
       properties:
         hdi-container-service: ${service-name}


    For hana:

     - name: my-service-hdi-container
       type: com.sap.xs.hdi-container
       parameters:
         shared: true
       properties:
         hdi-container-service: ${service-name}
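
    For completeness, a sketch of how the modules in mta.yaml might consume such a resource; the module names and paths are invented for this example. The db deployer module runs @sap/hdi-deploy against the container, and the Java module binds to the same container at deploy time:

    modules:
      - name: my-db-deployer
        type: hdb                  # runs @sap/hdi-deploy against the container
        path: db
        requires:
          - name: my-service-hdi-container
      - name: my-java-service
        type: java
        path: srv
        requires:
          - name: my-service-hdi-container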




With these simple details in place, it is now just a matter of running:

mbt build
cf deploy mta_target/myapp.mtar


and voilà, your service is flying on SAP Cloud Platform, using HANA in the way you wanted.