Technical Articles
Database migration in SAP Cloud Platform: Liquibase is the way
This blog is about tracking database changes in the cloud world, where we have releases every two weeks or even every week. Any web-based application needs a way to migrate its database changes, especially a JPA-based application where the database content is generated through the EclipseLink API. To version and track changes to database artifacts, we need a database migration or refactoring tool, and Liquibase is the answer. It helps with tracking, managing, and applying schema changes, modifying data in the database, and so on. On a cloud platform, where the development cycle is short, it is very important to track database changes, and Liquibase is of immense help here.
In the next few steps, I am going to explain how to set up an EclipseLink (i.e. JPA) based web application with Liquibase as the database migration tool, but before that, a quick look at Liquibase.
1. Introduction to Liquibase
Liquibase is an open source library for tracking, managing and applying database changes that can be used with any database that has a JDBC driver. It is built on a simple premise: all database changes are stored in a human-readable yet trackable form and checked into source control. It was developed by Datical and is released under the Apache 2.0 license.
2. How does it work?
When Liquibase is executed, you must specify the database against which to apply your changesets. Liquibase uses two tables to manage changes to the database: databasechangelog and databasechangeloglock. If the tables don’t exist on the target database, they are created. An entry is added to the databasechangeloglock table, which ensures that only one instance of Liquibase runs at a time. The databasechangelog table contains a listing of every changeset that has been applied to this database. Liquibase diffs the table contents against the XML changelog files to determine which changes still need to be applied, and then applies those changes to the database.
3. For more details, refer to the links below
http://www.liquibase.org/index.html
https://en.wikipedia.org/wiki/Liquibase
4. Prerequisite Activities
a) Create and build a JPA project: refer to section 2.1.
b) Get the dependent libraries: refer to section 2.2. Along with the libraries mentioned in my previous blog (linked in that section), you need to add the dependencies below for the Liquibase implementation.
i. For HANA Database dependency
<dependency>
<groupId>com.sap.db.jdbc</groupId>
<artifactId>ngdbc</artifactId>
<version>2.1.1</version>
</dependency>
<dependency>
<groupId>org.liquibase.ext</groupId>
<artifactId>liquibase-hana</artifactId>
<version>3.4.1-sap-02</version>
</dependency>
ii. For ASE (i.e. Sybase database)
<dependency>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-core</artifactId>
<version>3.2.3</version>
</dependency>
In my case I tried both databases and it worked without even a code change. This is one of the main advantages of using Liquibase: if we want to migrate to another database, we only need to swap the database-dependent jar and the rest is taken care of.
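One way to make this jar swap explicit is to isolate the database-specific dependencies in Maven profiles, so switching databases becomes a matter of activating a different profile. The sketch below is only an illustration under that assumption; the profile ids are made up and were not part of the original setup.

```xml
<!-- Hypothetical sketch: per-database Maven profiles (profile ids are invented) -->
<profiles>
  <profile>
    <id>hana</id>
    <dependencies>
      <dependency>
        <groupId>com.sap.db.jdbc</groupId>
        <artifactId>ngdbc</artifactId>
        <version>2.1.1</version>
      </dependency>
      <dependency>
        <groupId>org.liquibase.ext</groupId>
        <artifactId>liquibase-hana</artifactId>
        <version>3.4.1-sap-02</version>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>ase</id>
    <dependencies>
      <dependency>
        <groupId>org.liquibase</groupId>
        <artifactId>liquibase-core</artifactId>
        <version>3.2.3</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

With such a layout, building against HANA or ASE would be `mvn clean package -P hana` or `mvn clean package -P ase`, with no change to the changelog files themselves.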
5. Configuration and Implementation
a. Configuration of persistence.xml in JPA model
In a JPA-based application, we need to add a persistence.xml, which can be created automatically through the Eclipse IDE. We just have to add the following content to the persistence.xml file at the location <ProjectName>/Java Resources/src/META-INF
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0"
xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="liquibasePerist" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>com.sap.sample.model.Employee</class>
<class>com.sap.sample.model.Orders</class>
<shared-cache-mode>NONE</shared-cache-mode>
<properties>
<property name="javax.persistence.validation.mode" value="NONE" />
<property name="eclipselink.ddl-generation" value="none" />
<property name="eclipselink.logging.level" value="FINEST" />
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
<property name="eclipselink.jdbc.allow-native-sql-queries" value="true" />
</properties>
</persistence-unit>
</persistence>
As mentioned in the above content, the transaction type should be RESOURCE_LOCAL. Along with that, we need to set the property eclipselink.ddl-generation to none, since the database content is generated via Liquibase while deploying the database artifacts, not via the EclipseLink generation mechanism.
We also need to set eclipselink.ddl-generation.output-mode to database, as the database content is generated in the database container and the information about DDL or DML execution is stored in the databasechangelog table, which is created at the beginning of the application deployment.
6. Liquibase-related configuration for the database: To configure the application for Liquibase, I have followed the approach of creating multiple db-changelog files, so that we can segregate the datatype properties from the table creation changesets, as shown below.
The master db-changelog file includes all the related files, which comes in quite handy when we want to add a file for every release; the database changes then live in a separate file per release, as shown below.
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd
http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd">
<!-- Common definitions; e.g. field types -->
<include file="db/properties/db-changelog-properties.xml" />
<!-- REL-1.0 -->
<include file="db/changelog/changelog-V1.xml" />
</databaseChangeLog>
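With this per-release layout, a later release would simply add another include to the master changelog. As a hypothetical sketch (the file name changelog-V2.xml is invented for illustration), the include section could grow like this:

```xml
<!-- Common definitions; e.g. field types -->
<include file="db/properties/db-changelog-properties.xml" />
<!-- REL-1.0 -->
<include file="db/changelog/changelog-V1.xml" />
<!-- REL-2.0 (hypothetical future release file) -->
<include file="db/changelog/changelog-V2.xml" />
```

Liquibase processes the includes in order and, thanks to the databasechangelog table, skips any changeset that has already been applied, so old release files stay in place untouched.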
db-changelog-properties.xml maintains the various data types, which can be referenced in multiple places; the value chosen depends on the database (dbms) against which the changelog runs, as shown below
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd">
<!-- Portable database type definitions -->
<property name="type.bigint" value="BIGINT" dbms="all,derby,hanadb" />
<property name="type.int" value="INT" dbms="all,derby,hanadb" />
<property name="type.smallint" value="SMALLINT" dbms="all,derby,hanadb"/>
<property name="type.double" value="DOUBLE" dbms="all,derby,hanadb" />
<property name="type.timestamp" value="TIMESTAMP" dbms="all,derby,hanadb" />
<property name="type.boolean" value="BOOLEAN" dbms="all,derby,hanadb" />
<property name="type.unicode120" value="NVARCHAR(120)" dbms="hanadb" />
<property name="type.unicode120" value="VARCHAR(120)" dbms="all,derby" />
<property name="type.unicode200" value="NVARCHAR(200)" dbms="hanadb" />
<property name="type.unicode200" value="VARCHAR(200)" dbms="all,derby" />
<property name="type.unicode255" value="NVARCHAR(255)" dbms="hanadb" />
<property name="type.unicode255" value="VARCHAR(255)" dbms="all,derby" />
<property name="type.unicode2k" value="NVARCHAR(2000)" dbms="hanadb" />
<property name="type.unicode2k" value="VARCHAR(2000)" dbms="all,derby" />
</databaseChangeLog>
changelog-V1.xml contains the table creation changesets: creating tables, adding new columns, modifying a column's datatype, SQL checks, and so on. Basically, the table generation information has to be maintained here, as shown below
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd
http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd">
<changeSet author="I070883" id="Employee">
<preConditions onFail="MARK_RAN">
<not>
<tableExists tableName="EMPLOYEE" />
</not>
</preConditions>
<comment>Employee table creation (with PK)</comment>
<createTable tableName="EMPLOYEE">
<column name="EMP_ID" type="${type.unicode120}">
<constraints nullable="false" />
</column>
<column name="EMP_NAME" type="${type.unicode200}">
<constraints nullable="false" />
</column>
<column name="EMP_ADDRESS" type="${type.unicode200}">
<constraints nullable="false" />
</column>
<column name="DESCRIPTION" type="${type.unicode255}" />
<column name="PHONE" type="${type.unicode200}">
<constraints nullable="false" />
</column>
</createTable>
<addPrimaryKey columnNames="EMP_ID" constraintName="EMPLOYEE_ID"
tableName="EMPLOYEE" />
</changeSet>
<changeSet author="I070883" id="Orders">
<preConditions onFail="MARK_RAN">
<not>
<tableExists tableName="ORDERS" />
</not>
</preConditions>
<comment>Orders table creation (with PK)</comment>
<createTable tableName="ORDERS">
<column name="TRANSACTION_ID" type="${type.unicode120}">
<constraints nullable="false" />
</column>
<column name="ORDER_ID" type="${type.unicode120}">
<constraints nullable="false" />
</column>
<column name="COMMENTS" type="${type.unicode200}">
<constraints nullable="false" />
</column>
</createTable>
<addPrimaryKey columnNames="TRANSACTION_ID"
constraintName="TRANSACTION_ID" tableName="ORDERS" />
</changeSet>
<!-- Add new Columns in Orders table and in precondition need to check if
column exists -->
<changeSet author="I070883" id="addNewColumnOrder">
<preConditions onFail="MARK_RAN">
<not>
<columnExists columnName="QUANTITY" tableName="ORDERS" />
</not>
</preConditions>
<comment>Orders table create new column in order table</comment>
<addColumn tableName="ORDERS">
<column name="QUANTITY" type="${type.unicode255}" />
</addColumn>
</changeSet>
<!-- Modify existing column QUANTITY from nvarchar to int -->
<changeSet author="I070883" id="OrdersTableModifyColumn">
<modifyDataType columnName="QUANTITY" newDataType="${type.int}"
tableName="ORDERS" />
</changeSet>
</databaseChangeLog>
As mentioned above, I have maintained three files to segregate the database changes and their properties. However, they can be clubbed together and placed in the same file. For any changeset in the changelog files shown above, the attributes id and author are mandatory; together with the file name, they uniquely identify a changeset so that Liquibase can track it.
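One feature not used in the files above is an explicit rollback block, which tells Liquibase how to revert a changeset. A minimal sketch, assuming the ORDERS table from the earlier changelog (the changeset id and STATUS column are invented for illustration):

```xml
<!-- Hypothetical changeset showing an explicit rollback -->
<changeSet author="I070883" id="addStatusColumnOrders-example">
  <addColumn tableName="ORDERS">
    <column name="STATUS" type="${type.unicode120}" />
  </addColumn>
  <!-- Drop the column again if this changeset is rolled back -->
  <rollback>
    <dropColumn columnName="STATUS" tableName="ORDERS" />
  </rollback>
</changeSet>
```

For simple additive changes such as addColumn, Liquibase can auto-generate the rollback; an explicit rollback block becomes necessary for changes like raw sql statements or data modifications, where Liquibase cannot infer the inverse.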
6 a. Configuration of the web descriptor file (web.xml): For the complete web.xml configuration, refer to section 3.3 of my previous blog. Along with that, we have to include the following entries in order to invoke the database creation through Liquibase, as shown below
<context-param>
<param-name>liquibase.changelog</param-name>
<param-value>db/changelog/db-changelog.xml</param-value>
</context-param>
<context-param>
<param-name>liquibase.datasource</param-name>
<param-value>java:comp/env/jdbc/DefaultDB</param-value>
</context-param>
<listener>
<listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>
As shown above, we need to define the changelog file where we have maintained all the database change logs, and we also need to define the data source. Both are read when the listener class executes: the Liquibase listener class declared in the web descriptor reads the values defined in the context-param entries.
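For completeness, the JNDI name java:comp/env/jdbc/DefaultDB used above has to resolve to a data source the servlet container knows about. On SAP Cloud Platform (Neo) the DefaultDB binding is provided by the platform; in a plain servlet container you would additionally declare a resource reference in web.xml, roughly like the sketch below (an assumption for illustration, not part of the original project):

```xml
<!-- Sketch: resource reference so the Liquibase listener can look up the data source -->
<resource-ref>
  <res-ref-name>jdbc/DefaultDB</res-ref-name>
  <res-type>javax.sql.DataSource</res-type>
  <res-auth>Container</res-auth>
</resource-ref>
```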
7. Creating a JPA processor for OData requests: Setting up a JPA-processor-based OData request and creating an OData entity class is described in section 5 of my previous blog.
8. Conclusion: The intent of this blog is to summarize how database change management can be configured for an OData-based JPA application. In a multi-developer environment, it makes it easier to collaborate on and track database changes: if we run ad-hoc queries, we cannot tell which user made which change, but with changesets we can see who made what change, which helps to track all database changes. Liquibase also manages database schema changes together with application code in a programmatic fashion that eliminates errors and delays and enables rapid Agile releases. You can find a sample application using the Liquibase approach described above on GitHub.
A few of my other blog posts are linked below.
- Blog Series:ODATA Protocol implementation for IOT Applications on SAP HCP
- Binary content upload and download from HANA DB through Apache Olingo OData in SAP CF
- CDS extension feature in HANA
- Logging in HCP Cloud foundry with Java and Tomee using slf4j, logback, Jolokia
- Image persistence through Spring RESTful web services in CF
- Json to parquet, orc file format using Apache spark library
Hi Rajnish,
Kudos for the write-up. I had a few questions and perhaps you can help. As you know already, in the change log we can only use "createTable" or alternatively, an sql statement to "Create table...". In HANA however, it is possible to create column-tables as well. What is the best way to mimic this in liquibase? Do you have some hacks for the same ? The inability to create column-tables in liquibase is a real blocker for our project for now. Will be great if you can suggest some work-around.
Thanks a lot,
Naurin Jamil
Hi Naurin,
Thanks !!
I am not sure about column-based table creation, but I can give it a try and will update you.
Regards,
Rajnish
Hi Rajnish,
any chance you are working on your own fork for liquibase-hana?
<dependency>
<groupId>org.liquibase.ext</groupId>
<artifactId>liquibase-hana</artifactId>
<version>3.4.1-sap-02</version>
</dependency>
as I cannot find this version in public, and the only version I see for liquibase-hana is pretty old and not very well maintained.
thanks!
Hi Chen,
The version mentioned above is internal to SAP; however, there is a version released under the Apache 2.0 licence:
Maven dependency
Hi Rajnish,
as you described, Liquibase is a great tool for database migration. I have personally used it several times to make schema definition and migration database-agnostic for Java applications. When building a simple app using the HANA database service for the CF environment in the SAP Cloud Platform, my first idea was to replace the CDS artifacts with a module using the Liquibase approach. Unfortunately, the database user provided by the HDI container has no permissions on any schema objects. For local development I installed HANA Database Express Edition (server only) on my laptop and used the SYSTEM user to create application users with appropriate permissions. This works fine with Liquibase. But as mentioned before, this approach cannot be transferred to CF due to the missing database user rights. Any suggestions?
Thanks,
Stefan
Hi Stefan,
The above example was developed and tested on the Cloud (Neo) platform. However, I can try out the same in CF as well.
Thanks,
Rajnish