By Mathias Kemeter

The Small JSON Document Store Cookbook

Have you ever worked with native JSON documents in SAP HANA Cloud? If the answer is ‘No’, you can find some resources to get started here:


However, if you have worked with JSON documents before, you may have stumbled upon a few questions, which I will try to cover in this blog entry. So, consider this an FAQ, cookbook, cheat sheet or simply a collection of best practices. In any case, I will not try to explain the whole (JSON) world to you, but will focus on specific aspects.

Let me know if you found this helpful or have suggestions for enhancements!


Interacting with Collections

Instead of storing data in tables, HANA’s Document Store uses so-called collections. Collections support similar CRUD operations as tables.

-- creating a collection
CREATE COLLECTION Customers;

-- insert record using the Object Expression interface
INSERT INTO Customers VALUES ({
        "name": 'Paul',
        "address": {
            "street": 'Hauptstraße 10',
            "city": 'Heidelberg'
        }
});

-- insert record using the string interface
-- use this if you have a JSON string already in your application; notice the different quotes
INSERT INTO Customers VALUES ('{
        "name": "Paul",
        "address": {
            "street": "Hauptstraße 10",
            "city": "Heidelberg"
        }
}');

-- Query with filter
SELECT * FROM Customers WHERE "address"."city" = 'Heidelberg';

-- Query with projection and aggregation
SELECT "address"."city", COUNT(*) FROM Customers GROUP BY "address"."city";
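To see what the filter and the aggregation return, here is the same logic expressed over plain Python dictionaries. This is client-side only and purely for illustration; the second customer ("Mary" in "Berlin") is made up for the example:

```python
from collections import Counter

# sample documents shaped like the inserts above
customers = [
    {"name": "Paul", "address": {"street": "Hauptstraße 10", "city": "Heidelberg"}},
    {"name": "Mary", "address": {"street": "Ringstraße 5", "city": "Berlin"}},
]

# WHERE "address"."city" = 'Heidelberg'
in_heidelberg = [c for c in customers if c["address"]["city"] == "Heidelberg"]

# SELECT "address"."city", COUNT(*) ... GROUP BY "address"."city"
counts = Counter(c["address"]["city"] for c in customers)
```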


Object Expressions vs. String Interface

As you may have seen above, there are two insert statements: one is using the so-called ‘object expression’ whereas the other is using the ‘string interface’ for JSON documents.

Note that for one statement we simply pass a string for inserting a JSON document:

INSERT INTO mycollection VALUES ('<some json doc in form of a single-quoted string>')

For the other variant, we pass an object that is properly embedded into HANA’s SQL interface:

INSERT INTO mycollection VALUES ({"<some json-like object, that is understood by the SQL layer>"})

The latter is not a JSON document in the strict sense, because the JSON standard foresees using double quotes to identify strings. However, in HANA’s SQL interface strings are identified by single quotes, which is why a JSON object expression uses slightly different quotes:

INSERT INTO mycollection VALUES ({"name": 'henry'})

If you are ever struggling with quotes, it’s probably related to the fact that you are trying to use the wrong interface. When you have a proper JSON document, you should always use the string interface and pass the document as a single-quoted string.
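If your application holds the document as a data structure rather than a string, a standard JSON serializer produces exactly what the string interface expects; only the embedding single-quoted SQL literal needs its quotes escaped. A minimal sketch (the collection name mycollection is just an example; in practice, prefer a prepared statement with a `?` placeholder over string concatenation):

```python
import json

doc = {"name": "Paul", "address": {"city": "Heidelberg"}}

# json.dumps emits standard JSON: double quotes inside the document
json_str = json.dumps(doc)

# when embedding the document as a single-quoted SQL literal,
# any single quotes inside the data must be doubled
sql = "INSERT INTO mycollection VALUES ('{}')".format(json_str.replace("'", "''"))
```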

Why do we need the object expression representation anyway, if it’s not even proper JSON? Thanks for the question! This is actually a feature – and not meant to bully you with complex stuff. The object expression is required to seamlessly integrate JSON document handling into the SQL interface. You can do cool stuff like creating documents in projection without using cumbersome string concatenations:

SELECT {"surname": "name", "justthestreet": "address"."street"} FROM Customers

That’s neat!


Joins between Tables and Collections

SAP HANA is a true multi-model database! It combines the best of the relational and the NoSQL world into one unified data management platform. So you should be able to join JSON collections with relational tables, right?

Yes, you can definitely do this! First, you need a common data model for joining your collection with your table. Technically, this means creating an SQL view on top of your collection that transfers the required attributes into the relational model. On SAP HANA, you can define such views implicitly at query time using the WITH clause.

Here is what a join would look like:

CREATE COLUMN TABLE SomeColumnStoreTable ("name" nvarchar(20), "age" int);

INSERT INTO SomeColumnStoreTable VALUES('Paul', 34);

WITH myView AS (SELECT "name", "address"."city" FROM Customers)  -- implicit view!
SELECT * FROM myView INNER JOIN SomeColumnStoreTable AS cst ON myView."name" = cst."name";

Please note that depending on your join criteria, you may have to apply datatype casting in the implicit view definition. Values returned by the JSON Document Store are strings by default!
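This casting pitfall is easy to reproduce client-side: a value that looks numeric in the document comes back as a string, so an equality comparison against an INT column silently fails until you cast (e.g. with TO_INT in the implicit view). Purely illustrative:

```python
# what the Document Store returns for a numeric-looking field (a string)
docstore_age = "34"
# what the column table holds (an int)
table_age = 34

# without a cast the join condition never matches
no_cast_matches = (docstore_age == table_age)

# TO_INT("age") in the implicit view corresponds to this cast
cast_matches = (int(docstore_age) == table_age)
```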


Joins between Collections

A JSON collection can be joined with a JSON collection by using a similar construct as above:

create collection clubs;
create collection players;

insert into clubs values({"clubName": 'Heidelberg', "league": 'X'});
insert into players values({"playerName": 'Paul', "club": 'Heidelberg'});

with c as (select "clubName", "league" from clubs),
     p as (select "playerName" , "club" from players)
select * from c inner join p
     on c."clubName" = p."club";
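For illustration, the same inner join expressed over plain Python structures, using the club and player values from the inserts above:

```python
clubs = [{"clubName": "Heidelberg", "league": "X"}]
players = [{"playerName": "Paul", "club": "Heidelberg"}]

# inner join on c."clubName" = p."club"
joined = [
    {**c, **p}
    for c in clubs
    for p in players
    if c["clubName"] == p["club"]
]
```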


Using Parameters in WHERE-Conditions

Yes, you can use parameters in WHERE conditions on the Document Store! There is just a small difference in parameter typing that you have to consider, compared to the same query on a table. Here is an example using the function BIND_AS_VALUE:

CREATE COLLECTION mycol;

INSERT INTO mycol VALUES('{"a":1, "b":1}');
INSERT INTO mycol VALUES('{"a":1, "b":2}');
INSERT INTO mycol VALUES('{"a":2, "b":1}');
INSERT INTO mycol VALUES('{"a":2, "b":2}');
INSERT INTO mycol VALUES('{"a":3, "b":1}');
INSERT INTO mycol VALUES('{"a":3, "b":2}');

DO BEGIN
	DECLARE v_int int;
	select max("a") into v_int from mycol;
	select * from mycol where "a" = bind_as_value(:v_int);
END;


Python Integration

The latest version of the hana-ml Python client comes with enhanced support for multi-model data (i.e. JSON, graph & spatial). When working with SAP HANA Cloud from Python, it is recommended to use the built-in dataframe concept, which lets you work with data residing in HANA without transferring it to the (Python) client.

Specifically, for HANA’s Document Store there is a package called hana_ml.docstore that enables you to easily upload datasets to collections. The underlying mechanism uses efficient bulk inserts to handle large data volumes.

import json
from hana_ml.docstore import create_collection_from_elements

with open("./path/to/document.json") as json_file:
    data = json.load(json_file)

# cc is an existing hana_ml ConnectionContext; parameter names may
# differ slightly across hana-ml versions - check the API reference
create_collection_from_elements(
    connection_context=cc,
    collection_name="my_collection",
    elements=data,
)


Import/Export with Cloud Bucket

You can export a JSON collection to a cloud bucket (e.g. AWS S3) and import the same format again using the following statements:


IMPORT FROM JSON FILE 's3-<AWS S3 region>://<AWS S3 Access Key ID>:<AWS S3 Secret Access Key>@<AWS S3 bucket>/<aws-object-id>' 

EXPORT INTO JSON FILE 's3-<AWS S3 region>://<AWS S3 Access Key ID>:<AWS S3 Secret Access Key>@<AWS S3 bucket>/<aws-object-id>' 


Bulk Inserts

HANA’s JSON Document Store supports bulk inserts. Let me give you two examples for Java and Python.

Bulk Inserts with Java

Get the latest ngdbc driver (e.g. via Maven) and connect to the database. Then create a collection and insert documents using a try-with-resources connection and a prepared statement.

try (final Connection connection = ds.getConnection();
     final PreparedStatement pstmt = connection.prepareStatement("INSERT INTO collectionName VALUES(?)")) {
    pstmt.setString(1, "{\"productid\": 6060, \"price\": 735.8228858436846}"); pstmt.addBatch();
    pstmt.setString(1, "{\"productid\": 7090, \"price\": 735.8228858436846}"); pstmt.addBatch();
    pstmt.setString(1, "{\"productid\": 9087, \"price\": 735.8228858436846}"); pstmt.addBatch();
    pstmt.executeBatch();  // send all documents in one bulk round trip
}

Bulk Inserts with Python

For inserting JSON documents in Python it is recommended to use the hana-ml Python Client, which delivers this functionality out of the box. However, you can also take the native approach:

data = []
data.append(['{"key": 15, "key2": 100}'])
data.append(['{"key": 15, "key2": 101}'])
data.append(['{"key": 15, "key2": 102}'])
cursor.executemany("""insert into MyCollection values(?)""", data)
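If your documents start out as Python dictionaries rather than strings, you can serialize them into the same shape that executemany expects (each parameter row is a one-element list holding the JSON string):

```python
import json

docs = [{"key": 15, "key2": n} for n in (100, 101, 102)]

# each parameter row is a one-element list containing the serialized document
data = [[json.dumps(d)] for d in docs]
```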


Document IDs

Unlike other document stores, HANA’s Document Store does not automatically assign id fields to newly inserted documents. If you require such a field, you can generate it with a sequence in HANA.

CREATE COLLECTION c1;
CREATE SEQUENCE s1;

INSERT INTO c1 VALUES('{"KEY":' || s1.nextval || '}');
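Alternatively, if you want globally unique ids without a database sequence, you can generate them on the client before inserting. A sketch using Python's uuid module (the _id field name is just a convention borrowed from other document stores, not something HANA requires):

```python
import json
import uuid

doc = {"name": "Paul"}
doc["_id"] = str(uuid.uuid4())  # random, practically collision-free id

payload = json.dumps(doc)  # ready for the string interface / executemany
```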


This blog is based on an SAP-internal write-up by Christian Bensberg, which many colleagues found helpful when interacting with JSON documents. Thanks, Christian!
