Background

After following Stoyan Manchev's excellent blog, 8 Easy Steps to Develop an XS application on the SAP HANA Cloud Platform, I was really impressed and curious whether Lumira could be used with the HANA calculation view created in the blog. Stoyan's blog covers, from start to finish, the process of creating an XS application, and I did complete it; however, with Lumira (the free version) I was interested only in creating and visualising HANA views. I will say up front that during the process I did experience a couple of failures with features of Lumira that I wanted to use. Still, as a way to use HANA for free it was quite inspiring, and I found myself addicted to the platform. Stoyan's blog does come with a disclaimer that the features are in beta for the trial accounts; I interpreted this "beta" statement as applying to the overall intention of his blog, though things may be changing on the platform soon.

The HANA Cloud Platform (HCP) offers a free developer account here. Also to note: I have not used HANA or the HCP in my day job and would advise double-checking anything with the documentation/SCN, but I will be happy to comment on any aspect of my blog. My intention is to go through the steps I took to use Lumira with HANA as part of the HCP. First, though, some information on the dataset I used in my tests.

Crime dataset

The dataset I chose is about crime and policing in England, Wales and Northern Ireland, downloaded from http://data.police.uk/data/. The data is made available under the UK Open Government Licence. In the end I loaded over 13 million records covering August 2011 to October 2013. As I say, I was not entirely successful in using all the parts of Lumira that I wanted, but if I was going to fail then why not fail with a large dataset and at the speed of HANA.

1 show13mill.png

The screenshot above shows the Crime Count measure with over 13 million records.

Connecting Lumira to HCP

At this point I will assume that the calculation view in Stoyan's blog has been created. It also appears the @opensap course on the SAP HCP has used the same or a similar XS application. Connecting Lumira to the HCP can be achieved by using the tunnel that comes as part of the HANA Client download. In the screenshot of my connection I will leave the connection details in plain view, as the password gets reset at every logon.

Start the HCP tunnel

***edit 15.6.2015

Prompted by Venkadesh's comment below, I realised that Stoyan's blog has been updated to connect via the cloud connection method, which means the HCP tunnel is no longer required for his blog. For this blog's method of connecting Lumira to the SAP HCP, the tunnel is still required.

So I would suggest either the help page.

Setting Up the Console Client

or the SAP HANA Academy document http://scn.sap.com/docs/DOC-62450

These cover the SAPHCP tunnel requirements.

Also, from Stoyan's blog in step 2, I would still add a "system" and not a "cloud system", as I would use an open tunnel's details for hostname, instance number, user and password for the connection. These details from a connected HCP tunnel are the ones I use for Lumira as well.

****end of edit 15.6.2015

2 tunnel.png
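For reference, my tunnel was opened from the HCP console client with a command along these lines. This is a sketch only: the database ID is a placeholder, the account and user values are from my trial, and the exact parameters may vary with the SDK version, so check the console client help page linked above.

```shell
# Sketch only: <database-id> is a placeholder; check the console client
# documentation for your SDK version. The tunnel output prints the host
# (localhost), instance number, a database user and a one-time password --
# exactly the values Lumira's connection dialog needs.
neo open-db-tunnel -h hanatrial.ondemand.com -a p1248461150trial -u p1248461150 -i <database-id>
```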

Start Lumira and select Acquire new dataset

3 new dataset.png

Select "Connect to SAP HANA One" – we won't actually be connecting to HANA One but to our HCP HANA instance via the tunnel running on our local laptop/computer 😉

Hit Next and complete the logon details

4 logon details.png

The connection details come from the tunnel output (shown in the screenshot under Start the HCP tunnel above); the tunnel is what will actually be used to connect to the HCP.

The password is reset every time the tunnel is opened, so it has to be copied and pasted each time the tunnel is started. It is therefore essential to start the tunnel before connecting via Lumira.

Next locate your trial user account in the “Select a SAP Hana View” screen. I have scrolled down so my trial ID is at the top and as you can see there are a number of other users.

5 select hana view.png

In this example I will select SO_CV which I had created from following Stoyan’s blog.

6 select so_cv.png

Hit the create button.

In the screenshot below I have selected measures and attributes to create a chart.

  7 socv chart.png        

First Issue

The rank option in Lumira did not work

8 select rank.png

               9 query fails.png

Selecting this option breaks Lumira (my version is 1.13) and thereafter none of the charts/features work, so I always had to close the session. After simply acquiring the dataset again I was able to continue. I did a quick search and found a similar hit in note http://service.sap.com/sap/support/notes/1927144. It is not the exact same error, so probably not the same solution, and I moved on to try my own data. If the data selected was less than the standard Lumira 10,000-record limit I could achieve the same ranking feature by manually sorting and then filtering the data.

Loading my own dataset into an HCP table

After downloading the crime data mentioned previously I used some unix commands (I run Cygwin on my Windows 7 laptop) to collate all the police force files into one bigger file for each month. I also ran a script to check how wide the columns needed to be, as this can cause some headaches when uploading files. As these scripts are specific to the data I will not list them here, but I am happy to share them if required.
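To give a flavour of the collation, here is a minimal sketch using two tiny stand-in files. The filenames and columns are hypothetical, made up for illustration; the real monthly extracts have one CSV per police force with many more columns.

```shell
# Two tiny stand-in "police force" extracts for one month (hypothetical columns)
printf 'Month,Falls within,Crime type\n2013-10,Avon and Somerset,Burglary\n' > 2013-10-forceA.csv
printf 'Month,Falls within,Crime type\n2013-10,Bedfordshire,Robbery\n2013-10,Bedfordshire,Shoplifting\n' > 2013-10-forceB.csv

# Keep the header row from the first file only, then append the data rows
# from every force file for the month into one bigger file.
head -n 1 2013-10-forceA.csv > 2013-10-all.csv
tail -q -n +2 2013-10-forceA.csv 2013-10-forceB.csv >> 2013-10-all.csv

# One header row plus three data rows
wc -l < 2013-10-all.csv
```

Keeping a single header row matters later: HANA Studio's column mapping on import is much easier when the file has headers.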

Preference settings

There are two settings worth changing or experimenting with in the Modeler section of the preferences. If "Batch Size" is set to 0, HANA Studio will try to auto-calculate it; I set it to 10000. Probably 0 would be fine, but I have left it at 10000 and will cover data load times in the next section. The "Decision Maker Count" relates to how many rows HANA Studio samples to calculate data types and column widths for data loads. I set this to the maximum because I had, let's say, a few load failures due to data not fitting into the column width. I also created a script to calculate this setting for each file.

               10 preferences.png
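My width pre-check script was along these lines: a short awk scan that reports the widest value seen in each column, so NVARCHAR lengths can be fixed before a load fails rather than after. This is a sketch that assumes a simple file with no quoted, embedded commas; the sample file and its columns are hypothetical, and the real crime files need more care.

```shell
# Hypothetical sample file with no quoted/embedded commas
printf 'Crime type,Location\nBurglary,On or near Shelton Road\n' > sample.csv

# Print the maximum width seen in each column (column number, then width)
awk -F',' '{ for (i = 1; i <= NF; i++) if (length($i) > w[i]) w[i] = length($i) }
           END { for (i = 1; i <= NF; i++) print i, w[i] }' sample.csv
```

On this sample it reports a width of 10 for column 1 (the header "Crime type" is the widest value) and 23 for column 2, which is what I would size the NVARCHAR columns against.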

Load data from a local file

From the HANA Studio menu: File -> Import

                  11 data from local file.png

I created a crime table to load the data into.

                     12 table settings.png

I found it always helps to load files with a header row, as the mapping of columns is then a lot easier. I always loaded the data into my NEO_ schema.

The next screen is where the "Decision Maker Count" comes into play, as it determines how HANA Studio presents the data mapped to data types.

Here is an original screenshot of one file:

                13 first showing width.png

On loading subsequent files, I chose an existing table.

                14 load existing table.png

Using the drop-down, if the file has matching headers it is a simple case of "Map By Name"; otherwise linking the columns is a manual process.

                15 map by name.png

A screen shot of my final table settings.

                16 final showing width.png

It may not be easy to tell, but there are some big and not-so-big changes to the column lengths.

I did find I was able to adjust the data type settings from the SQL Console if required, e.g.

     alter table "NEO_7YZMZC83MT9WX3OCFHA5HDEQX"."crime" alter ("Location" NVARCHAR(58));

468,140 records uploaded in 818 seconds

                17 480000 records loaded.png


I did find that loading times extended if I tried to load too many records at once.

Final table stats

The current runtime object of my table is over 640 MB; I loaded the files over a few weeks in my spare time.

                18 runtime object.png

Create an analytic view in HANA Studio

At this point I will state again: this is my blog, with information based on self-learning in regards to HANA view creation. It is always better to check the information provided.

From the HANA Systems view I opened the hihanaxs package (created from Stoyan's blog).

I created an analytic view especially for this blog…

                19 create analytic view.png

Enter a name for the View

                   20 name view.png

Open the Catalog and the NEO_xxx schema to locate the crime table, then drag the table to the "Data Foundation".

                       21 drag crime.png

Then I selected the columns I wanted for the view.

                       22 select columns.png

I selected Crime_type twice so eventually I could set up one as a measure and the other as an attribute.

After selecting Semantics I then de-selected "Enable Analytic Privilege".

                         23 semantics.png

Next, auto-assign the allocation of measures and attributes.

                          24 auto assign.png

I did change longitude and latitude to attributes.

Also, one of the Crime_type columns was renamed to crime_count and changed to a measure for Lumira to use.

                              25 rename crimecount.png

I then hit the green arrow to save and activate my view. (you may need to refresh the view in HANA Studio to pick this up)

Even though the view is active, Lumira will not be able to pick up the new view yet; some authorisation settings still need to be made.

                                 26 auth error.png

Authorisations for the view

At this point I will state again: this is my blog, with information based on self-learning in regards to HANA authorisations. It is always better to check the information provided.

Alter model access file

Based on information in Stoyan's blog I added my new views to the model_access.hdbrole as per the following. I did originally make some errors following the blog and placed some views in the base package hihanaxs, so I altered the statements as below.

role p1248461150trial.hihanaxs.hihanaxs.roles::model_access
{
    sql object p1248461150trial.hihanaxs:CANA1.analyticview : SELECT;
    sql object p1248461150trial.hihanaxs:TEST.analyticview : SELECT;
    sql object p1248461150trial.hihanaxs:MULT1.analyticview : SELECT;
    sql object p1248461150trial.hihanaxs:CRIMEBLOG.analyticview : SELECT;
    sql object p1248461150trial.hihanaxs:SO_CV.calculationview : SELECT;
    sql object p1248461150trial.hihanaxs:CVIEW.calculationview : SELECT;
    sql object p1248461150trial.hihanaxs.hihanaxs:CVIEW.attributeview : SELECT;
    sql object p1248461150trial.hihanaxs.hihanaxs:JOIN.attributeview : SELECT;
}

Then I started the SQL Console and ran the following SQL to activate the new authorisations.

call "HCP"."HCP_GRANT_ROLE_TO_USER"('p1248461150trial.hihanaxs.hihanaxs.roles::model_access','p1248461150');

CALL "HCP"."HCP_GRANT_SELECT_ON_ACTIVATED_OBJECTS";

I could then connect successfully.

         

                                            27 chart connected success.png

Second Issue

The main feature I wanted to use in Lumira with this dataset was the Geo charts. I was unable to proceed, as the location details were unknown to Lumira and for some reason Latitude and Longitude remained greyed out. No matter what setting I tried, measure or attribute (or even changing decimal places), I could not get this feature to work. Maybe it is not clear in the screenshot below, but in "Create a geographic hierarchy" only the "By Names" selection is available.

                          28 geo fails.png

Simple analysis of the dataset

The main intention of my blog is to show there is a way to use the free copy of Lumira with HANA via the free trial account of the HCP. However, as "Anti-social behaviour" is the top crime, I thought I would drill down into it.

Seasonal Anti Social behaviour

There does appear to be peaks and troughs through the seasons.

                     29 seasonal anti social.png

I was intending to name the place with the highest anti-social crime figures in the country over the time period in my dataset. However, as I type, I do not particularly want to do that here on my blog. I do now have a dataset that I am interested in, though, and can explore more of Lumira and the SAP HCP; a dataset I will use for future adventures into the HCP & Lumira world.


9 Comments


  1. Tammy Powlas

    Nicely done and thank you for the step-by-step about what worked and what didn’t work.

    I notice you didn’t post this at 13:00 Wednesday – hopefully the views will still be as good!

    1. Robert Russell Post author

      Hi Tammy,

Thank you, let's see about the views. Maybe an idea for me to make a follow-up analysis to see how 22:30 (my time) on a Monday works out 🙂

      Cheers

      Robert

  2. Irshaad Bijan Adatia

    It’s great to see a HANA connection being shown step-by-step. I’m working on a small project also from open.sap courses. Hopefully some sharing will come soon as well~~

    Thanks for the great work!! Part Two coming soon? 🙂 🙂

    Bijan

    1. Robert Russell Post author

      Hi Bijan,

Thanks for taking the time to comment; as for my part two, it's taking shape but not coming soon.

      I look forward to you posting your project soon 🙂

      Cheers

      Robert

  3. Suseelan Hari

    Hi Robert,

Fantastic work! I'm impressed!

    Keep up the good work!

    I like the way you have documented, generated and presented.

    Thanks for sharing step by step procedure related to SAP Lumira.

    Have a nice day!

    Regards,

    Hari Suseelan

  4. Venkadesh S

    Hi Robert Russell,

    Thanks for sharing good document.

Please explain more about "Connecting Lumira to HCP". I don't know how you created the hostname, instance number, user and password in the tunnel. Please give more information for my understanding.

    Thanks,
    Venkat

    1. Robert Russell Post author

      Hi Venkadesh,

      I was going to point you at Stoyan’s blog which I mention at the start of my blog, however I just re-read it to check it contained the right information.  I can see that he has updated it to connect via the cloud connection method which means the HCP tunnel is no longer required for his blog.

      For this blog’s method to connect Lumira to the SAPHCP the tunnel is still required.

      So I would suggest either the help page.

      Setting Up the Console Client

      or the SAP HANA Academy document http://scn.sap.com/docs/DOC-62450

      These cover the SAPHCP tunnel requirements.

Also, from Stoyan's blog in step 2, I would still add a "system" and not a "cloud system", as I would use an open tunnel's details for hostname, instance number, user and password for the connection. These details from a connected HCP tunnel are the ones I use for Lumira as well.

      Hope that helps and thanks for the comment as I will add a section just above the tunnel in this blog.

      Regards

      Robert

