Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
JonGooding
Advisor
In this write-up, I'll cover a scenario close to home for me, while also showing off some of the newer SAP Business Technology Platform features. If it were a point-to-point run, it would start in the town of CSV, stop in HANA Cloud for some persistence, get dressed in the Business Application Studio, and do the final showcase in the SAP Analytics Cloud.


For those who know me, I like my running. Brisbane, Australia has had the usual summer heat over recent months, and with that comes the usual management of hydration and refuelling on the long runs. Not wanting to carry excessive water in my backpack, I sometimes plan my runs along paths where public taps are available - so I can use the taps instead of storing all the water in my pack.

So I have some data sets available to me:

  • Brisbane City Council open data on public taps

  • Garmin / Strava tracking of my runs

Overlapping the two, I can get some novel runs in while staying close to water stops when required.

I'll also use the latest SAP Cloud technologies:

  • SAP HANA Cloud - to store the data and perform spatial and graph networking

  • SAP Business Application Studio - to model and prepare the data for consumption

  • SAP Analytics Cloud - to visualise the data


As part of the preparation, I'll also use:

  • DBeaver - an open-source tool for loading data


Get the Data into SAP HANA Cloud



  1. Using the Brisbane City Council open data, download the CSV to the local machine

  2. Connect DBeaver to the HANA Cloud instance. There are plenty of blogs on how to do this.

  3. Import the dataset into HANA Cloud instance:



4. The table was created as PUBLIC_DRINKING with latitude and longitude columns only, so the spatial point for the map still needs to be derived. Creating a view means the base table can keep being updated while the view always converts the coordinates to an ST_POINT:
CREATE VIEW "AU_QLD_BCC_PUBLIC_DRINKING_GEO"
AS SELECT *, NEW ST_POINT(X,Y).ST_SRID(4326).ST_TRANSFORM(3857) "ST_POINT_3857",
NEW ST_POINT(X,Y).ST_SRID(4326).ST_TRANSFORM(1000004326) "ST_POINT_4326"
FROM PUBLIC_DRINKING;

This will provide the points on the maps. I have created both the SRID 3857 and SRID 4326 versions to test some scenarios later.
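Under the hood, ST_TRANSFORM(3857) reprojects WGS84 longitude/latitude into planar Web Mercator metres. As a rough illustration of what that conversion does, here is a minimal pure-Python sketch (no HANA required - the database does this for you):

```python
import math

# Sketch of what ST_TRANSFORM(3857) does to a WGS84 (SRID 4326)
# longitude/latitude pair: project it to Web Mercator metres.
R = 6378137.0  # WGS84 semi-major axis in metres

def wgs84_to_web_mercator(lon: float, lat: float) -> tuple[float, float]:
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y

# Roughly central Brisbane
x, y = wgs84_to_web_mercator(153.0251, -27.4698)
print(round(x), round(y))  # planar metres, handy for distance calculations
```

The planar coordinates are why 3857 is convenient for the Voronoi work below, while the 4326 version stays in familiar degrees.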

A quick map view in DBeaver shows the taps across the Brisbane City Council area:


All public taps in Brisbane - 1695 of them


Viewing Mt Coot-tha in particular:


Mt Coot-tha Taps


Now, the closest tap, depending on where you are, can be worked out using a Voronoi diagram:

 


Mt Coot-tha Closest Tap Map


Adding the Voronoi cell to the view (the ST_POINT expressions move into a subquery so that ST_VoronoiCell can reference the column alias):
DROP VIEW "AU_QLD_BCC_PUBLIC_DRINKING_GEO";
CREATE VIEW "AU_QLD_BCC_PUBLIC_DRINKING_GEO"
AS SELECT *,
ST_VoronoiCell("ST_POINT_3857", 5.0) OVER () AS CATCHMENT_AREA
FROM (SELECT *,
NEW ST_POINT(X,Y).ST_SRID(4326).ST_TRANSFORM(3857) "ST_POINT_3857",
NEW ST_POINT(X,Y).ST_SRID(4326).ST_TRANSFORM(1000004326) "ST_POINT_4326"
FROM PUBLIC_DRINKING);

 

So now I'll load a Garmin / Strava run over this. I have a few methods to do this, which I'll cover in my next blog - essentially using Python to upload the GPX file directly to HANA Cloud.
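The upload itself is for the next blog, but as a sketch of the parsing half: GPX is plain XML, so the trackpoints can be pulled out with the Python standard library alone. The two-point GPX snippet below is made up; a real Garmin/Strava export has the same trkpt structure.

```python
import xml.etree.ElementTree as ET

# Made-up two-point GPX sample; real exports share the <trkpt> structure.
GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="-27.4770" lon="152.9715"><ele>45.2</ele></trkpt>
    <trkpt lat="-27.4772" lon="152.9718"><ele>46.0</ele></trkpt>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def parse_trackpoints(gpx_text: str) -> list[tuple[float, float, float]]:
    """Extract (lat, lon, elevation) rows from a GPX document."""
    root = ET.fromstring(gpx_text)
    rows = []
    for pt in root.findall(".//gpx:trkpt", NS):
        ele = pt.find("gpx:ele", NS)
        rows.append((float(pt.get("lat")), float(pt.get("lon")),
                     float(ele.text) if ele is not None else 0.0))
    return rows

print(parse_trackpoints(GPX))
```

Rows in this shape are then straightforward to bulk-insert into a HANA Cloud table - the connection and insert details are the part I'll save for the next blog.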

So if I pick one of the runs:


Garmin Tracked Run


 

Now, I just need to get the locations from the run to the nearest tap.

So to acquire the nearest tap from each point, the Voronoi diagram makes it a simple containment query - each run point falls inside exactly one tap's cell. Ideally I would also load the actual tracks and route along them, as a tap that is close in a straight line may still be hard to reach over hills etc.
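A point's Voronoi cell belongs to its nearest site, so the containment query is equivalent to a nearest-neighbour lookup. For intuition, here is a brute-force pure-Python sketch of that idea; the tap names and all coordinates are made up for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_tap(point, taps):
    """Brute-force nearest neighbour -- what Voronoi containment encodes."""
    return min(taps, key=lambda t: haversine_m(point[0], point[1], t[1], t[2]))

# Made-up taps: (name, lat, lon)
taps = [
    ("Tap near JC Slaughter Falls", -27.4745, 152.9630),
    ("Tap near Simpson Falls", -27.4861, 152.9590),
]
run_point = (-27.4750, 152.9640)
print(nearest_tap(run_point, taps)[0])
```

The database version scales much better, of course: the Voronoi cells are computed once in the view, and each run point is resolved with a single ST_WITHIN predicate.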

So viewing the nearest taps for the run:


Taps near run


Quite a few of the taps just outside the perimeter of the run I didn't know existed - so they are always good options to know about if you run short of water.

From a Council perspective, these are the taps that I would be using (for maintenance etc):
SELECT BCC.PARK_NAME, BCC.ITEM_DESCRIPTION, count(BCC.PARK_NAME)
FROM AU_QLD_BCC_PUBLIC_DRINKING_CLOSEST_GEO BCC,
ACTIVITY_DETAILS_GEO ACT
WHERE ACT.ST_POINT_3857.ST_WITHIN(BCC.CATCHMENT_AREA) = 1
AND ACT.ACTIVITY_ID = '955636808'
GROUP BY BCC.PARK_NAME, BCC.ITEM_DESCRIPTION;
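The GROUP BY above is just tallying run points per tap catchment. The same shape in a quick pure-Python sketch, with made-up park names standing in for the per-point ST_WITHIN results:

```python
from collections import Counter

# Made-up mapping of each run point to the park whose Voronoi cell
# contains it -- i.e. the per-point outcome of the ST_WITHIN predicate.
point_to_park = [
    "JC Slaughter Falls Picnic Area",
    "JC Slaughter Falls Picnic Area",
    "Simpson Falls Picnic Area",
]
usage = Counter(point_to_park)
print(usage.most_common())  # park -> number of run points in its catchment
```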


There are a lot more queries and analyses that could be done with spatial! But I'll save that for another day.

Modelling with SAP Business Application Studio


Now, having a better understanding of the data, I will use the Business Application Studio to make the data available to other consumption tools for visualisation.

Business Application Studio provides a much easier modelling capability. Below are the fundamental steps to get the data from HANA Cloud into Business Application Studio:

  1. Provision the SAP Business Application Studio

  2. Create a DEV Space - HANA Based Application

  3. Create a HANA Project

  4. Deploy the initial Container using the highlighted rocket icon




  5. Next we will add the HANA schema to the project, but first we need to grant the User Provided Service the required privileges on the underlying objects


      -- Creating a role for simplicity and adding the required objects to the role
      -- The section below is for the object owner
      create role SELECT_ON_ACTIVITIES_WITH_GRANT;
      grant select on ACTIVITIES to SELECT_ON_ACTIVITIES_WITH_GRANT with grant option;
      grant select on ACTIVITY_DETAILS to SELECT_ON_ACTIVITIES_WITH_GRANT with grant option;
      grant select on PUBLIC_DRINKING to SELECT_ON_ACTIVITIES_WITH_GRANT with grant option;
      grant select on AU_QLD_BCC_PUBLIC_DRINKING_GEO to SELECT_ON_ACTIVITIES_WITH_GRANT with grant option;
      grant select on AU_QLD_BCC_PUBLIC_DRINKING_CLOSEST_GEO to SELECT_ON_ACTIVITIES_WITH_GRANT with grant option;

      grant SELECT_ON_ACTIVITIES_WITH_GRANT to "_SYS_DI#BROKER_CG"._SYS_DI_OO_DEFAULTS;

      -- The section below is for the application user
      create role SELECT_ON_ACTIVITIES;
      grant select on ACTIVITIES to SELECT_ON_ACTIVITIES;
      grant select on ACTIVITY_DETAILS to SELECT_ON_ACTIVITIES;
      grant select on PUBLIC_DRINKING to SELECT_ON_ACTIVITIES;
      grant select on AU_QLD_BCC_PUBLIC_DRINKING_GEO to SELECT_ON_ACTIVITIES;
      grant select on AU_QLD_BCC_PUBLIC_DRINKING_CLOSEST_GEO to SELECT_ON_ACTIVITIES;
      grant SELECT_ON_ACTIVITIES to BROKER_USER.RT_DEFAULTS;

      create role "AAforOO";
      grant SELECT_ON_ACTIVITIES_WITH_GRANT to "AAforOO";

      create role "AAforAP";
      grant SELECT_ON_ACTIVITIES to "AAforAP";

      -- Now create the technical User Provided Service user and grant the privileges
      create user ACTIVITY_UPS password "SAPWelcome123" NO FORCE_FIRST_PASSWORD_CHANGE set usergroup default;
      grant "AAforAP","AAforOO" to ACTIVITY_UPS WITH ADMIN OPTION;




  6. Next, add the HANA schema to the project by "Adding a Database Connection" (highlighted) or by adding an existing UPS to the project directly. Use the user that was just created in the script above.




  7. Deploy the project with the new cross container service added.

  8. To link the container's objects with the technical user's objects, add a .hdbgrants file in the src directory:


      {
        "activity_analytics_ups": {
          "object_owner": {
            "roles": [ "AAforOO" ]
          },
          "application_user": {
            "roles": [ "AAforAP" ]
          }
        }
      }




  9.  So the following should match:




  10. So that's the configuration out of the way.


Now I have the objects required in the Business Application Studio, I can do some modelling and analysis on the data sets I have brought in.

So creating the Calculation Views for the Running Activities:

  1. Using the "Create HANA Database Artifact" wizard:




  2. Then view the Calculation View in the Graphical Editor, adding a join and looking up the required tables. These objects come from the external service created earlier.

  3. Then select the fields that are appropriate as measures, leaving latitude and longitude as a double type.

  4. This view should not contain any Geospatial ST_POINT columns.

  5. Deploy the Calculation View - this should give something similar:


      Deploying to the container "ACTIVITY_ANALYTICS_HDI_DB_1"... ok (0s 974ms)
      No default-access-role handling needed; global role "ACTIVITY_ANALYTICS_HDI_DB_1::access_role" will not be adapted
      Unlocking the container "ACTIVITY_ANALYTICS_HDI_DB_1"...
      Unlocking the container "ACTIVITY_ANALYTICS_HDI_DB_1"... ok (0s 0ms)
      Deployment to container ACTIVITY_ANALYTICS_HDI_DB_1 done [Deployment ID: none].
      Deployment ended at 2021-03-26 03:53:40
      (2s 484ms)




  6. For simplicity, next I will create the GEO Location View, using the same process to display the ST_POINTS live from HANA Cloud.

    1. Repeat the process for creating the Calculation View above, with the following changes

    2. Put the Calculation View in the SAP_BOC_SPATIAL folder, this will allow the SAC Model to use these Location points

    3. Create the Calculation View as a DIMENSION - Not a Cube

    4. Only bring across the ACTIVITY_ID (key) and the ST_POINT fields (remember I created the view above adding the ST_POINT with SRID 3857)

    5. Make sure the column names in the GEO view don't have the same column names as the Cube View.

    6. Deploy the DIMENSION Calculation View for the GEO Calc View with the ST_POINT reference.



  7. Finally you will need to grant the HDI container to the users that will be connecting to SAC (this only needs to be done once per container):


      GRANT "ACTIVITY_ANALYTICS_HDI_DB_1::access_role" TO DBUSER;




  8. That's it. Two views are all that's required to create the location-based model in SAC.


SAP Analytics Cloud Configuration



  1. In SAP Analytics Cloud (SAC), create the connection to the HANA Cloud instance using the user (from step 7 above) that was granted access to the container. All you need is the host server location (no port number) and the username / password.

  2. Create the SAC model based on the live HANA connection, and then under Location Dimension, add the dimension GEO view created earlier:


    1. Once this model is saved, then create a story and add the geospatial layers as required.



  3. Now the model is created, create your story and associated analytics. I have not put any effort into making the visualisation look good; I am all about the data access at the moment. A basic story, with the taps and the start locations of some of my activities:


  4. There is a lot more you can do in SAC, which I won't cover here. What I have done next is link the map and the two basic charts. Then, catering for dog walks around Mt Coot-tha, I can filter on Mt Coot-tha and then on taps that have dog bowls - so I know where I can take the dog without running out of water.


  5. And easily overlay the locations where I have started my activities in the past:




Now I have a framework that lets me load additional data sets easily and perform spatial analytics with a rich front-end visualisation. I haven't built the comprehensive experience it deserves - there are plenty of blogs out there that do.

I have provided screenshots of most of the important bits. If you run into any issues, let me know and I can elaborate.

Happy running!