Aggregations of stores selected in a map

The purpose of this project was to test the geographic capabilities of SAP HANA. The goal was to retrieve sales information grouped by geographic location through a familiar interface such as Google Maps. The project is available on GitHub: christianladron/Geo_Aggs-Map.


The motivation of the development was to create a way to obtain sales information by region independently of the regions previously defined in the system, allowing the sales of a set of stores contained in an arbitrary region of the map to be monitored. In this case the region is defined by a polygon, which can be drawn in a familiar Google Maps interface by clicking on the locations of its vertices.

Definition of the geographic selection:

[Screenshots: drawing the selection polygon by clicking its vertices on the Google Maps interface]

Result:

[Screenshot: aggregated results for the stores inside the selected polygon]


This was solved using HANA's geographic objects and their methods to generate the content, an XS service to return the results of the geographic queries, and SAPUI5 to build the interface components. HANA's geographic objects can be exported to the WKT standard, which Google Maps does not consume directly, so we used the Wicket library (https://github.com/arthur-e/Wicket) to translate between the two formats.


First we created the tables Sales, Stores and Categories, filled with random data. The Stores table has the Latitude and Longitude of each store; the Sales table holds the data records of each sale, including the store and its corresponding category; and the Categories table holds the different categories that group the sales.

Sales table definition:

table.schemaName = "GEO_AGG";
table.tableType = COLUMNSTORE;
table.columns = [{name = "ID"; sqlType = INTEGER; nullable = false;},
    {name = "STORE"; sqlType = INTEGER; nullable = false;},
    {name = "CATEGORY"; sqlType = INTEGER; nullable = false;},
    {name = "VOLUME"; sqlType = DOUBLE; nullable = false;},
    {name = "UNITS"; sqlType = BIGINT; nullable = false;},
    {name = "PROMOS"; sqlType = DOUBLE; nullable = false;},
    {name = "PREDICTION"; sqlType = DOUBLE; nullable = false;}];
table.primaryKey.pkcolumns = ["STORE", "CATEGORY", "ID"];

Stores table definition:

table.schemaName = "GEO_AGG";
table.tableType = COLUMNSTORE;
table.columns = [{name = "ID"; sqlType = INTEGER; nullable = false;},
    {name = "STATE"; sqlType = SHORTTEXT; nullable = false; length = 100;},
    {name = "LONGITUDE"; sqlType = DOUBLE; nullable = false;},
    {name = "LATITUDE"; sqlType = DOUBLE; nullable = false;}];
table.primaryKey.pkcolumns = ["ID"];

Categories table definition:

table.schemaName = "GEO_AGG";
table.tableType = COLUMNSTORE;
table.columns = [{name = "ID"; sqlType = INTEGER; nullable = false;},
    {name = "CATEGORY"; sqlType = SHORTTEXT; nullable = false; length = 100;},
    {name = "DESCRIPTION"; sqlType = SHORTTEXT; nullable = false; length = 250;}];
table.primaryKey.pkcolumns = ["ID"];

Then the view Geo_Stores was created. It has two calculated fields: Location, an ST_Point object, and Location_txt, the location of the point in the standard WKT format.

Geo_Stores view definition:

schema = "GEO_AGG";
query = "SELECT *, NEW ST_POINT(Latitude, Longitude) AS Location, NEW ST_POINT(Latitude, Longitude).ST_AsWKT() AS Location_txt \
FROM \"GEO_AGG\".\"mexbalia.Geo_Agg.DB::Stores\"";

The view Aggregations calculates the sum of all sales grouped by store and category; it was created to simplify the query used in the XS service.

Aggregations view definition:

schema = "GEO_AGG";
query = "SELECT \"STORE\", \
\"CATEGORY\", \
SUM(\"VOLUME\") AS \"VOLUME\", \
SUM(\"UNITS\") AS \"UNITS\", \
SUM(\"PROMOS\") AS \"PROMOS\", \
SUM(\"PREDICTION\") AS \"PREDICTION\" \
FROM \"GEO_AGG\".\"mexbalia.Geo_Agg.DB::Sales\" GROUP BY \"STORE\", \"CATEGORY\" ORDER BY \"STORE\" ASC";

Then the services for web consumption were created. One is the file Services.xsodata, which exposes the information of the Categories table. The service Stores_Coords.xsjs accepts as parameters a string with the definition of a polygon in WKT format and a string with a list of the category IDs of interest; its response is a list of the stores contained in the polygon together with the sum of all sales that belong to one of the selected stores and one of the selected categories.

OData service definition:

service {
    "GEO_AGG"."mexbalia.Geo_Agg.DB::Categories" as "categories";
}

Stores_Coords.xsjs service definition:

var polygon = $.request.parameters.get("polygon");
var categories = $.request.parameters.get("categories");

// An empty selection "()" would be invalid SQL; rewrite it to "(0)".
if (categories === "()") { categories = "(0)"; }

// Whitelist check: only digits, commas and parentheses may reach the query.
if (/^[0-9,()]+$/.test(categories)) {
    var connection = $.db.getConnection("mexbalia.Geo_Agg.UI::Anonymous_Access");

    // First query: stores contained in the polygon received as a WKT string.
    var GetStores = connection.prepareStatement(
        "SELECT * FROM \"GEO_AGG\".\"mexbalia.Geo_Agg.DB::Geo_Stores\" " +
        "WHERE NEW ST_Polygon(?).ST_Contains(location) = 1");
    GetStores.setString(1, polygon);
    var ResultRow = GetStores.executeQuery();

    var StoresResponse = [];
    var current_object = {};
    var StoresList = [];
    while (ResultRow.next()) {
        var StoreID = ResultRow.getString(1);
        current_object = {"ID": StoreID, "STATE": ResultRow.getString(2), "LOCATION": ResultRow.getString(6)};
        StoresList.push(StoreID);
        StoresResponse.push(current_object);
    }
    ResultRow.close();

    // Expression list for the IN clause; "(0)" keeps the query valid when empty.
    var StoresList_str = "(" + StoresList.toString() + ")";
    if (StoresList.length === 0) { StoresList_str = "(0)"; }

    // Second query: aggregations restricted to the contained stores and
    // the selected categories.
    var AggregationsResponse = [];
    var GetAggregations = connection.prepareStatement(
        "SELECT SUM(Sales.volume) AS volume, SUM(TO_BIGINT(Sales.units)) AS units, " +
        "SUM(Sales.promos) AS promos, SUM(Sales.prediction) AS prediction " +
        "FROM \"GEO_AGG\".\"mexbalia.Geo_Agg.DB::Aggregations\" AS Sales " +
        "WHERE Sales.store IN " + StoresList_str + " AND Sales.category IN " + categories);
    ResultRow = GetAggregations.executeQuery();

    $.response.contentType = "application/json";
    while (ResultRow.next()) {
        current_object = {"VOLUME": ResultRow.getString(1), "UNITS": ResultRow.getString(2),
                          "PROMOS": ResultRow.getString(3), "PREDICTION": ResultRow.getString(4)};
        AggregationsResponse.push(current_object);
    }

    var respuesta = {map: StoresResponse, table: AggregationsResponse};
    $.response.setBody(JSON.stringify(respuesta));

    ResultRow.close();
    GetStores.close();
    GetAggregations.close();
    connection.close();
} else {
    $.response.contentType = "text/plain";
    $.response.setBody("Invalid Categories List");
}
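The whitelist test applied to the categories parameter can be exercised on its own. The sketch below factors it into a helper (isValidCategoryList is our name for illustration, not part of the project):

```javascript
// Accept only digits, commas and parentheses, e.g. "(1,4,7)".
// This guards the string before it is concatenated into SQL.
function isValidCategoryList(categories) {
    return /^[0-9,()]+$/.test(categories);
}

isValidCategoryList("(1,4,7)");    // true
isValidCategoryList("(1) OR 1=1"); // false: spaces and letters are rejected
```

Because the store IDs come from the database and the categories string passes this check, neither value concatenated into the second query can carry arbitrary SQL.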

The most important part of the process is the Stores_Coords.xsjs service. It executes a query that creates an ST_Polygon object from the WKT string parameter and uses the result of its ST_Contains method as the condition of the WHERE clause, so only the stores inside the defined polygon are selected.

This is the query used to get the contained stores:

SELECT * FROM "GEO_AGG"."mexbalia.Geo_Agg.DB::Geo_Stores" WHERE NEW ST_Polygon('WKT Polygon definition').ST_Contains(location) = 1

The results of this query are used to build an expression list with the IDs of the contained stores. The next query then calculates the required aggregations, using this list together with the expression list from the categories input parameter.
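The list construction can be sketched as a small helper (buildInList is a hypothetical name; the service inlines this logic):

```javascript
// Turn an array of store IDs into an SQL expression list such as "(3,7,12)".
// An empty result set becomes "(0)" so the IN clause stays syntactically valid.
function buildInList(ids) {
    if (ids.length === 0) { return "(0)"; }
    return "(" + ids.join(",") + ")";
}
```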

select sum(Sales.volume) as volume, sum(Sales.units) as units, sum(Sales.promos) as promos, sum(Sales.prediction) as prediction from "GEO_AGG"."mexbalia.Geo_Agg.DB::Aggregations" as Sales where Sales.store in (Expression list of stores) and Sales.category in (Expression list of categories)

The first result of these queries forms the response of the service. The interface requests the service through an XHR call; the response is passed to a WKT-to-Google-Maps converter called Wkt_conv to obtain the locations in a Google Maps format and add them to the map, while the second result is bound to the bar graph.
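The request URL for that XHR call can be assembled along these lines (a sketch; the service path is an assumption based on the package names used in the project, and buildServiceUrl is our helper name):

```javascript
// Build the query string for Stores_Coords.xsjs; both parameters must be
// URL-encoded because the WKT polygon contains spaces, commas and parentheses.
function buildServiceUrl(polygonWkt, categoryList) {
    return "/mexbalia/Geo_Agg/UI/Stores_Coords.xsjs" +
        "?polygon=" + encodeURIComponent(polygonWkt) +
        "&categories=" + encodeURIComponent(categoryList);
}
```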

Example of WKT to Google conversion:

wkt_conv.read('WKT geometry definition');
var GoogleObj = wkt_conv.toObject();

Example of Google to WKT conversion:

wkt_conv.fromObject(GoogleObj);
var Wkt_string = wkt_conv.write();

The categories input parameter of the Stores_Coords.xsjs service is generated from the checkboxes of the UI, and the polygon parameter is generated from the polygon defined by the user on the map, converted to WKT using Wkt_conv.
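For a simple polygon, the text that Wkt_conv produces can be sketched without the library (the coordinate order follows the project's ST_POINT(Latitude, Longitude) usage, which is an assumption; the project itself relies on Wicket for the real conversion):

```javascript
// Sketch: serialize map vertices into a WKT polygon. The ring is closed by
// repeating the first vertex, as WKT requires.
function verticesToWkt(vertices) {
    var ring = vertices.concat([vertices[0]])
        .map(function (v) { return v.lat + " " + v.lng; })
        .join(",");
    return "POLYGON ((" + ring + "))";
}
```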

This is a simple way to solve the problem when the number of stores is small enough for a browser to handle; with a large number of stores, creating a grid and using the ST_SnapToGrid function becomes necessary.

4 Comments

  1. Joseph Reed

    Great work!!! Really helpful Christian Ladrón de Guevara. This should be very simple, but I'm an ABAP programmer and was not familiar with Eclipse until recently. I'm having trouble taking code from GitHub and loading it properly into a project. I use the SAPUI5 project type, then I add any folders and resources like js files to the root directory. However, the problem comes when I attempt to specify the source. For example, <script src="/sap/ui5/1/resources/sap-ui-core.js"> works; however, for custom loaded files like src="/mexbalia/Geo_Agg/UI/src/wicket.js" I get an "unable to find source" error. I imagine it's the way I lay out my folder structure. However, /mexbalia/ is not in the GitHub repository as a folder, and I don't see jQuery setting the path /mexbalia/.

    Could you help me understand a simple way of knowing how to reference the source of a js file? I have other OpenUI5 libraries I would like to reference also, but can't seem to get the code to find them. I'm not sure if it's a missing '/' or something else. I have searched for another post which might explain this in a way I can understand, but I have not found one yet. You seem to be very well educated about OpenUI5 and how to write these source references. I would appreciate your help with this.

    1. Christian Ladrón de Guevara Post author

      Hi Joseph, thanks for your comment. I hadn't realized that I didn't write the installation steps. You need to copy the repository into an XS project called Geo_Agg, shared into the content package mexbalia.

      /sap/ui5/1/resources/sap-ui-core.js works fine because there is a package route sap.ui5.1.resources::sap-ui-core.js, but you don't have wicket.js in the route mexbalia.Geo_Agg.UI.src.
