By Kevin Small

Analysing UK Road Traffic Accident data with HANA and SAPUI5

This blog demonstrates how to load, transform and analyse UK road traffic accident data on mobile devices with HANA, SAPUI5 and Google Maps.  The end results look like this on an iPhone, analysing data around London with a heatmap view and also a marker view with accident details:

[Image: iPhone screenshots showing the heatmap view and the marker view with accident details]

The same SAPUI5 mobile application runs across browsers on mobiles and tablets, with touch support to pan and zoom around the map, and it also runs in supported desktop browsers.  Here is the same application on an iPad analysing Birmingham with a heatmap; you can see a “hot” area at a roundabout in the south-west of the map:

[Image: iPad heatmap of Birmingham]

This blog was inspired by the UK Department for Transport (DfT) online tool which allows analysis of data (with markers) by local authority.  The aim was to take that work further and display heatmaps, larger aggregations of data and also to make the analysis application aware of mobile devices.  The finished app runs on a mobile device, and so it can retrieve the user’s location and immediately display accident statistics in the vicinity.

I’ll cover the end-to-end flow, starting from the source data and tables, then some data transformation using XSJS, the creation of analytic and attribute views, OData services, and finally the creation of a generic mobile client in SAPUI5.  The client makes all the Google Maps calls to get maps, markers, infowindows and heatmaps.  All the HANA objects and source code used are available in GitHub.  I also supply some test data for you to play with, as the real data set requires a licence (it is free, but you need to apply) [Update Oct 2014 – see notes at end of blog, the source datasets are now much easier to access].  The data flow looks like this:


Plenty of blogs have already covered the details of creating each of these objects, so I’ll just pick out the interesting things.

1) Source Data

The DfT provide a rich dataset with information on location (to within 10 meters), date, time, weather, road surface conditions, lighting, vehicles and so on.  There are around 1 million records per year across several tables, so it is not vast and fits comfortably into my HANA dev box.  I FTP’d the files to HANA and then imported them, but for the test data I supply you could do a straight import from HANA Studio.

2) Tables

The main tables are defined in a single file, Road.hdbdd, which gives the convenience of shared type definitions across tables.  There is one additional table, “constants.hdbtable”, that auto-loads some master data when it is activated.

3) Data Transformation

This was the most time-consuming part of the project.  The source data gives accident locations in a coordinate system called Easting-Northing, the system used by UK Ordnance Survey maps.  Most GPS-type applications use the more familiar Latitude-Longitude coordinate system used by mobile devices and Google Maps.  Conversion between the systems is surprisingly complicated; the gist of it is to do with projecting a round Earth onto a flat plane.  Then it turns out there is no such thing as a flat plane (they all have to curve) or a fixed point (because the plates of the Earth move), so any coordinate conversion is approximate.  It seems you literally cannot trust the ground you walk on!  The bottom line is that the coordinate conversion from Easting-Northing to Latitude-Longitude requires hundreds of lines of calculations and won’t be happening in real time for every point.

Therefore, the source data needs to be transformed to add in Latitude and Longitude, and this is where a HANA design decision by SAP really helped me out.  JavaScript is so ubiquitous that someone had already written a library to do the conversion, so I just needed to tweak that library a bit and turn it into an XSJS library.  Easy!

The files that make the transformation are:

transformLatLon.xsjslib – the tweaked library to do the complex conversion calculations

updateAccBulk.xsjs – the XSJS service to bulk change many records

I manually ran the XSJS service a few times just by calling its URL, but you could likely automate this with cron jobs and cURL (thanks Aron MacDonald for the idea).
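The shape of the bulk update can be sketched as follows (function and field names here are hypothetical; the real conversion maths lives in transformLatLon.xsjslib, and updateAccBulk.xsjs would write the results back with a bulk UPDATE):

```javascript
// Sketch of the bulk-transform step: take rows holding Easting/Northing,
// run each through a converter, and return rows with LATITUDE/LONGITUDE
// filled in, ready to be written back in one bulk UPDATE.
function transformRows(rows, toLatLon) {
  return rows.map(function (row) {
    var ll = toLatLon(row.EASTING, row.NORTHING);
    return { ACCREF: row.ACCREF, LATITUDE: ll.lat, LONGITUDE: ll.lon };
  });
}

// Dummy converter for illustration only - NOT the real OSGB36 -> WGS84 maths
function dummyConverter(easting, northing) {
  return { lat: northing / 100000, lon: easting / 100000 };
}

var updated = transformRows(
  [{ ACCREF: "A1", EASTING: 525000, NORTHING: 150000 }],
  dummyConverter
);
```

Batching the rows this way keeps the expensive conversion in one pass over the table rather than a per-request calculation.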

4) Data Model – Attribute and Analytic Views

This is a straightforward star schema type affair. There are multiple Attribute Views to provide master data texts, and a top level Analytic View which joins these in to the base tables.

5) OData Service

The service definition “getRTA_ACC_Minimal.xsodata” is very simple as the views have done all the join work:

service namespace "roadsafety" {
   "roadsafety.models::AN_ACC" as "getRTA_ACC_Minimal"
   with ("ACCREF", "ACCYEAR", "LATITUDE", "LONGITUDE",
   "AT_DAYOFWEEK_Description", "AT_WEATHER_Description", "AT_LIGHT_Description",
   "AT_SURFACE_Description", "VEHCOUNT", "CASCOUNT")
   key ("ACCREF", "ACCYEAR")
   aggregates always;
}

The base data set has lots of interesting fields, but for this demo I chose just a handful.  The OData service allows selection by any field listed, but the intention is that each call supplies a “bounds window” of maximum and minimum latitude and longitude coordinates, matching the size of the map currently displayed on the client.  Which leads us neatly on to…

6) SAPUI5 Mobile Frontend

The SAPUI5 frontend is based on best-practice MVC templates, but the end result is not entirely best practice :).  I know there are too many global variables and the frontend still has bugs when you flick between the map types, but it is cross-browser and functional in relatively few lines of code.  The source code is all available and commented.

Entry point

The entry point for most of the clients is the same:


The exception is desktop browsers, which need e.g. “?sap-ui-xx-fakeOS=ios” appended to the URL.  If you’re using this code you also need to enter your own Google Maps API key in the main index.html file.  SAPUI5 does a great job of abstracting away cross-browser differences, and this same UI code runs fine on all the devices I’ve tested it on.

Detecting location

Detecting location turned out to be easier than I expected.  Modern browser support has become quite standardised, and this handy site lets you see which browsers support which features.  If you go to that site and scroll down to the JavaScript API section you can see that pretty much all current browsers support Geolocation.  This means there is no need to “go native” to read the device’s location; it is pretty much just one JavaScript call:
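The call in question is the standard Geolocation API’s getCurrentPosition.  A minimal sketch (the handler name is my own, not from the original code):

```javascript
// Success handler: extract what the map needs from the Geolocation
// Position object. Kept as a plain function so it can be reused for
// an initial fix or for later position updates.
function toMapCentre(position) {
  return {
    lat: position.coords.latitude,
    lon: position.coords.longitude
  };
}

// In the browser, this is the single JavaScript call referred to above:
// navigator.geolocation.getCurrentPosition(function (position) {
//   var centre = toMapCentre(position);
//   // ...centre the Google Map on centre.lat / centre.lon...
// });
```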


Calling OData

The frontend makes an OData call providing a bounds window of latitude and longitude coordinates.  The call ends up looking something like this:

$filter=LATITUDE gt 51.3m and LATITUDE lt 51.7m and LONGITUDE gt -0.18m and LONGITUDE
lt -0.14m&$top=2000&$format=json

The bounds window is retrieved from the Google Maps API and matches the bounds of the map currently displayed on the client.  This way we only retrieve the data we need.
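Assembling the filter from the map bounds can be sketched like this (the function name is illustrative; the Google Maps calls shown in the comment are the standard Map.getBounds / LatLngBounds accessors):

```javascript
// Build the OData $filter query string from a bounds window.
// In the browser the four numbers come from the Google Maps API, e.g.:
//   var b = map.getBounds();
//   buildBoundsFilter(b.getSouthWest().lat(), b.getNorthEast().lat(),
//                     b.getSouthWest().lng(), b.getNorthEast().lng());
function buildBoundsFilter(minLat, maxLat, minLon, maxLon) {
  return "$filter=LATITUDE gt " + minLat + "m and LATITUDE lt " + maxLat +
         "m and LONGITUDE gt " + minLon + "m and LONGITUDE lt " + maxLon +
         "m&$top=2000&$format=json";
}
```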


If you are developing with SAPUI5 in a similar sort of way, be aware:

  • Google Maps infowindows only seem to work on actual iOS devices, not in a desktop browser with the fake OS parameter set (thanks to Konstantin Anikeev for saving me headaches here).
  • To allow Google Maps heatmap overlays to work, make sure “allow cookies” is enabled on your iPhone.

7) Limitations

The limitation of this project is the client-side handling of large volumes.  HANA of course handles the volumes fine, but the frontend does not, because it is the client that is responsible for communicating with Google Maps.  Look back at the data flow diagram and imagine we want to see a heatmap of the whole of the UK in one go: we would need to send several million rows to the client, which in turn sends them to Google, which aggregates and returns the markers and heatmap to the client.  And that is assuming Google allows that much traffic to its free service.  The Google Maps frontend itself also has limitations: things slow down after a few hundred markers and around 5,000 heatmap points.

The DfT website gets around these limitations in a couple of ways: first by using a MarkerClusterer, and second by limiting views to a single local authority at a time, so you cannot view the whole UK or any other large dataset.

8) Further work

a) UK wide view

To go a step further and do what the DfT don’t do, HANA must do the heavy lifting of clustering the heatmap points and markers.  Using a naive algorithm (i.e. my own) in SQLScript with cursors turned out to be absurdly slow.  Perhaps a better approach is to pass the dataset to R to do the clustering, return the results to HANA, and store them persistently on the base accident records.  That is work for another day.
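As a sketch of the sort of clustering that could replace the cursor loop, here is a naive grid-binning pass in JavaScript (illustrative only, not the project’s code; in HANA the same idea would be a single GROUP BY over rounded coordinates rather than a loop):

```javascript
// Snap each point to a grid cell of the given size (in degrees) and count
// points per cell. One weighted heatmap point per cell then replaces many
// raw points, keeping the client well under Google's practical limits.
function gridBin(points, cellSize) {
  var cells = {};
  points.forEach(function (p) {
    var key = Math.floor(p.lat / cellSize) + ":" + Math.floor(p.lon / cellSize);
    if (!cells[key]) {
      cells[key] = { lat: 0, lon: 0, weight: 0 };
    }
    cells[key].lat += p.lat;
    cells[key].lon += p.lon;
    cells[key].weight += 1;
  });
  // Emit the centroid of each cell, weighted by its point count
  return Object.keys(cells).map(function (key) {
    var c = cells[key];
    return { lat: c.lat / c.weight, lon: c.lon / c.weight, weight: c.weight };
  });
}
```

The cell size would be chosen from the current zoom level, so a UK-wide view sends a few thousand weighted points instead of millions of raw ones.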

b) Warning Vibrations

One idea I had was to have a mobile client run a background service that makes the phone give a warning vibration whenever you travel into a high-risk region.  Since the phone knows the time of day and weather, this concept could be taken further by matching against similar data sets for climate and light.  The phone could even guess whether you were walking or in a car according to how fast you were moving, and match against the vehicle types involved.

To achieve this, the client would have to run under the phone lock screen, and access the phone vibration API as well as the Geolocation API.  This means only a native (or possibly hybrid) mobile client would be possible.  This would be fairly easy to code if you were familiar with the target platform.
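The core risk check itself is simple; a sketch under assumed inputs (the region list, function name and distance threshold are all illustrative; the browser APIs in the comment — watchPosition and the Vibration API — are standard):

```javascript
// Is the current position within a threshold (in degrees) of any known
// high-risk region? A real version would query the HANA service for nearby
// high-risk cells rather than hold a local list.
function isInHighRiskRegion(lat, lon, regions, thresholdDegrees) {
  return regions.some(function (r) {
    var dLat = lat - r.lat;
    var dLon = lon - r.lon;
    // Rough flat-plane distance in degrees - adequate at warning precision
    return Math.sqrt(dLat * dLat + dLon * dLon) < thresholdDegrees;
  });
}

// In a native or hybrid client this would be wired up roughly as:
// navigator.geolocation.watchPosition(function (pos) {
//   if (isInHighRiskRegion(pos.coords.latitude, pos.coords.longitude,
//                          riskRegions, 0.005)) {
//     navigator.vibrate(200); // Vibration API, where supported
//   }
// });
```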

The Small Print

Data sets used for heatmap images:

Department for Transport. Road Accident Statistics Branch, Road Accident Data, 2009 [computer file]. Colchester, Essex: UK Data Archive [distributor], August 2010. SN: 6513,

Department for Transport. Road Accident Statistics Branch, Road Accident Data, 2010 [computer file]. Colchester, Essex: UK Data Archive [distributor], December 2011. SN: 6926,

Copyright Notice: The original data creators, depositors or copyright holders, the funders of the Data Collections (if different) and the UK Data Archive bear no responsibility for their further analysis or interpretation.  Crown copyright material is reproduced with the permission of the Controller of HMSO and the Queen’s Printer for Scotland.

Update October 2014 – The traffic accident datasets have now been released under an Open Government Licence.  The released data contains 10+ years of accident data in a single Excel workbook, that’s 1.4 million accident records, and they’ve now added latitude and longitude instead of supplying just easting and northing.  You no longer need to apply for a licence (although you do have to abide by usage rules) and the data is now much easier to consume.  See the DfT’s Road Safety Data datasets page for more information.

Comments

Former Member:

      reading this gave me an idea of how powerful HANA can be.

      great work.

Jon-Paul Boyd:

      Great execution by way of communicating interesting data in a stimulating way.

Former Member:

      Excellent Stuff Kevin.

Former Member:

      very interesting blog

Former Member:

      Love it, thanks Kevin for sharing.

Former Member:

      Thanks for sharing Kevin.

I got your content from GitHub. All is fine except for the activation of the views. For example, AT_ACC has errors at ACCYEAR, ACCREF, LATITUDE, LONGITUDE. Other views have similar errors as well.

Do you have a suggestion on how to fix these errors?

Kevin Small (Blog Post Author):

      Hi Michelle, what error message are you getting exactly?

Former Member:

For AT_ACC, this is the error, Kevin:

[Image: activation error message for AT_ACC]

All tables are activated with no errors after taking out the 'nullable'.
Kevin Small (Blog Post Author):

      Hi Michelle,

      Perhaps something has gone wrong with upper/lower case field names.  This was developed on SP5 (that's my excuse anyway) and so in SP7 or SP8 things might be defaulting differently.

When I tried importing the delivery unit I got similar errors to yours, and I can see the table has fields like "accyear" while the model wants a field called "ACCYEAR".  So I'm thinking either change the table or rebuild the model with matching case?