Providing COVID-19 time series via @sap/cds


While staying at home a lot these days, I was browsing through the many available COVID-19 related statistics and charts, such as this Visual Dashboard.

Most of the numbers you see in the news rely on the data published daily by the Johns Hopkins University Center for Systems Science and Engineering (JHU CSSE) in their main upstream repository on GitHub.

After a little googling I also found an additional source of normalized time series provided by DataHub.io, which relies on the very same data.

However, what I did not find was an OData v4 based service for retrieving any COVID-19 related data. Such a service might be helpful for others, because OData is often the preferred way of consuming data, e.g. in a business context.

As the data provided by JHU CSSE and DataHub.io is mainly distributed via .csv files, the solution was pretty simple:

  • Download the latest csv files from DataHub.io

  • Deploy these csv files as initial data to a @sap/cds based OData v4 service (the resulting project layout is sketched right after this list)
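
A minimal sketch of the resulting project layout, following the usual @sap/cds conventions and the snippets below (the srv file name and the Gruntfile are assumptions, the repository may name them differently):
db/
  schema.cds        domain model
  data/             downloaded .csv files used as initial data
srv/
  service.cds       service definition (assumed file name)
Gruntfile.js        registers the download task (assumed)
package.json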


Defining the domain model


As with any @sap/cds project, the first step is to define the domain model, which is straightforward, e.g. for the countries-aggregated time series:
using { sap } from '@sap/cds/common';

namespace covid19;

entity CountriesAggregated {
  key Date    : Date;
  key Country : String;
  Confirmed   : Integer;
  Recovered   : Integer;
  Deaths      : Integer;
}

Defining the service


With our domain model in place, it is quite easy to define our service, exposing the previously declared entities as read-only:
using { covid19 } from '../db/schema';

service Covid19Service {
  @readonly
  entity CountriesAggregated as projection on covid19.CountriesAggregated;
}
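
Once deployed and running (see the last step below), the service document and the OData metadata should be exposed under the path derived from the service name, e.g.:
http://localhost:4004/covid19/
http://localhost:4004/covid19/$metadata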

Importing the data


The final step is to download the matching csv files and deploy them as initial data to a sqlite3 database. We just have to make sure the files end up in the right folder (db/data) and follow the expected naming convention (<namespace>-<Entity>.csv). All the heavy lifting will then be handled by the @sap/cds module itself.
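
For the countries-aggregated series this means a file db/data/covid19-CountriesAggregated.csv whose header row matches the entity elements (each following line holds one record downloaded from DataHub.io):
Date,Country,Confirmed,Recovered,Deaths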

All the above logic can be implemented in a simple grunt task as follows:
const path = require('path')
const fs = require('fs-extra')
const stream = require('stream')
const util = require('util')
const finished = util.promisify(stream.finished)
const changeCase = require('change-case')
const { Dataset } = require('data.js')

// DataHub.io data package and the resources we are interested in
const endpoint = 'https://datahub.io/core/covid-19/datapackage.json'
const names = ['countries-aggregated']

module.exports = (grunt) => async function () {
  const done = this.async()
  try {
    // ensure empty csv dir
    const dirpath = path.join(process.cwd(), 'db', 'data')
    await fs.emptyDir(dirpath)
    // load dataset
    const dataset = await Dataset.load(endpoint)
    // get all tabular data (if any exists)
    for (const id in dataset.resources) {
      const resource = dataset.resources[id]
      const { name, format } = resource._descriptor
      // filter resources
      if (format === 'csv' && names.includes(name)) {
        // write resource to .csv file
        const readable = await resource.stream()
        const filepath = path.join(dirpath, `covid19-${changeCase.pascalCase(name)}.csv`)
        const writable = fs.createWriteStream(filepath)
        readable.pipe(writable)
        await finished(writable)
      }
    }
    done()
  } catch (error) {
    grunt.log.error(error.stack)
    done(error)
  }
}
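
To let npx grunt pick this task up, it still has to be registered in a Gruntfile. A minimal sketch, assuming the task above lives in tasks/download.js and is registered as the default task (file name and wiring are assumptions, the repository may do this differently):
// Gruntfile.js
module.exports = (grunt) => {
  // register the download task above as the default task, so that `npx grunt` runs it
  grunt.registerTask('default', 'Download COVID-19 csv files from DataHub.io', require('./tasks/download')(grunt))
}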

Deploy the data and run the service


Now we can simply deploy the data and start the service, e.g. by adding scripts to our package.json:
{
  "scripts": {
    "prestart": "npx grunt && npx cds deploy",
    "start": "npx cds run"
  }
}

npm start

Now the service should be up and running and we can start querying it using standard OData v4 syntax:
http://localhost:4004/covid19/CountriesAggregated?$filter=Country eq 'Germany'
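
Other standard OData v4 query options should work just as well, e.g. (assuming the default local endpoint) to get the most recent figures for a country or to project onto selected columns:
http://localhost:4004/covid19/CountriesAggregated?$filter=Country eq 'Germany'&$orderby=Date desc&$top=1
http://localhost:4004/covid19/CountriesAggregated?$select=Date,Country,Confirmed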

 

It would be great if this turns out to be useful for anyone!

All the sources can be found in this repository:

https://github.com/pwasem/covid19-cds

 

Please #StayHome and feel free to contribute!

 