Jon Gooding

Google BigQuery to SAP Data Warehouse Cloud (DWC)

Clients who have an investment in Google BigQuery and SAP-related technologies (including ECC, SOH, BW, BW/4HANA, S/4HANA, SuccessFactors, Ariba and FieldGlass, as some examples) can easily integrate the datasets without duplicating them further: the data stays in its respective environment.

The SAP Data Warehouse Cloud BigQuery connection type provides the ability to link BigQuery datasets to SAP technologies, and to provide a virtualisation layer that can blend the datasets for extra business insights across the different systems.

The integration with SAP technologies in Data Warehouse Cloud is well tested and proven; this blog shows the extra steps involved in integrating Google BigQuery with Data Warehouse Cloud.

Google BigQuery setup requires the following:

An account with tables loaded and accessible:


I have uploaded two tables, planning and sales, to the AUSTRALIA dataset in the project sap-anz-dto. Nothing special, just CSV files from a previous demo.
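For reference, a load like this can also be scripted with the google-cloud-bigquery client instead of the console. This is a minimal sketch, not the method used in the post; the file name `sales.csv` and key file name `sap-anz-dto-key.json` are hypothetical placeholders:

```python
def table_id(project: str, dataset: str, table: str) -> str:
    # BigQuery's fully-qualified table identifier: project.dataset.table
    return f"{project}.{dataset}.{table}"

def load_csv(csv_path: str, target: str, key_path: str = "sap-anz-dto-key.json") -> None:
    # Imported here so table_id remains usable without the library installed.
    from google.cloud import bigquery

    client = bigquery.Client.from_service_account_json(key_path)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the CSV header row
        autodetect=True,       # infer the schema from the file
    )
    with open(csv_path, "rb") as f:
        job = client.load_table_from_file(f, target, job_config=job_config)
    job.result()  # block until the load job finishes

# Example: load_csv("sales.csv", table_id("sap-anz-dto", "AUSTRALIA", "sales"))
```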

External access also requires a service account:


Select the BigQuery service account, then create and download the key; this will be used in the DWC connection (the string after f5c has been cut off):
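Before uploading the key to DWC, it is worth sanity-checking the downloaded JSON. The field names below are the standard ones Google writes into every service-account key file; the checker itself is my own sketch, not SAP or Google tooling:

```python
import json

# Fields present in every Google service-account key file. DWC needs the
# whole file, but checking these up front catches a wrong or truncated download.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key(key: dict) -> None:
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"expected a service_account key, got {key['type']!r}")

def check_key_file(path: str) -> None:
    with open(path) as f:
        check_key(json.load(f))
```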



A Google Cloud certificate is also required in Data Warehouse Cloud to connect to the Google BigQuery landscape:

  • GSR2: under the section “Download CA certificates”, expand “Root CAs”; for the first “gsr2” entry, click the action button at the right and click “Certificate (.PEM)”, which should download the gsr2 PEM certificate.
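If the certificate upload fails later, one thing to rule out first is a mangled download (for example an HTML error page saved as .pem, or broken base64 from a copy-paste). A small stdlib-only check, written for this blog rather than taken from any SAP tool:

```python
import base64

def looks_like_pem_certificate(text: str) -> bool:
    """Check a .pem download: correct BEGIN/END markers and a body
    that is valid base64 (i.e. not an HTML page or a mangled paste)."""
    lines = [ln.strip() for ln in text.strip().splitlines()]
    if len(lines) < 3 or lines[0] != "-----BEGIN CERTIFICATE-----":
        return False
    if lines[-1] != "-----END CERTIFICATE-----":
        return False
    try:
        base64.b64decode("".join(lines[1:-1]), validate=True)
    except ValueError:  # binascii.Error is a ValueError subclass
        return False
    return True
```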


Within the Space Connections, add the BigQuery Connection Type:


Enter the required values:

Project: sap-anz-dto



From the BigQuery service account, upload the JSON key file:


Test the connection. If you have any issues, post them below; I had a few certificate issues on the way to getting the above working.
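One way to narrow such issues down (a sketch of my own, not part of the DWC setup; the key path is a placeholder) is to run a trivial query with the same service-account key outside DWC. If this fails, the problem is the key or its roles; if it succeeds, look at the certificate side in DWC instead:

```python
def smoke_test(key_path: str = "sap-anz-dto-key.json") -> int:
    # Imported here so the module can be read without the library installed.
    from google.cloud import bigquery

    client = bigquery.Client.from_service_account_json(key_path)
    # Touches the BigQuery API but no tables: a cheap auth/permission check.
    rows = list(client.query("SELECT 1 AS ok").result())
    return rows[0].ok

# Example: smoke_test() with a valid key should return 1; an auth error here
# points at the service account, not the DWC certificate setup.
```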

Finally, you should be able to see the BigQuery connection tables:

And then view the data:


The next step is to test some larger datasets and the different virtualisation/replication options…

Pretty easy in the end really.

      Klaus Freyburger

      Thanks a lot for this blog. This is exactly what I am looking for. Yes, I do have an issue.

      I downloaded the certificate pem file exactly like you described.

      When trying to upload the certificate pem file I get a message "Error, Certificate could not be uploaded". No further hint.


      Error Message

      The certificate file looks as described here, namely it starts with "-----BEGIN CERTIFICATE-----" and ends with "-----END CERTIFICATE-----".

      Any idea what I am doing wrong?



      Jon Gooding (Blog Post Author)

      Hi Klaus,

      I assume you are having issues with the Google gsr2 PEM cert?

      That is the file you used?

      Klaus Freyburger

      Hi Jon,

      meanwhile we found out that there is a problem with the technical user: "User dis-user does not have the required privileges"

      Our system admin takes care of this.

      Br Klaus

      Marianne Loenen

      Hi Klaus, Jon,

      One remark for future readers who run into the same issue (we did last week) and land here: this error can only be fixed in the back end of DI.

      You will need an OSS ticket to get it fixed.


      Klaus Freyburger

      Hi Jon,

      meanwhile the issue is fixed; the certificate can be uploaded and the connection can be established. However, I get this message:

      And indeed, it works fine in view building, but the connection does not show up in Data Flow. Do you know why?

      Thanks Klaus

      Jon Gooding (Blog Post Author)

      Hi Klaus Freyburger,

      Good progress. Yes, Google BigQuery is not yet supported as a direct Data Flow option. But as a workaround:

      1. Use the BQ object as a source for an Analytical Dataset view (a Relational Dataset didn't work)
      2. Use the Analytical Dataset as the source for the Data Flow; I was then able to use the Data Flow functionality to load into another table.

      Hope that helps


      Klaus Freyburger

      Hi Jon,

      thanks for your reply. Maybe you are interested in the workaround I found in the meantime.

      1. Start creating a Graphical View
      2. Add the BQ table as a remote table
      3. Quit without saving
      4. Create a Data Flow
      5. Add the remote table from step 2 to the Data Flow.

      This works fine without an extra view. The essential hint came from your blog. So thanks again!

      Best regards Klaus