
In an SAP implementation, one of the most critical aspects is reporting. As SAP data is increasingly integrated with non-SAP systems to build reports in tools such as Looker, this blog explores how SAP can be integrated with Google BigQuery (GBQ) using the BigQuery Connector for SAP.

Specifically, it covers the option of transferring data from SAP to GBQ using the embedded SLT in S/4HANA.

1. System Considerations

  • S/4HANA RISE Private Edition
  • S/4HANA embedded SLT
  • Google BigQuery

2. Scenario

In this blog, we set up data transfer from an SAP S/4HANA system hosted on GCP to GBQ using the BigQuery Connector for SAP and embedded SLT.


3. Infrastructure Setup between SAP and GBQ

The official setup guide from Google is available here: https://cloud.google.com/solutions/sap/docs/bq-connector/latest/all-guides

The infrastructure setup can be divided into three parts, depending on where the SAP system is hosted. This use case considers S/4HANA Private Cloud running on a Compute Engine virtual machine (VM) on Google Cloud.

3.1 SAP Application Server Infrastructure Setup

  • Install the gcloud CLI in the S/4HANA OS: If not already available, install the gcloud CLI on the operating system of each SAP application server and verify the installation with the command gcloud -v.


  • Cloud API access scopes for the SAP application VM: This can be granted in two ways: either full access to all Cloud APIs, or access restricted to only the BigQuery and Cloud Platform APIs. The access scope must be enabled on every application server VM of the SAP system.


  • Enable the host VM to obtain access tokens: The following roles must be granted to the service account associated with the SAP application server VM: Service Account Token Creator, BigQuery Data Editor, and BigQuery Job User. A quick verification sketch follows below.
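As a quick sanity check of these scopes and roles, the following minimal sketch can be run from the SAP application server VM, assuming Python with the google-auth and google-cloud-bigquery client libraries is available there; the project name target-gcp-project is a placeholder for illustration only.

```python
# Hedged verification sketch; "target-gcp-project" is a placeholder.
import google.auth
from google.auth.transport.requests import Request
from google.cloud import bigquery

# Obtain the VM's default credentials restricted to the BigQuery scope,
# mirroring the limited Cloud API access scope described above.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/bigquery"]
)
credentials.refresh(Request())  # fetches an access token for the VM's service account
print("Access token obtained, expires at:", credentials.expiry)

# A trivial query job; this only succeeds if the service account holds the
# BigQuery Job User role in the target project.
client = bigquery.Client(credentials=credentials, project="target-gcp-project")
list(client.query("SELECT 1 AS ok").result())
print("BigQuery Job User role verified")
```

If the token request or the query job fails here, revisit the access scopes and IAM roles above before continuing with the connector configuration.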


3.2 GCP Infrastructure Setup

As this use case considers an SAP RISE system hosted on GCP, two steps must be performed in the customer's GCP environment to allow the data transfer.

  • Create a BigQuery dataset in GBQ: As this process varies across organizations, create a BigQuery dataset in your environment and note the project name and dataset name. This information is used later in the SLT configuration.
  • Allow the SAP VM service account access in the customer GCP project: In this use case, the GBQ dataset is created in a separate project from the SAP VM. The GCP admin of the project where the GBQ dataset was created must grant access to the service account of the SAP VM so that it can write data into the dataset. A sketch of both steps follows this list.
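The sketch below shows one way to perform both steps with the google-cloud-bigquery Python client; the project, dataset, location, and service account names are placeholders, and your organization may instead use the Cloud Console, gcloud, or Terraform.

```python
# Hedged sketch: create the target dataset and grant the SAP VM's service
# account write access on it. All names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="target-gcp-project")

# 1) Create the dataset that SLT will replicate into; note the project and
#    dataset names for the /GOOG/SLT_SETTINGS configuration later.
dataset = bigquery.Dataset("target-gcp-project.sap_replication")
dataset.location = "EU"  # choose the region required by your organization
dataset = client.create_dataset(dataset, exists_ok=True)

# 2) Grant the SAP application server VM's service account WRITER access on
#    the dataset so the BigQuery Connector for SAP can insert records.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="WRITER",
        entity_type="userByEmail",
        entity_id="sap-app-vm-sa@sap-host-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
print("Dataset ready:", dataset.dataset_id)
```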

4. Configuration Setup

As the official guide covers the detailed instructions, this blog highlights the main configuration steps in SAP. Feel free to ask questions in the comment section of the blog for additional information.


  • Set up SSL certificates in STRUST: Download the relevant root certificates for GCP (GTS Root R1, GTS CA 1C3) and import them into the SSL Client PSE of SAP.
  • Specify access settings in /GOOG/CLIENT_KEY: The following values are used in this setup.

Service Account Name: the service account associated with the SAP application server VM

Scope: https://www.googleapis.com/auth/bigquery can be used if the API access was restricted in the earlier step

Project ID: the project in the target GCP environment that contains the GBQ dataset


  • Create the RFC destinations in SM59: Create two Type G RFC destinations for the GCP APIs and one Type 3 RFC destination for SAP (own system) for the data transfer.


  • Create the SLT configuration in transaction LTRC.


  • Configure /GOOG/SLT_SETTINGS: Configure a mass transfer for BigQuery and specify the table and field mappings.

Project Identifier: the project name from the target GCP environment

BQ Dataset: the name of the GBQ dataset


5. Test Replication

After the above setup, test the replication using LTRC. All other relevant SLT configuration options (for example, in LTRS) can be used in this use case as well. On the GBQ side, the replicated data can be verified as sketched below.
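A simple count query can confirm that rows are arriving in the target dataset; the minimal sketch below assumes placeholder project, dataset, and table names (here, that the SAP table MARA was selected for replication).

```python
# Hedged sketch: count the rows that SLT has replicated into the target
# BigQuery table. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="target-gcp-project")
query = """
    SELECT COUNT(*) AS row_count
    FROM `target-gcp-project.sap_replication.MARA`
"""
row = next(iter(client.query(query).result()))
print("Rows replicated so far:", row.row_count)
```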


6. Considerations

  • This setup is intended for replication scenarios where the data transfer volume is not large, as the embedded SLT scenario has a performance impact on the SAP system.
  • Check with the SAP ECS team regarding the security approvals required for the changes to the SAP VM access.
  • Advanced replication performance for certain larger tables, such as ACDOCA, should be addressed using the Performance Optimization Guide for SLT.
  • As this use case uses embedded SLT, certain performance optimization options will increase the database size of the temporary logging tables.
  • Periodically run the program CNV_NOTE_ANALYZER_SLT to check for the latest SAP Notes that can be applied to improve SLT and fix known bugs.

Resources

https://cloud.google.com/solutions/sap/docs/bq-connector/latest/install-config-on-gc#prereqs

https://help.sap.com/doc/04989b617c6f4ca8982fde703e7350f9/3.0.04/en-US/Performance%20Guide.pdf

2014562 - FAQ: SAP HANA LT Replication Server (SLT)

2940799 - Best practices for loading/replicating a table with a view – SLT

3016862 - DMIS Note Analyzers with separated scenarios for ABAP-based Migration and Replication Technology

Please keep an eye out for an upcoming blog about lessons learned as well as the optimization options we have implemented for the above use case.
