Technology Blogs by Members
jaskisin
Participant
This is a continuation of my previous article, Auto Scaling of SAP Systems on Azure - Part I.

3. PREREQUISITES


To take advantage of the architecture described above, the following prerequisites must be met.

3.1 Authorization


The configuring user must have the following authorizations:

  • Read performance data from the SAP system

  • Read secrets from Azure Key Vault

  • Create a Managed Identity, Azure Storage Account, Log Analytics workspace, ISE for Logic Apps, Azure Monitor, and data gateways

  • Create and use ARM templates to deploy VMs

  • Read and write access to the Storage Account

  • Full access to Azure Automation and runbooks


3.2 Prepare SAP for Monitoring


3.2.1 Start Snapshot Monitoring


To accurately measure the load on an SAP application server, SAP-specific performance metrics such as work process utilization, user sessions, and SAP application memory usage are required. SAP provides a snapshot monitoring utility, SMON (transaction /SDF/MON), which collects this information and stores it in a transparent header table (/SDF/MON_HEADER) in the SAP database.

Go to Tcode /SDF/MON and click Schedule Daily Monitoring.


Fill out the settings required for the monitoring records and click Execute.


The following screen appears once the schedule has been saved with the selected settings.


We can also double-click the saved entry to verify that the monitoring statistics are being collected as expected.


Performance data is now being collected by the SMON daily scheduler.
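Whether the snapshots are actually landing in the database can also be checked with a quick ad-hoc query. The sketch below is illustrative only; it assumes the standard /SDF/MON_HEADER table and its DATUM (date) field, so verify the field names on your release:

```abap
" Illustrative check: count today's SMON snapshot records.
" Assumes /SDF/MON_HEADER with a DATUM (date) field - verify on your release.
DATA lv_count TYPE i.

SELECT COUNT(*) FROM /sdf/mon_header
  INTO @lv_count
  WHERE datum = @sy-datum.

WRITE: / |SMON snapshot records collected today: { lv_count }|.
```

If the count grows with each collection interval, the daily scheduler is working as expected.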

3.2.2 SAP Development Package


To make this data available to third-party tools, we need to change some SAP objects so that external Azure services can read the data with proper authentication. For this we create a custom development package to hold the changes. If a suitable development package already exists, this step can be skipped.

Go to Tcode SE80, select Package from the drop-down menu, and search for the package. A pop-up will appear offering to create the object; click Yes.


Specify the details required to create the package and click the tick (confirm) button.


After the package has been created successfully, the following screen appears:


The package is now ready to hold the custom objects used for the auto-scaling telemetry data.

3.2.3 SAP Gateway Service Builder


All of this data is stored inside the SAP system, so we need to create a Gateway service through which Azure services can read the telemetry data with proper authentication.

Go to Tcode SEGW and click Create.


Specify the requested details for the Gateway service and click the tick button.


Once the Gateway service has been created, the following screen with a success message appears.


Next, we import the structure from the DDIC object: right-click Data Model → Import, then click DDIC Structure.


Specify the entity name and the ABAP structure from which the data model should be built, then click Next.


Select the fields of the chosen ABAP structure that are required for evaluating the data; here we choose the parameters according to the auto-scaling configuration. Click Next.



Specify the fields to be used as key fields and click Next.


Once the entity type has been imported into the data model, the following screen appears.


Now switch to edit mode in the Gateway Service Builder and click the Generate button to generate all the relevant objects.


Specify the names of the objects to be generated during this process and click the tick button.


After successful generation, the objects appear with a green status.


Since this service should return the telemetry data collected by the SMON jobs, we need to adjust some ABAP code. Redefine the GET_ENTITYSET method as follows:
DATA: lv_osql_where_clause TYPE string.

" Convert the OData query options of the request into an Open SQL condition
lv_osql_where_clause = io_tech_request_context->get_osql_where_clause( ).

" Read the SMON snapshot records collected in /SDF/MON_HEADER
SELECT * FROM /sdf/mon_header
  INTO CORRESPONDING FIELDS OF TABLE @et_entityset
  WHERE (lv_osql_where_clause).
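With this redefinition, the $filter of an incoming OData request is translated into the dynamic WHERE clause automatically. A brief illustration (the entity set and field names here are hypothetical examples, not taken from this article):

```abap
" Illustration only - names below are hypothetical examples.
" For a Gateway Client request such as
"   GET .../HeaderSet?$filter=DATUM eq '20240101'
" the technical request context returns an Open SQL condition string:
DATA(lv_where) = io_tech_request_context->get_osql_where_clause( ).
" lv_where now holds something like: DATUM = '20240101'
" With no $filter, the string is empty and the SELECT returns all rows.
```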



3.2.4 Activate and Maintain Services


Now that the Gateway service is ready to access the telemetry data, we need to configure and activate the service so that external sources can reach it. Go to Tcode /IWFND/MAINT_SERVICE and click Add Service.


Search for the backend service using the following criteria:


This shows the Gateway service we built earlier; click on the service:


Cross-check all the details shown on the next screen and click the tick button.


The following message pops up once the service has been added successfully:


We can also see the newly added service in the service list:


The service is now available for external tools to read the monitoring data from the SAP system.

3.2.5 Verify with SAP Gateway Client


With all the settings in place, we can use the SAP Gateway Client to verify access to the telemetry data. Go to Tcode /IWFND/GW_CLIENT and execute a GET request using the Request URI of the Gateway service, for example /sap/opu/odata/sap/<service name>/<entity set>?$top=2 (the exact names depend on what was chosen in SEGW):


 

This is the end of Part II. In the next part, Auto Scaling of SAP Systems on Azure - Part III, you can find more information about the deployment and configuration of the on-premises data gateway for this solution.
