This article is based on the approach described by Amazon in the SAP on AWS blog and is covered in two parts.

Organizations that run their SAP workloads on AWS can take advantage of a set of additional AWS services that enhance and simplify the operation of SAP on AWS infrastructure. Organizations are typically keen on compliance and have an initial requirement to check and validate that their running SAP systems are configured according to the recommendations from both SAP and AWS; this article can be used in their environment to meet such internal compliance requirements.

SAP has certified AWS infrastructure for SAP products, but it must be set up according to the documentation provided by SAP and AWS. In this article, we discuss how AWS Config, along with a few AWS integrations, can simplify the process of continuously checking and validating the compliance of all SAP systems deployed on AWS in our landscape.

1. AWS SERVICES


1.1 AWS Config


AWS Config is a service that enables us to assess, audit, and evaluate the configurations of our AWS resources. Config continuously monitors and records the configurations of our SAP resources hosted on AWS and allows us to automate the evaluation of recorded configurations against desired configurations. With Config, we can review changes in SAP configurations and relationships between AWS resources, dive into detailed resource configuration histories, and determine our overall compliance against the configurations specified in our internal organizational guidelines.

1.2 Amazon EventBridge


Amazon EventBridge is a serverless event bus that makes it easier to build event-driven applications at scale using events generated from our SAP applications and resources. EventBridge delivers a stream of real-time data from event sources such as AWS Config to targets such as AWS Lambda, Amazon SNS topics, and SaaS applications. Here we will use Amazon EventBridge to pick up the compliance events that we get from AWS Config and forward them to an SNS topic for further processing.
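
As a minimal sketch (assuming an SNS topic named sap-config-compliance already exists and that its access policy allows events.amazonaws.com to publish to it), such a rule could be created with boto3 as follows; the rule name and ARNs are placeholders, not part of the original setup:

import json
import boto3

events = boto3.client("events")

# Match compliance-change events emitted by AWS Config for non-compliant resources
pattern = {
    "source": ["aws.config"],
    "detail-type": ["Config Rules Compliance Change"],
    "detail": {"newEvaluationResult": {"complianceType": ["NON_COMPLIANT"]}},
}

events.put_rule(
    Name="sap-config-noncompliance",              # illustrative rule name
    EventPattern=json.dumps(pattern),
    State="ENABLED",
    Description="Forward non-compliant AWS Config evaluations to SNS",
)

# Deliver matched events to the existing SNS topic (placeholder ARN)
events.put_targets(
    Rule="sap-config-noncompliance",
    Targets=[{"Id": "sns", "Arn": "arn:aws:sns:<region>:<Account>:sap-config-compliance"}],
)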

1.3 Amazon SNS


Amazon SNS is a fully managed messaging service for both application-to-application (A2A) and application-to-person (A2P) communication. The A2A pub/sub functionality provides topics for high-throughput, push-based, many-to-many messaging between distributed systems, microservices, and event-driven serverless applications. Using Amazon SNS topics, our publisher systems can fan out messages to many subscriber systems, including Amazon SQS queues, AWS Lambda functions, HTTPS endpoints, and Amazon Kinesis Data Firehose, for parallel processing. With SNS we can send notifications about the non-compliant details to the required stakeholders.

1.4 Amazon CloudWatch


Amazon CloudWatch is a monitoring and observability service built for DevOps engineers, developers, site reliability engineers (SREs), and IT managers. CloudWatch provides us with data and actionable insights to monitor our applications, respond to system-wide performance changes, optimize resource utilization, and get a unified view of operational health. CloudWatch collects monitoring and operational data in the form of logs, metrics, and events, providing us with a unified view of AWS resources, applications, and services that run on AWS and on-premises servers. We can use CloudWatch to troubleshoot issues with the services that we use to check the compliance of the SAP systems.
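
For example, a minimal sketch for pulling recent error entries from the Lambda function's log group with boto3 could look like this; the log group name assumes a hypothetical function called sap-parameter-compliance and should be adjusted to your own function name:

import time
import boto3

logs = boto3.client("logs")

# Log group name is an assumption based on a hypothetical Lambda function name
response = logs.filter_log_events(
    logGroupName="/aws/lambda/sap-parameter-compliance",
    filterPattern="ERROR",
    startTime=int((time.time() - 3600) * 1000),  # last hour, in milliseconds
)

for event in response.get("events", []):
    print(event["timestamp"], event["message"].rstrip())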

1.5 AWS Secrets Manager


AWS Secrets Manager helps us protect the secrets and credentials needed to access our SAP applications, services, and IT resources. The service enables us to easily rotate, manage, and retrieve application credentials, API keys, and other secrets throughout their lifecycle, eliminating the need to hardcode sensitive information in plain text. In addition, Secrets Manager enables us to control access to secrets using fine-grained permissions and to audit secret rotation centrally for resources in the AWS Cloud, third-party services, and on-premises. We will use this service to store the credentials of the user with which compliance data is fetched from the SAP system.
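
As a sketch, and assuming the SAP SOAP user's credentials are stored as a JSON secret under a name like sap/soap-user (both the secret name and the key names are assumptions), the Lambda function could read them with boto3 as follows:

import json
import boto3

secrets = boto3.client("secretsmanager")

def get_sap_credentials(secret_id="sap/soap-user"):
    """Fetch the SOAP user's credentials stored as a JSON string in Secrets Manager."""
    response = secrets.get_secret_value(SecretId=secret_id)
    secret = json.loads(response["SecretString"])
    return secret["username"], secret["password"]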

1.6 AWS Lambda


AWS Lambda is a serverless compute service that lets us run code without provisioning or managing servers, creating workload-aware cluster scaling logic, maintaining event integrations, or managing runtimes. With Lambda, we can run code for virtually any type of application or backend service, all with zero administration. We can write Lambda functions in our favorite language (Node.js, Python, Go, Java, and more) and use both serverless and container tools, such as AWS SAM or the Docker CLI, to build, test, and deploy our functions. Here we will build the Python code that is responsible for fetching the required compliance data from the SAP system for AWS Config to evaluate.

1.7 Amazon DynamoDB


Amazon DynamoDB is a key-value and document database that delivers single-digit millisecond performance at any scale. It's a fully managed, multi-region, multi-active, durable database with built-in security, backup and restore, and in-memory caching for internet-scale applications. DynamoDB can handle more than 10 trillion requests per day and can support peaks of more than 20 million requests per second. Here we will use DynamoDB to store the details of the SAP systems, such as InstanceId, SID, Host, System Number, and so on.
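
A minimal sketch of how this SAP system metadata could be stored and read, assuming a table named sap-systems with InstanceId as the partition key (table name, key schema, and attribute values are assumptions for illustration):

import boto3

table = boto3.resource("dynamodb").Table("sap-systems")

# Register one SAP system (attribute values are illustrative)
table.put_item(Item={
    "InstanceId": "i-0123456789abcdef0",
    "SID": "PRD",
    "Host": "sapprdapp01",
    "SystemNumber": "00",
})

# Look up the SAP system details for a given EC2 instance
item = table.get_item(Key={"InstanceId": "i-0123456789abcdef0"}).get("Item")
print(item)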

2. ARCHITECTURAL OVERVIEW


AWS Config can simplify the process of continuously auditing and assessing the compliance of all SAP systems in your landscape. We also describe how to enable email notifications when a resource is flagged as non-compliant, using Amazon EventBridge and Amazon Simple Notification Service (SNS).

2.1 Architecture to Check Compliance for SAP Infrastructure


For checking the compliance of the SAP infrastructure, we are going to leverage AWS managed Config rules to create a solution that automatically audits and assesses our SAP landscape's infrastructure-related components.


AWS Config will automatically evaluate configuration changes of AWS resources against the rules we define. The administrator can see the compliance status on the AWS Config dashboard. If a change is identified as non-compliant, that event is forwarded to EventBridge, which transforms the event details into meaningful data and passes it to Amazon SNS. Depending on the Amazon SNS configuration, a notification (email, SMS, HTTP, etc.) is then sent to the respective stakeholders.

AWS Config can also trigger remediation actions on the affected resources to avoid further compliance drift.

AWS Config provides over 160 managed rules, which are rules authored by AWS. A list of all the managed rules can be found in the AWS documentation.
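
As an illustration, one of these managed rules can be enabled with a single boto3 call. The sketch below activates the managed rule with identifier ENCRYPTED_VOLUMES, scoped to EBS volumes, which is a typical infrastructure check for SAP hosts; the rule name is our own illustrative choice:

import boto3

config = boto3.client("config")

# Enable the AWS managed rule that checks whether attached EBS volumes are encrypted
config.put_config_rule(ConfigRule={
    "ConfigRuleName": "sap-ebs-volumes-encrypted",      # illustrative name
    "Description": "Checks that EBS volumes used by SAP hosts are encrypted",
    "Source": {"Owner": "AWS", "SourceIdentifier": "ENCRYPTED_VOLUMES"},
    "Scope": {"ComplianceResourceTypes": ["AWS::EC2::Volume"]},
})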

2.2 Architecture to Check Compliance for SAP Instance


For checking compliance on the SAP instance itself, we additionally introduce AWS Lambda, AWS Secrets Manager, and Amazon DynamoDB to integrate the above flow with the SAP instance.


The difference from the earlier architecture is that we build a Lambda function that accesses Secrets Manager to fetch the credentials for the SAP system and reads the details about the SAP instance from a DynamoDB table. The function uses that data to fetch the compliance-relevant values from the SAP system and provides them to AWS Config for evaluation. The evaluation happens in AWS Config, and the rest of the flow is the same as in the architecture above.
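
Putting the pieces together, the evaluation logic of such a custom, Lambda-backed Config rule could look roughly like the sketch below. The helpers imported from sap_helpers are hypothetical stand-ins for the Secrets Manager, DynamoDB, and SOAP pieces sketched elsewhere in this article, and the parameter name and expected value are passed in as rule parameters; this is one way it could be wired up, not the exact implementation:

import json
from datetime import datetime, timezone

import boto3

# Hypothetical helper module; see the Secrets Manager, DynamoDB, and SOAP sketches in this article
from sap_helpers import get_sap_credentials, get_sap_system, fetch_parameter_value

config = boto3.client("config")

def lambda_handler(event, context):
    """Custom AWS Config rule: compare one SAP profile parameter against its expected value."""
    invoking_event = json.loads(event["invokingEvent"])
    rule_params = json.loads(event.get("ruleParameters", "{}"))
    item = invoking_event["configurationItem"]

    user, password = get_sap_credentials()                     # from Secrets Manager
    sap_system = get_sap_system(item["resourceId"])            # from DynamoDB (Host, SystemNumber, ...)
    actual = fetch_parameter_value(sap_system["Host"], sap_system["SystemNumber"],
                                   user, password, rule_params["parameter"])

    compliance = "COMPLIANT" if actual == rule_params["expectedValue"] else "NON_COMPLIANT"

    # Report the evaluation result back to AWS Config
    config.put_evaluations(
        Evaluations=[{
            "ComplianceResourceType": item["resourceType"],
            "ComplianceResourceId": item["resourceId"],
            "ComplianceType": compliance,
            "OrderingTimestamp": datetime.now(timezone.utc),
        }],
        ResultToken=event["resultToken"],
    )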

3. PREREQUISITES


3.1 Prepare the SAP System for SOAP Calls


There are many ways to fetch data from SAP, such as SAP SDKs, SOAP calls, DB queries, OS commands, and so on; we need to choose the method according to our requirement. Here we check the compliance of SAP security profile parameters and report any non-compliant parameter values. To do so, we use the SOAP interface of sapcontrol, through which we can fetch profile values with sapcontrol's ParameterValue function. SAP has already provided very detailed documentation on using sapcontrol via SOAP calls.

We also need to make sure that the SAP parameter service/protectedwebmethods has an appropriate value so that SOAP calls to SAP are allowed for the required data. Depending on the type of data to be fetched, this parameter needs to be adjusted. Further details on the parameter can be found in SAP Note 2838788 - How to verify if service/protectedwebmethods is recognized by sapstartsrv. In our case we are using default web service calls, hence we will use SDEFAULT as the value of this parameter.


In addition to the above parameter, we also need to check the value of the service/admin_users parameter, which allows us to restrict the use of protected SAP web methods to a limited set of users. For simplicity, we are going to use only one user for all the SAP systems. To make that possible, we need to maintain this parameter in the SAP system as shown below.
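
As an illustrative example only (the exact entries depend on your landscape and security policy), the relevant profile entries could look like this, where <soap_user> stands for the single monitoring user used across all systems:

service/protectedwebmethods = SDEFAULT
service/admin_users = <soap_user>

A restart of the sapstartsrv service is typically required for changes to these parameters to take effect.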


To check whether our setup for executing SOAP calls is working, we can go to the OS level and use the following command to check access with a specific user:
sapcontrol -prot GSOAP_HTTP -nr <SYSNR> -queryuser -function AccessCheck CheckParameter
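
The Lambda function will later make the same kind of call from Python. A minimal sketch using plain requests is shown below; the endpoint path, the HTTP port pattern 5<NN>13, and the envelope layout are assumptions based on the standard sapstartsrv web service and should be verified against the SAP documentation for your release:

import re
import requests

def fetch_parameter_value(host, sysnr, user, password, parameter):
    """Call sapcontrol's ParameterValue web method for a single profile parameter."""
    url = f"http://{host}:5{sysnr}13/SAPControl.cgi"   # sapstartsrv HTTP port 5<NN>13
    envelope = f"""<SOAP-ENV:Envelope
        xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:urn="urn:SAPControl">
      <SOAP-ENV:Body>
        <urn:ParameterValue>
          <parameter>{parameter}</parameter>
        </urn:ParameterValue>
      </SOAP-ENV:Body>
    </SOAP-ENV:Envelope>"""

    response = requests.post(
        url,
        data=envelope,
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": '""'},
        auth=(user, password),   # the user maintained in service/admin_users
        timeout=30,
    )
    response.raise_for_status()
    match = re.search(r"<value>(.*?)</value>", response.text, re.DOTALL)
    return match.group(1) if match else None

# Example: read the minimum password length from instance number 00
# print(fetch_parameter_value("sapprdapp01", "00", "<soap_user>", "<password>", "login/min_password_lng"))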



3.2 Basic Settings for AWS Config


To use AWS Config for a specific account, we first need to prepare AWS Config with the required settings. To do so, go to AWS Config in the AWS Console and click on Get Started


Specify the required settings as below:

  • Resource types to record – We need to specify the types of resources whose changes AWS Config should record, such as RDS, EC2, etc.

  • AWS Config role – We need to specify the IAM role that AWS Config will use to fetch details from AWS resources.

  • Amazon S3 bucket – We need to specify the S3 bucket to which AWS Config will deliver its configuration history and snapshot files.


Click on Next


Now we can specify any rules we want to configure along with these settings. As rules can also be created later, we can skip this step by clicking Next


Now review all the settings and click on Confirm


Now AWS Config is ready for further configuration.
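
If you prefer to script this initial setup instead of using the console, the same settings can be applied with boto3 roughly as shown below; the role ARN, bucket name, and SNS topic ARN are placeholders for your own values:

import boto3

config = boto3.client("config")

# Record all supported resource types using the AWS Config role
config.put_configuration_recorder(ConfigurationRecorder={
    "name": "default",
    "roleARN": "arn:aws:iam::<Account>:role/aws-service-role/config.amazonaws.com/AWSServiceRoleForConfig",
    "recordingGroup": {"allSupported": True, "includeGlobalResourceTypes": True},
})

# Deliver configuration history and snapshot files to S3 and notify via SNS
config.put_delivery_channel(DeliveryChannel={
    "name": "default",
    "s3BucketName": "<config-bucket>",
    "snsTopicARN": "arn:aws:sns:<region>:<Account>:sap-config-compliance",
})

config.start_configuration_recorder(ConfigurationRecorderName="default")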

3.3 Configure AWS SNS Topic


We need to configure an SNS topic prior to this configuration, as we need to send notifications when AWS Config records non-compliant resources. To do so, go to Amazon SNS, enter the topic name, and then click on Next Step


Now we need to fill out the details about the SNS Topic


Fill out all the other options if required and then click on Create Topic


After creating the topic, we next need to create subscriptions so that notifications can be sent to the concerned stakeholders


Now we need to select the SNS topic under which to create the subscription, specify the type of notification (we have selected email), and then specify the email address to which the notification should be sent


If required, fill out the other details and then click on Create Subscription


We can see that the subscription is ready to use with the specified email address


Now the Amazon SNS topic is set up for notifications
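
If you prefer to script this step, the same topic and email subscription can be created with boto3 roughly as follows; the topic name and email address are placeholders:

import boto3

sns = boto3.client("sns")

# Create the topic used for compliance notifications
topic_arn = sns.create_topic(Name="sap-config-compliance")["TopicArn"]

# Subscribe a stakeholder's email address; the recipient must confirm the subscription
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="stakeholder@example.com")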

3.4 IAM Roles


3.4.1 Roles required for AWS Config Administration


AWS provides a default managed policy named AWSConfigUserAccess. This policy grants users access to AWS Config, including searching by tags on resources and reading all tags. It does not provide permission to configure AWS Config, which requires administrative privileges; it only grants read-only access to AWS Config.

For full administrative access to AWS Config, we need to create the custom policy below and attach it to a group or directly to the user.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sns:AddPermission",
        "sns:CreateTopic",
        "sns:DeleteTopic",
        "sns:GetTopicAttributes",
        "sns:ListPlatformApplications",
        "sns:ListTopics",
        "sns:SetTopicAttributes"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:CreateBucket",
        "s3:GetBucketAcl",
        "s3:GetBucketLocation",
        "s3:GetBucketNotification",
        "s3:GetBucketPolicy",
        "s3:GetBucketRequestPayment",
        "s3:GetBucketVersioning",
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListBucketVersions",
        "s3:PutBucketPolicy"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "iam:CreateRole",
        "iam:GetRole",
        "iam:GetRolePolicy",
        "iam:ListRolePolicies",
        "iam:ListRoles",
        "iam:PassRole",
        "iam:PutRolePolicy",
        "iam:AttachRolePolicy",
        "iam:CreatePolicy",
        "iam:CreatePolicyVersion",
        "iam:DeletePolicyVersion",
        "iam:CreateServiceLinkedRole"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "cloudtrail:DescribeTrails",
        "cloudtrail:GetTrailStatus",
        "cloudtrail:LookupEvents"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "config:*",
        "tag:Get*"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "ssm:DescribeDocument",
        "ssm:GetDocument",
        "ssm:DescribeAutomationExecutions",
        "ssm:GetAutomationExecution",
        "ssm:ListDocuments",
        "ssm:StartAutomationExecution"
      ],
      "Resource": "*"
    }
  ]
}

3.4.2 Service-Linked Roles for AWS Config


AWS Config uses AWS Identity and Access Management (IAM) service-linked roles. A service-linked role is a unique type of IAM role that is linked directly to AWS Config. Service-linked roles are predefined by AWS Config and include all the permissions that the service requires to call other AWS services on our behalf, so we don't have to add the necessary permissions manually. Only AWS Config can define the permissions of its service-linked role, and this permissions policy cannot be attached to any other IAM entity. The service-linked role is named AWSServiceRoleForConfig and is used to call other AWS services on our behalf.

3.4.3 Roles Required to develop Lambda Function


We can use identity-based policies in AWS Identity and Access Management (IAM) to grant users in our account access to Lambda. Identity-based policies can apply to users directly, or to groups and roles that are associated with a user. We can also grant users in another account permission to assume a role in our account and access our Lambda resources.

Lambda provides AWS managed policies that grant access to Lambda API actions and, in some cases, access to other AWS services used to develop and manage Lambda resources. Lambda updates these managed policies as needed to ensure that our users have access to new features when they are released.

  • AWSLambda_FullAccess – Grants full access to Lambda actions and other AWS services used to develop and maintain Lambda resources. This policy was created by scoping down the previous policy AWSLambdaFullAccess.

  • AWSLambda_ReadOnlyAccess – Grants read-only access to Lambda resources. This policy was created by scoping down the previous policy AWSLambdaReadOnlyAccess.

  • AWSLambdaRole – Grants permissions to invoke Lambda functions.


3.4.4 Execution Roles for Lambda Function


A Lambda function's execution role is an AWS Identity and Access Management (IAM) role that grants the function permission to access AWS services and resources. We provide this role when we create a function, and Lambda assumes the role when our function is invoked. We can create an execution role for development that has permission to send logs to Amazon CloudWatch and to upload trace data to AWS X-Ray.

Here we are using the Lambda function to get records from DynamoDB and fetch credentials from Secrets Manager, so we only need read access to DynamoDB and Secrets Manager. We therefore use the policy below for the execution role.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:DescribeSecret",
        "dynamodb:DescribeContributorInsights",
        "dynamodb:Scan",
        "dynamodb:Query",
        "dynamodb:DescribeStream",
        "secretsmanager:ListSecretVersionIds",
        "dynamodb:DescribeGlobalTableSettings",
        "secretsmanager:GetResourcePolicy",
        "secretsmanager:GetSecretValue",
        "dynamodb:PartiQLSelect",
        "dynamodb:DescribeGlobalTable",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeExport",
        "dynamodb:DescribeBackup",
        "dynamodb:GetRecords"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:<Account>:table/*/backup/*",
        "arn:aws:dynamodb:*:<Account>:table/*/index/*",
        "arn:aws:dynamodb::<Account>:global-table/*",
        "arn:aws:dynamodb:*:<Account>:table/*/export/*",
        "arn:aws:dynamodb:*:<Account>:table/*/stream/*",
        "arn:aws:secretsmanager:*:<Account>:secret:*"
      ]
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "dynamodb:BatchGetItem",
        "dynamodb:ConditionCheckItem",
        "dynamodb:DescribeContributorInsights",
        "dynamodb:Scan",
        "dynamodb:ListTagsOfResource",
        "dynamodb:Query",
        "dynamodb:DescribeTimeToLive",
        "dynamodb:PartiQLSelect",
        "dynamodb:DescribeTable",
        "dynamodb:GetItem",
        "dynamodb:DescribeContinuousBackups",
        "dynamodb:DescribeKinesisStreamingDestination",
        "dynamodb:DescribeTableReplicaAutoScaling"
      ],
      "Resource": "arn:aws:dynamodb:*:<Account>:table/*"
    },
    {
      "Sid": "VisualEditor2",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetRandomPassword",
        "dynamodb:DescribeReservedCapacityOfferings",
        "dynamodb:DescribeReservedCapacity",
        "dynamodb:DescribeLimits",
        "dynamodb:ListStreams"
      ],
      "Resource": "*"
    }
  ]
}

3.4.5 Roles required to access Secrets Manager, DynamoDB and EventBridge


For administering Secrets Manager, DynamoDB, and EventBridge, we need full access to these AWS services. We can also combine the access policies for these services into one policy and create a role for it, and later assign that policy to a group or directly to a user to grant access to these AWS services.

We now use the policy below to create a role that will be used for operations on Secrets Manager, DynamoDB, and EventBridge.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:DescribeSecret",
        "dynamodb:DescribeContributorInsights",
        "dynamodb:Scan",
        "dynamodb:Query",
        "dynamodb:DescribeStream",
        "secretsmanager:ListSecretVersionIds",
        "dynamodb:DescribeGlobalTableSettings",
        "secretsmanager:GetResourcePolicy",
        "secretsmanager:GetSecretValue",
        "dynamodb:PartiQLSelect",
        "dynamodb:DescribeGlobalTable",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeExport",
        "dynamodb:DescribeBackup",
        "dynamodb:GetRecords"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:<Account>:table/*/backup/*",
        "arn:aws:dynamodb:*:<Account>:table/*/index/*",
        "arn:aws:dynamodb::<Account>:global-table/*",
        "arn:aws:dynamodb:*:<Account>:table/*/export/*",
        "arn:aws:dynamodb:*:<Account>:table/*/stream/*",
        "arn:aws:secretsmanager:*:<Account>:secret:*"
      ]
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "dynamodb:BatchGetItem",
        "dynamodb:ConditionCheckItem",
        "dynamodb:DescribeContributorInsights",
        "dynamodb:Scan",
        "dynamodb:ListTagsOfResource",
        "dynamodb:Query",
        "dynamodb:DescribeTimeToLive",
        "dynamodb:PartiQLSelect",
        "dynamodb:DescribeTable",
        "dynamodb:GetItem",
        "dynamodb:DescribeContinuousBackups",
        "dynamodb:DescribeKinesisStreamingDestination",
        "dynamodb:DescribeTableReplicaAutoScaling"
      ],
      "Resource": "arn:aws:dynamodb:*:<Account>:table/*"
    },
    {
      "Sid": "VisualEditor2",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetRandomPassword",
        "dynamodb:DescribeReservedCapacityOfferings",
        "dynamodb:DescribeReservedCapacity",
        "dynamodb:DescribeLimits",
        "dynamodb:ListStreams"
      ],
      "Resource": "*"
    }
  ]
}

This is the end of Part I. In the next part, Compliance Check of SAP Systems Using AWS Config - Part II, you can find more information about the deployment and configuration of AWS Config rules, remediation, dashboards, and timelines.