SAP Migration to GCP – What it takes for a Successful Migration
This blog is for those interested in learning more about migrating SAP workloads to GCP. There are plenty of reasons for moving IT workloads to a cloud provider: organizations do it to achieve or keep a leadership position in the market, continue innovating, gain an edge over the competition, and improve their bottom line. You may already be on the Cloud journey, or you may be hearing a lot about migrating IT workloads to a public cloud and have questions such as those below about migrating SAP workloads to GCP. This blog tries to answer those questions and help you with a successful migration.
- Can I migrate my workload to GCP?
- How difficult is it to migrate to GCP?
- What would be the cost of Migration? What would be my cost of operation in the Cloud? Is it feasible and will I have a good ROI?
- How to reduce my cost of migration and cost of operation in the Cloud?
Let us discuss factors involved in each of the above questions:
- Can I migrate my workload to Cloud?
- Only select supported versions of SAP products can be migrated to the Cloud, and only onto the list of certified GCP hardware.
- The certification list keeps growing as both SAP and GCP continue to test and issue new certifications. SAP and GCP related notes are periodically updated.
- SAP’s Product Availability Matrix (PAM) provides the list of OS/DB combinations supported for each SAP application and version. You can start with the PAM and then narrow down with the GCP-specific SAP notes below to check whether an application can be migrated to the Cloud.
- SAP Notes that are relevant for GCP Migrations are:
| Cloud service provider or IaaS provider | Support requirements | Supported SAP applications | Instance types supported with SAP HANA |
|---|---|---|---|
| Google Cloud Platform | 2456406 | 2456432, 3000343 | SAP HANA Directory |
Source: SAP Master note: 1380654 – SAP support in IaaS environments
- Only applications supported per the notes above can be migrated to GCP.
- Only RHEL/SLES and Windows operating systems are supported in GCP. Versions of RHEL and SLES tailored specifically for SAP, such as SLES for SAP and RHEL for SAP, are available for use with DB instances.
- Supported databases are DB2, SAP ASE, MaxDB, and SAP liveCache on RHEL and SLES, and DB2, SQL Server, SAP ASE, MaxDB, and SAP liveCache on Windows. Oracle is supported through Bare Metal Solution (BMS) in GCP. BMS is a managed solution that provides purpose-built HPE or Atos bare-metal servers in regional extensions, connected to Google Cloud by a managed, high-performance connection with a low-latency network fabric.
- SAP application solutions that are certified by SAP to run on Google Cloud are: https://cloud.google.com/solutions/sap/docs/certifications-sap-apps
- HANA specific memory-optimized VMs are: https://cloud.google.com/solutions/sap/docs/certifications-sap-hana
- Server configurations such as o2-ultramem-672-metal with 336 CPUs and 18 TB of memory, or o2-ultramem-896-metal with 448 CPUs and 24 TB of memory, are available in BMS. For more details on BMS, see: https://cloud.google.com/bare-metal/docs/bms-planning
- Support for older releases such as R/3 4.6C: though it is possible to migrate older releases to the Cloud, they may not be supported by SAP. You would also need installation media and a compatible OS/DB combination offered by the provider. If a combination is not currently supported in your data center, the same applies in the Cloud. If it is a display-only environment, you may be able to host it in the Cloud without support.
- Access to resources is controlled through Identity and Access Management
- The largest memory-optimized single VM type that can be hosted in GCP offers 12 TB of memory; BMS takes this up to 24 TB. Source: GCP websites as of Sep 2021.
- Scale-out can go up to 16 nodes of m2-megamem-416 (5,888 GB each) for OLAP workloads, or 4 nodes of m2-ultramem-416 (11,776 GB each).
- Please check https://cloud.google.com/solutions/sap/docs/certifications-sap-hana for the latest info on max capacity available on GCP
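As a quick sanity check on the scale-out figures above, total cluster capacity is simply node count times per-node memory. A minimal Python sketch using the Sep 2021 figures quoted above (verify against the certification page for current values):

```python
# Rough scale-out capacity check using the certified node counts and
# per-node memory sizes quoted above (values as of Sep 2021; check the
# GCP SAP HANA certification page for current limits).

def scaleout_capacity_gb(nodes: int, memory_gb_per_node: int) -> int:
    """Total memory across a HANA scale-out cluster."""
    return nodes * memory_gb_per_node

# 16 nodes of m2-megamem-416 (5,888 GB each) for OLAP workloads
olap_megamem = scaleout_capacity_gb(16, 5888)
# 4 nodes of m2-ultramem-416 (11,776 GB each)
olap_ultramem = scaleout_capacity_gb(4, 11776)

print(olap_megamem, olap_ultramem)
```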
- How difficult is it to migrate to Cloud?
Factors that decide the difficulty of migrations are:
- DB size, DB type, downtime available for the production cutover, whether the migration is homogeneous or heterogeneous, and whether an OS/DB upgrade, SAP application upgrade, or Unicode conversion is required. Smaller environments on the latest SAP versions are easier to migrate; otherwise, depending on source and target versions, the migration needs to be done in steps, with upgrades before or after, or combined, as with DMO for HANA.
- Heterogeneous migrations are more complicated because they involve an SWPM export from the source system and an import into the target.
- Homogeneous migrations are smoother and simpler, involving just backup and restore, lift and shift, or applicable migration tools provided by the cloud providers.
- For large DBs with downtime restrictions, the migration must be done with replication methods such as HANA System Replication (HSR), RMAN in the case of Oracle, or other DB-specific tools. There are licensed tools such as GoldenGate that provide better options for migrating Oracle databases but add to the cost of migration.
- For heterogeneous migrations, parallel export and import in SWPM, combined with table splitting, package splitting, and "order by" options, will speed up the migration. This needs to be tested extensively over multiple iterations and optimizations before the production migration go-live.
- The level of integration between applications and the number of applications determine whether the migration can happen as a big bang or in groups/waves. Integrations to check are those within SAP systems, with other non-SAP systems, with other systems in the DC (if not migrated), and with SaaS products, including SAP's own SaaS products.
- The checklist at https://cloud.google.com/solutions/sap/docs/checklist-sap can help with your migration planning.
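A recurring question when planning the downtime window is how long an SWPM export or import will take. Here is a back-of-the-envelope model assuming near-linear scaling across parallel R3load processes; the throughput figure is a purely illustrative assumption, and real numbers must come from your own test-migration iterations:

```python
# Back-of-the-envelope downtime estimate for a heterogeneous (SWPM
# export/import) migration. The per-process throughput below is an
# illustrative assumption, not a measured value -- calibrate it from
# your own test runs before relying on it.

def migration_hours(db_size_gb: float,
                    gb_per_hour_per_process: float,
                    parallel_processes: int) -> float:
    """Wall-clock hours for one export or import phase, assuming
    near-linear scaling across parallel R3load processes."""
    return db_size_gb / (gb_per_hour_per_process * parallel_processes)

# Example: 4 TB database, assumed 50 GB/h per process, 16 processes
export_h = migration_hours(4096, 50, 16)
print(round(export_h, 2))
```

Table splitting and package splitting matter precisely because they keep all parallel processes busy; one huge unsplit table would break the near-linear assumption above.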
- What would be the cost of Migration? What would be my cost of operation in the Cloud? Is it feasible? Will I have a good ROI?
Factors that affect the cost of migration are:
- VMs type chosen (based on HW architecture)
- CPU – Number of CPUs
- Memory – Amount of Memory
- Local or Persistent Storage type – Standard, SSD etc.
- Cloud Storage for backup – Standard, Nearline, Coldline and Archive
- Specialized high throughput and IOPS requirements
- Standard sizes that are offered and Custom sizes
- Selection of VM types (based on different architectures) suitable for different purposes, e.g. Sandbox/Dev/QA versus Prod systems.
- To understand GCP Machine types please check: https://cloud.google.com/compute/docs/machine-types#m2_machine_types
- In general, memory-intensive HANA systems running on SSD persistent storage are the most expensive, followed by very large CPU-intensive and GPU-based systems. The expensive configurations are generally those combining high memory and CPU counts with SSD storage.
- If you are moving the DB to HANA, it is best to control the size of the DB before the move.
- Fast growing DBs (year-on-year growth) will also contribute to the cost.
- IaC – Infrastructure as Code for VM builds (using Deployment Manager or Terraform) and automated SAP installation make for faster, smoother, and more cost-effective implementations.
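To see how the factors above combine, here is a toy monthly-cost model. Every unit price in it is a placeholder, not a real GCP rate; use the GCP pricing calculator for actual figures:

```python
# Toy monthly-cost model mirroring the cost factors listed above.
# All unit prices are PLACEHOLDERS, not real GCP rates -- use the
# GCP pricing calculator for actual numbers.

PLACEHOLDER_RATES = {
    "vcpu_month": 20.0,         # per vCPU per month
    "gb_ram_month": 3.0,        # per GB of memory per month
    "ssd_gb_month": 0.17,       # SSD persistent disk per GB
    "std_disk_gb_month": 0.04,  # standard persistent disk per GB
    "backup_gb_month": 0.02,    # Cloud Storage (Standard) per GB
}

def monthly_cost(vcpus, ram_gb, ssd_gb, std_gb, backup_gb,
                 r=PLACEHOLDER_RATES):
    """Sum the main cost components: compute, memory, disks, backups."""
    return (vcpus * r["vcpu_month"] + ram_gb * r["gb_ram_month"]
            + ssd_gb * r["ssd_gb_month"] + std_gb * r["std_disk_gb_month"]
            + backup_gb * r["backup_gb_month"])

# Memory-optimized HANA VM on SSD versus a small app server on HDD
hana = monthly_cost(vcpus=416, ram_gb=5888, ssd_gb=8000, std_gb=0,
                    backup_gb=12000)
app = monthly_cost(vcpus=8, ram_gb=64, ssd_gb=0, std_gb=500,
                   backup_gb=1000)
print(round(hana), round(app))
```

Even with made-up rates, the model makes the earlier point concrete: memory and SSD dominate the bill for HANA-class VMs, which is why controlling DB size before moving to HANA pays off.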
- How to reduce my cost of migration and cost of operation in the Cloud?
- Choose the right size for target VMs; go smaller if you are not sure. Inputs can come from EWA reports, server utilization (CPU and memory), and DB growth info.
- Choose VM type (based on different Architecture) for Sandbox/Dev/QA or Prod Systems. Prices vary based on type and performance.
- Choose the right storage type, SSD or standard. For example, SSD for Prod and standard HDD for Non-Prod, or per your specific requirements.
- Choose the Right Cloud Storage types for backups or other purposes.
- Move inactive data to Coldline or Archive storage if you access it rarely (access cost increases from Standard to Archive storage, while storage cost decreases). Access speed is the same across the different Cloud Storage classes.
- It is easy to Scale-Up if you are running on the same VM architecture with just a restart.
- Moving to a different architecture is also not difficult if downtime allows.
- A single disk for all filesystems is an option unique to GCP: the larger the disk, the higher the throughput and IOPS it provides. You can leverage this feature to save on storage cost.
- Pay only for the time you use (pay-as-you-go). Schedule stop and start for non-critical systems that are not in use; when a system is shut down, you pay only for the storage used by the VM. There are also deep discounts for committed use.
- Choose the right OS/DB as license cost varies. Standardize as applicable.
- Use Images/snapshots to rebuild applications any time and save cost
- Use Calculators provided by GCP and compare the Cost of DC On-Prem yearly and hosting applications in Cloud. GCP Calculator: https://cloud.google.com/products/calculator
- There is also a Google sizing template sheet (in Google Docs), available from your GCP contacts, that is very useful for understanding the overall cost of infrastructure along with the timelines.
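As an illustration of the pay-as-you-go point above, compare an always-on sandbox VM with the same VM on a stop/start schedule. The hourly rate and disk cost are placeholder assumptions, not real GCP prices:

```python
# Illustration of scheduled stop/start savings for non-critical
# systems. The VM rate and disk cost are placeholder assumptions,
# not real GCP prices.

HOURS_PER_MONTH = 730

def monthly_compute_cost(vm_rate_per_hour: float, hours_running: float,
                         disk_cost_per_month: float) -> float:
    """While a VM is stopped, you pay only for its persistent disks."""
    return vm_rate_per_hour * hours_running + disk_cost_per_month

# Sandbox VM at a placeholder $2.50/h with $40/month of disk
always_on = monthly_compute_cost(2.50, HOURS_PER_MONTH, 40.0)
# Same VM running 12 h/day on weekdays only (~260 h/month)
scheduled = monthly_compute_cost(2.50, 260, 40.0)
print(round(always_on), round(scheduled))
```

Under these assumed rates the scheduled VM costs roughly a third of the always-on one, which is why stop/start automation for Sandbox/Dev systems is usually the first cost lever to pull.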
To summarize, some of the major benefits that migrating to the Cloud can bring are:
- Build and Drop VMs any time – Faster VM builds for Implementation/Staging/Mock Runs, PoCs, Short term Testing
- Scalability – Scale-Up when required with just a restart. Scale-out options as applicable.
- Better security control: hardware-level security moves to the cloud provider, while application-level security stays with you under IaaS
- Better RTO and RPO, Uptime, Network performance
- Implementing a Global Solution is easier
- Enable move from IaaS to PaaS, AaaS and Serverless Architecture such as Kubernetes, App Engine and Cloud Functions/Run where applicable
- Leverage DW options such as BigQuery and other Managed DB services offered by the Cloud Provider. Access to Innovations in the areas of Artificial Intelligence & Machine Learning.
I think we have covered the essentials of migrating SAP workloads to GCP. If you find anything still missing, let me know and I will address it in my next update. Enjoy your journey to the Cloud!
Thanks. What is the role of an SAP Security consultant in the overall migration project?
Hi Yogita, the responsibility of the teams involved in a migration, including the SAP Security consultant, is to ensure that everything that worked before the migration works the same way (or better) afterward. Because the applications are hosted in a new environment (a different location), areas such as communication with GRC, single sign-on setups, and the ability to continue developing and managing roles and user assignments need to be reviewed and validated. So it is important to understand which applications are being migrated first, what changes in that process, and how it affects the remaining systems in the landscape, and then to plan and implement changes at the right time. You will also be checking how the migration affects application connectivity from other SaaS applications and interfaces whose user IDs/passwords are controlled by the Security teams.