I hope that ten episodes of the SAP on Azure series gave you enough reasons to host your system in the cloud. But so far, I haven't touched on the topic of migration. In the DMO with system move post, I presented one of the possible ways to move the workload to Azure, but there are many more options available. Depending on your requirements, you can use different tools and methods to perform the migration seamlessly and without impacting end users.
We can distinguish two main migration scenarios:
- Homogeneous system copy – when the database and operating system stay the same
- Heterogeneous system copy – when the database or operating system or both are changed
But the above is just a high-level overview. When planning the migration, you need to consider many more factors. Is the system supported on the new platform, or does it require an upgrade? How much downtime can I afford? How will the data be transferred? Let's check the possible tactics and find the right one for you!
Export / Import
The classical way of performing the migration. All the data stored in the database is exported to the filesystem and then imported into the target environment. The migration process is highly customizable: it can be used for heterogeneous migrations and gives you the opportunity to influence the execution process and minimize the downtime. The exported data consumes around 20-40% of the uncompressed database size, which makes this one of the best methods for migrating your system to a public cloud, where data transfer speed is an important factor. The process can also be combined with a Unicode conversion.
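To get a feeling for what the 20-40% footprint means in practice, here is a back-of-the-envelope sketch. The footprint ratio comes from the paragraph above; the database size and link bandwidth are purely illustrative assumptions, not figures from any specific project.

```python
# Rough estimate of export dump size and transfer time for a classical
# export/import migration. The 20-40% footprint is quoted in the text;
# the 2 TB database and 1 Gbps link are illustrative assumptions.

def export_size_gb(db_size_gb: float, ratio: float = 0.3) -> float:
    """Estimated dump size as a ratio of the uncompressed database size."""
    return db_size_gb * ratio

def transfer_hours(size_gb: float, bandwidth_mbps: float) -> float:
    """Hours needed to move `size_gb` over a `bandwidth_mbps` link."""
    size_megabits = size_gb * 1024 * 8  # GB -> MB -> megabits
    return size_megabits / bandwidth_mbps / 3600

db = 2048  # a hypothetical 2 TB database
dump = export_size_gb(db)  # ~614 GB at a 30% footprint
print(f"dump size: {dump:.0f} GB")
print(f"transfer at 1 Gbps: {transfer_hours(dump, 1000):.1f} h")
```

Even a conservative estimate like this shows why the export/import approach is attractive for cloud migrations: moving a few hundred gigabytes is a very different problem from moving the full multi-terabyte database.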
Database backup / restore
In my opinion, copying the database is the easiest way to perform a system migration; however, it won't give you as much flexibility as the classical approach. It's a great method when the database and operating system are the same in the source and target systems (homogeneous system copy). The data footprint is quite large, so a solid internet connection is required – I recommend using ExpressRoute, which is a dedicated link to Azure, but you can also use a VPN connection.
Database backup / restore with log shipping
An enhancement to the database backup and restore method that minimizes downtime for large databases. In this scenario, the database is transferred ahead of the migration, and database log replication is set up between the source and target environments. When the source system is shut down, you can start the target environment almost immediately, as all changes have been continuously replicated.
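The downtime saving can be modeled very simply: with plain backup/restore, the whole backup has to be moved and restored during the outage window, while with log shipping only the last log delta does. All sizes and throughput figures below are illustrative assumptions.

```python
# Rough model of the downtime saved by log shipping: the full backup is
# shipped and restored while the source still runs, so only the final log
# backup must be moved during the cutover. Figures are assumptions.

def downtime_hours(data_gb: float, throughput_gb_per_h: float) -> float:
    """Time to move and apply `data_gb` at the given end-to-end throughput."""
    return data_gb / throughput_gb_per_h

full_backup_gb = 1500  # hypothetical compressed backup size
final_log_gb = 10      # log delta produced since the last shipment
throughput = 200       # GB/h over the link, end to end

plain = downtime_hours(full_backup_gb, throughput)  # backup/restore only
shipped = downtime_hours(final_log_gb, throughput)  # log shipping cutover
print(f"backup/restore downtime: {plain:.1f} h")
print(f"log shipping cutover:    {shipped:.2f} h")
```

The model ignores restore and validation time on the target, but it captures why the technique matters: the cutover window shrinks from the full backup transfer to just the final log delta.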
SUM/DMO with system move
The pure DMO was designed to simplify the database migration process to SAP HANA. It is very useful when you need to perform a database migration, system upgrade, and Unicode conversion at the same time. Running the process with the system move option lets you use a cloud platform as the target environment; however, in that case the downtime optimization feature is currently not supported. DMO cannot be used if the source and target databases are the same.
There are a few restrictions when using this tool. My recommendation is to carefully check whether this method is the best choice for you.
Disk copy
If the system is currently running in a virtualized environment, and there is no requirement to change the database or operating system, you can copy the underlying persistence and build an Azure VM from the copied disks. In theory this is the easiest way to move the system to the cloud, but in my opinion it is much better to use the opportunity to re-build the environment with one of the previously described methods. An up-to-date operating system is required to ensure the system will boot from the copied disks. The approach can be combined with Azure Site Recovery to minimize downtime, but in that case additional attention is required to ensure database consistency.
The list above is not exhaustive. Some database vendors have created additional tools to make the migration easier. Always ensure that the process is supported by SAP and Microsoft.
An important factor in system migration to Microsoft Azure is the data transfer between the legacy data center and the cloud platform. A dedicated, high-bandwidth connection is recommended to move the files quickly. In a classical migration, it is possible to use an FTP server to parallelize the database export and import, which can shorten the total execution time. In all other cases, the best results can be achieved using the dedicated Microsoft tool AzCopy. Copying multiple files at once ensures the highest bandwidth utilization. Some databases allow splitting the backup into multiple files, which decreases the time required for both the backup itself and the data transfer.
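The core idea behind AzCopy's speed is concurrency: many files in flight at once keep the link saturated. A minimal standard-library sketch of that pattern, with plain `shutil.copy` and local temporary files standing in for the real upload:

```python
# A minimal sketch of the idea behind AzCopy's concurrency: copying many
# files in parallel keeps the link busy. shutil.copy and local temp files
# stand in for the real cloud upload; nothing here calls AzCopy itself.

import shutil
import tempfile
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor

def copy_all(files, target_dir, workers=8):
    """Copy every file into target_dir using up to `workers` parallel streams."""
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        jobs = [pool.submit(shutil.copy, f, target / Path(f).name) for f in files]
        return [j.result() for j in jobs]

# demo: a few small local files standing in for split backup volumes
src = Path(tempfile.mkdtemp())
dst = Path(tempfile.mkdtemp()) / "out"
parts = []
for i in range(4):
    p = src / f"backup.part{i}"
    p.write_bytes(b"x" * 1024)
    parts.append(p)

copied = copy_all(parts, dst)
print(len(copied))  # 4
```

This is also why splitting a backup into multiple files helps: a single large file is one stream, while several smaller files can be transferred concurrently.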
For very large datasets, another option is the Azure Data Box service. An external device is shipped between the data center and Azure to move the data when an online transfer is not possible or would take a very long time. The operation can usually take up to 10 days, so I wouldn't recommend it for SAP solutions due to the very long system downtime.