Export from multiple source databases to speed up migration or Unicode Conversion
Nearly 1.05 TB/hr export speed, and an overall 510 GB/hr processing rate from the non-Unicode source to the Unicode target!
Performed a 1st upgrade, a Unicode conversion, and a 2nd upgrade, taking a 4.6C non-Unicode MDMP system to ERP 6.0 EHP7 Unicode in a single 72-hour downtime window!
While the above is only the 2nd-fastest migration speed I have achieved, it still felt good to obtain these impressive numbers from Windows 2003 & SQL Server 2005 source databases during my recent 4.6C to EHP7 Combined Upgrade and Unicode Conversion (CUUC). Yes, source databases in the plural, because I exported from two copies of the source database to cut the export down to a short 6.8-hour duration for a 7.14 TB database.
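As a quick sanity check on the headline number, the export rate follows directly from the figures above (a back-of-the-envelope calculation, not part of the migration toolset):

```python
# Back-of-the-envelope check of the export rate quoted above.
db_size_tb = 7.14    # used size of the source database, in TB
export_hours = 6.8   # export duration achieved with two source copies

rate_tb_per_hr = db_size_tb / export_hours
print(f"{rate_tb_per_hr:.2f} TB/hr")  # prints "1.05 TB/hr"
```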
The implementation concept was to copy the production database, at the right time after the Unicode preparation steps are done, to a 2nd database server through storage cloning, and to stand up a VM running SQL Server with exactly the same disk layout as the source. This leaves you with two copies of the production database in a state where either one can be exported; let us call them source1 and source2. Use the Distribution Monitor tool to export half the packages from source1 and the remaining half from source2. Of course, one has to choose which packages to export from each source based on more intelligent criteria than blindly splitting them half and half, but for the sake of discussion let us stick with that for the time being. Since you are exporting only half the packages from each database server, you can theoretically cut the export duration in half. I have implemented this idea on two projects with impressive results. The figure below illustrates this scenario:
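The half-and-half split can be sketched as a simple greedy balancing of packages by estimated export effort. Everything below is illustrative, not a Distribution Monitor feature: the package names and sizes are made up, and in practice the assignment would also weigh table splits, package ordering, and I/O contention.

```python
# Illustrative sketch: greedily assign export packages to the two source
# copies so that the estimated work on each side is roughly equal.
# Package names and sizes (in GB) below are hypothetical.
packages = {
    "GLPCA": 900, "COEP": 750, "BSEG-1": 600, "BSEG-2": 600,
    "MSEG": 500, "CDCLS": 400, "EDI40": 300, "SAPAPPL2": 250,
}

source1, source2 = [], []
load1 = load2 = 0
# Assign largest packages first, always to the currently lighter source.
for name, size in sorted(packages.items(), key=lambda kv: -kv[1]):
    if load1 <= load2:
        source1.append(name); load1 += size
    else:
        source2.append(name); load2 += size

print("source1:", source1, load1)
print("source2:", source2, load2)
```

With a reasonably balanced split, each source exports only about half the total volume, which is where the theoretical halving of export time comes from.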
If you have the hardware, you can extend this technique further by creating more source databases and reducing the export time even more.
I have to mention that all the other standard tools (table splitter, package splitter and package ordering) were also used in the effort to minimize the downtime. Additionally, I removed the 9 longest-running indexes from the import process and built those indexes with more parallelism during the post-import steps.
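The post-import index builds can be scripted. The sketch below generates SQL Server CREATE INDEX statements with a raised MAXDOP for extra parallelism; the table, index and column names are hypothetical, since the actual 9 indexes are not listed here.

```python
# Hypothetical sketch: generate post-import rebuild statements for indexes
# that were removed from the R3load import, using SQL Server's MAXDOP and
# SORT_IN_TEMPDB index options. Names below are illustrative only.
removed_indexes = [
    ("BSEG",  "BSEG~0",  ["MANDT", "BUKRS", "BELNR", "GJAHR"]),
    ("GLPCA", "GLPCA~1", ["MANDT", "RYEAR", "RBUKRS"]),
]

def rebuild_statement(table, index, columns, maxdop=8):
    cols = ", ".join(columns)
    return (f"CREATE INDEX [{index}] ON [{table}] ({cols}) "
            f"WITH (MAXDOP = {maxdop}, SORT_IN_TEMPDB = ON);")

for table, index, columns in removed_indexes:
    print(rebuild_statement(table, index, columns))
```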
Below is the table showing each server’s cores and memory:
DB used size: 7.14 TB
The approach to get from 4.6C/Windows 2003/SQL Server 2005 to ERP 6.0 EHP7 Unicode/Windows 2012/SQL Server 2012 in a single downtime window is:
4.6C --(upgrade 1)--> ERP 6.0 EHP5 NUC --(Unicode conversion)--> ERP 6.0 EHP5 UC --(upgrade 2)--> ERP 6.0 EHP7 UC
<---------------------------- CUUC ---------------------------->
It was not possible to do a direct CUUC to EHP7, as the source operating system was Windows 2003 and EHP7 would not run on it.
The above is the shortest path: shortest in terms of the number of hops, the effort, and the downtime. Determining this shortest path takes research into the PAM and OSS Notes, plus analysis and comparative evaluation of the other possible paths.
The challenge with this shortest upgrade path was that the 2nd upgrade occurs during the single allowed business downtime: the uptime technical steps of upgrade2 to EHP7 have to run while the users are still locked out, as they have been since before the 4.6C upgrade started. To minimize the overall upgrade2 duration, along with the general upgrade tuning options, I chose the “Single System (longer downtime, no shadow instance or shadow instance running exclusively)” option and achieved a short 13.5-hour technical uptime+downtime duration!
Below is a summary of all downtime minimization tools and techniques used, categorized into three areas:
SAP Software Upgrade
• Increased the number of processes running R3load, tp and R3trans during the uptime and downtime phases
• Chose option Single System-no Shadow Instance (For the 2nd upgrade to EHP7 above)
• Ensured generation of objects during upgrade is switched off
• Minimized time for manual steps (fewer screenshots, more practice and familiarity)
• Used faster disk/data storage (used SSD)
• Reduced amount of data through archiving and deletion of unnecessary data
• Created the custom indexes recommended by SAP as part of the CQC assessment
Unicode Conversion (Export/Import)
• Used the Distribution Monitor tool to spread the load across several application servers
• Used table splitter
• Used package splitter
• Sequenced the packages for export/import
• Removed longest running indexes from the export/import process
• Used more than one instance of the source database
• Used faster disk/data storage
• Ensured LAN speeds of more than 1 Gbps
• Reduced amount of data
• Hosted Target DB server on a dedicated host
• Hosted Source DB servers on dedicated hosts
Process and Preparation
• Built Application servers in advance of downtime
• Used storage-level snapshots for backups
• Performed as many tasks as possible before downtime