DMO: introducing the benchmarking tool
This blog introduces the benchmarking tool for checking the migration rate prior to a Database Migration Option (DMO) run. As a prerequisite, you should read the introductory document about DMO: Database Migration Option (DMO) of SUM – Introduction
Tuning the DMO procedure to shorten the downtime is an important task. With Software Update Manager (SUM) 1.0 SP13, we offer an additional important feature, which is to be added as the very first technique in the list of means to reduce the technical downtime:
How to tune DMO downtime
- Before the DMO run: use the benchmarking tool to evaluate the migration rate for an existing system (a small sketch after this list shows how the measured rate translates into downtime)
- Before the DMO run: consider running SUM/DMO on an Additional Application Server (AAS, formerly known as dialog instance) instead of the PAS
- During the DMO test run: adapt the number of R3load processes to balance the performance of the SAP application server
- After a (successful) test run: provide the “table duration files” for the next DMO run to optimize the table split mechanism
- If downtime expectations are not met, consider using “downtime optimized DMO”, see
DMO: downtime optimization by migrating app tables during uptime (preview)
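To put the migration rate into perspective, here is a minimal sketch of how a measured rate could be translated into the migration portion of the downtime. All values are invented placeholders rather than figures from a real system, and the upgrade phases of DMO come on top of this estimate.

# Rough extrapolation of the migration portion of the DMO downtime.
# All values are invented placeholders; use your own benchmark results instead.

def estimate_migration_hours(db_size_gb: float, rate_gb_per_hour: float) -> float:
    """Return the estimated time in hours to migrate db_size_gb at the measured rate."""
    return db_size_gb / rate_gb_per_hour

# Example: a 1.75 TB database migrated at 200 GB/h needs roughly 8.75 hours
# for the migration itself; the upgrade phases of DMO come on top of that.
print(estimate_migration_hours(1750.0, 200.0))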
Scope of the benchmarking tool
- The benchmarking tool offers a fast check of the possible migration speed prior to the DMO run
- The source system may continue to run (uptime)
- You can select specific tables, or use a specific percentage of all tables for the benchmarking run
- You can benchmark the export from the source system only, or benchmark the export and the import to the SAP HANA DB
What is the benchmarking tool
- The benchmarking tool is (ta-da!) SAPup itself, as part of the Software Update Manager (SUM)
- SAPup triggers R3load for the export and import, just like during the DMO run
- A prerequisite is the download folder containing the source and target kernel files, as for DMO
- SAPup will not create a shadow repository and skips other phases as well; that is why the benchmarking run is so fast (and cannot be used for a real migration/system copy)
How to start the benchmarking tool
- Prerequisite is that no DMO run is active. That means if you have started a DMO before, you will have to reset the DMO run, clean up, and stop all SAPup processes
- SUM 1.0: start the “migtool” option of SUM/DMO with a slightly different URL (see the example after this list):
https://<host>:1129/lmsl/migtool/<SID>/doc/sluigui
- SUM 2.0: choose the option from the initial dialog of the common SUM procedure
- The SL UI will show dialogs to select the benchmarking options, and the number of R3load processes to be used
- After the benchmarking run, a dialog will prompt you to analyze the log file to check the migration rate
- It is not possible to use the benchmarking tool while using the “old DMO UI” (with URL suffix /doc/gui)
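If you want to verify that the benchmarking UI is reachable before opening it in the browser, a small check like the following could be used. The port 1129 and the URL path are taken from the SUM 1.0 option above; the host name and SID are placeholders, and certificate verification is switched off only because the host typically presents a self-signed certificate.

import ssl
import urllib.error
import urllib.request

HOST = "sumhost.example.com"   # placeholder: host on which SUM is running
SID = "X01"                    # placeholder: system ID of the source system

url = f"https://{HOST}:1129/lmsl/migtool/{SID}/doc/sluigui"

# Skip certificate verification for this check only (self-signed certificate).
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

try:
    with urllib.request.urlopen(url, context=ctx, timeout=10) as resp:
        print(f"{url} answered with HTTP {resp.status}")
except urllib.error.HTTPError as err:
    # An HTTP error (for example 401 asking for credentials) still means the endpoint is up.
    print(f"{url} answered with HTTP {err.code}")
except OSError as exc:
    print(f"{url} is not reachable: {exc}")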
Things to keep in mind
- Tuning: adapt the number of R3load processes so that the application server capacity is used optimally (as in DMO), and then analyze the log file to check the migration rate (a small sketch after this list compares the rates of several runs)
- Naming: the R3load processes for “UT” are used for the preparation (determining the tables for export), and the R3load processes for “DT” are used for the export (and import, if selected), so UT and DT are not an indication of uptime or downtime (as far as the configuration of R3load processes is concerned)
- You will have to fulfill the same requirements for the source database software release as for DMO. If you use a lower database software version, you may get cryptic error messages
- Uptime or downtime benchmarking: you may consider shutting down the SAP system to tune the optimal number of R3load processes for the DMO downtime run
- Documentation: the benchmarking tool is described in the DMO guide, section “Migration Tools”
- Migtools: “Migration Tools” are a) the benchmarking tool, and b) the standalone table comparison option of SUM. The latter is used for classical migration and is described in the guide of the Software Provisioning Manager (SWPM)
- Benchmarking the export: if you select the option to only export selected data from the source database, SAPup will trigger the R3load with the option “discard”, which will not create any files. This allows you to analyze the speed of the export from the source database without creating files, as the DMO run will not create export files either
- Using the benchmarking tool to do a system copy is not supported by SAP
- [Added on August 29th] For an Oracle database, you will have to provide the BRCONNECT tools in the download folder
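To make the tuning step from the first bullet concrete, here is a minimal sketch of how the migration rates of several benchmark runs with different R3load counts could be compared. The figures are invented for illustration; the actual transferred volume and elapsed time have to be taken from the benchmarking logs.

# Hypothetical benchmark results: R3load count -> (transferred gigabytes, elapsed hours).
# The numbers are invented; take the real figures from your benchmarking logs.
benchmark_runs = {
    16: (80.0, 1.6),
    32: (80.0, 1.0),
    48: (80.0, 0.9),
}

for r3loads, (gigabytes, hours) in sorted(benchmark_runs.items()):
    rate = gigabytes / hours  # migration rate in GB per hour
    print(f"{r3loads:3d} R3load processes: {rate:6.1f} GB/h")

A reasonable choice is the R3load count at which adding further processes no longer improves the rate noticeably, because at that point the application server or the database has become the bottleneck.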
Boris Rubarth
Product Manager, Software Logistics, SAP SE
Hello Boris,
The Table Comparison part of the MIGTOOLS: is it doing exactly the same as the table comparison step/phase of SUM DMO? If there is any difference, could you please elaborate on that? Our understanding of the table comparison in SUM DMO is that it compares the number of rows and the checksum of each table on the source DB with those on the HANA DB after the migration is complete.
I understand that the table comparison in migtools runs with export to the file system. What does it export? All the tables mentioned in the list to be compared (is it a full content export, which would mean that when you run it for a large set, we need corresponding space on the file system)?
Hello Chandrakanth,
thanks for the question, I have now tried to answer it in a new blog:
DMO: table comparison and migration tools
Regards, Boris
Hi Boris,
I am running the benchmarking tool migration, and in the EXECUTION phase I am getting the error below:
Error:
(RTF) ########## WARNING ###########
Without ORDER BY PRIMARY KEY the exported data may be unusable for some databases
(RDI) INFO: /usr/sap/QTY/SUM/abap/migrate/MIGRATE_00055_EXP.STR has format version 2
(EXP) ERROR: DbSlExeRead failed
rc = 99, table "SEOCOMPODF"
(SQL error 5702)
error message returned by DbSl:
[ASE Error SQL21][SAP][ASE ODBC Driver][Adaptive Server Enterprise]WARNING - Fatal Error 823 occurred at Feb 25 2016 7:36PM. Please note the error and time, and contact a user with System Administrator (SA) authorization.
[ASE Error SQL5702][SAP][ASE ODBC Driver][Adaptive Server Enterprise]ASE is terminating this process.
(DB) INFO: disconnected from DB
I updated DBSL to the latest patch, but no result. Any solution, please?
Thanks in advance.
Regards, Biral
Hi Biral,
this seems to be an issue with the source DB (SAP ASE). If searching the internet on this error 5702 does not help (usual hint: update DB), you may have to open an incident. This does not seem to be related to the benchmarking.
Regards, Boris
Hi, Boris,
Thanks. But while testing the benchmarking tool, I got the error below:
Checks after phase MIGTOOL_EXECUTE/SUBMOD_MIG_RUN/EU_CLONE_EXP were negative!
After analyzing the log, I see “(EXP) ERROR: DbSlExeRead failed rc = 99, table "SEOCOMPODF" (SQL error 5702) error message returned by DbSl:”, which is posted above.
I have ASE version 15.7.0.136. Do you mean I should update my DB to a newer version?
Thanks in advance.
Regards, Biral
Hi Boris
At last I've had a chance to play with and tune the “table duration files”. I repeated the DMO twice, first without the table duration files and then with them (upgana.xml, MIGRATE_UT_DUR.XML and MIGRATE_DT_DUR.XML).
However, I don't see any significant improvement in the DMO downtime migration from using the table duration files. I've analyzed the result of the second DMO (with table duration files), and I still see some tables with long export/import durations that started late and eventually lengthened the migration downtime.
May I know how much performance gain you would normally expect from the table duration files?
Also, by using the table duration files, is there any possibility that the performance might get worse?
Hope to hear your valuable input 😉
Thanks,
Nicholas Chang
Hi Boris,
Any input on the above?
Thanks,
Nicholas Chang
Hello Nicholas,
Here are a few observations from my side:
- The migration bucket is divided into 3 sections, and R3load processes work on these sections in sequence.
- The R3load value defined in the configuration phase is used to calculate the split value. Based on the available hardware resources, the maximum possible R3load value can be entered in the configuration phase to achieve the maximum possible split factor. When you enter the execution phase, reduce the R3load value and then increase it according to CPU availability.
- Downtime can be reduced if the imports of the big tables are started in parallel at the very beginning. We can let Boris reply on this part.
Regards,
Ankit
Hi Boris,
Just wondering: since DMO is an upgrade + migration procedure, does the benchmarking tool include the upgrade as part of the benchmark result? Based on the DMO guide, the benchmark focuses on the export and import portion, and only the target kernel is required. Appreciate your insight on this.
Thanks,
Vj
Accenture Innovation Centers for SAP
Hello VJ,
Based on my experience with the benchmarking tool, it is only meant to help you understand the runtime of the export (or export + import) in order to tune the R3load settings. The component upgrade is not considered as part of the benchmark.
Regards,
Ankit
Hi Ankit,
Thanks for the confirmation, we suspected as much. Is there any good rule of thumb for estimating the duration of the upgrade phase during DMO?
Thanks!
Vj
Accenture Innovation Centers for SAP
Hello VJ,
In my experience, most of the time is consumed in the execution phase, when the application data is migrated from the source to HANA. If you use the benchmarking tool with the export + import option, it will really copy the data to the HANA schema and later delete this schema. This process generates migrate_DT logs, which can be analyzed for the runtime. Unless you trigger a run, it is difficult to estimate the runtime.
Regards,
Ankit
Hi Vj,
A long-running upgrade process in the DMO downtime (execution) phase would be TABIM_UPG, and its runtime depends on the number of objects being imported (based on the number of SPS loaded), the number of R3trans processes configured, etc. Do a search on SMP for phase TABIM_UPG for more info.
Thanks,
Nicholas Chang
Hi Nicholas,
Thanks for the tip! The SCN wiki for TABIM_UPG pretty much explains this phase and provides good reference SAP Notes. We'll incorporate this together with the DMO benchmarking tool.
Regards,
Vj
Accenture Innovation Centers for SAP
Hello Boris,
I ran the benchmarking tool on DMO using the migtool URL. The tool is really fantastic, and I appreciate SAP's innovative support of DMO.
I found a few items that SAP could consider for live environments.
I have already shared the benchmarking result logs with SAP on the feedback page.
1. We only get UPGANA.XML as the result of the table export/import.
The tool could generate a pre-calculated report, like the total size of the DB picked up by the benchmarking and the post-import size on HANA.
(Currently we need to find these sizes manually in the EUMIGRATESIZES.LOG file.)
For example, we calculated 80 GB transferred over the pipe to a HANA DB sized 2 G, on a BW system sized 1.75 TB, in around 1 hour 70 min with 35 R3load processes in DT, with the selection “operate on all tables”, 10% of DB size, and the size of the largest table in the sample 1%.
2. Cleanup worked very well, but again I had to extract (uncar) the SUM tool to start the DMO task. This should be mentioned clearly in the guide.
I'll share my lessons learned from this DMO; I am finding very interesting points in SP14.
Not sure why the DMO screen still shows SP13 (dev) at the top 🙂
Thanks again,
Amit Lal
Principal -HANA Consultant
Hello Amit,
If you still want to verify the SUM version, you can check the summanifest.mf file in the SUM directory. You should find lines like these:
release: 1.0
support package: 14
patch number: 1
Regards,
Ankit
Found this helpful, thank you. 🙂
Hi Boris,
Are there any new features/improvements in the latest SUM SP17, especially for DMO and the benchmarking tool?
Regards,
Amit
Hi Amit Lal,
no news on benchmarking.
News are listed on Short history of DMO
Regards, Boris