
System copy/refresh scenario comprising SLT replication to HANA

SLT is one of several solutions that have revolutionized real-time, trigger-based replication to HANA. But given the demands of modern enterprise landscapes, how does it behave in a system copy/refresh scenario, and can replication be resumed without recreating the entire configuration and repeating the initial load? A thorough search through the SAP Service Marketplace or SCN does not yield much assistance. Broaden your search, however, and you might come across something interesting.

The key to the above requirement lies in a non-standard solution that SAP has provided in note 1933605. This note was designed for a migration scenario, as you can see from the excerpt below.

“You are running SAP Landscape Transformation Replication Server for replicating data from a SAP system and you are planning to migrate the database of your source SAP system to another database vendor.

After the migration you would like to keep running the replication of the tables without the need to perform a new initial load.”

The main reason such a solution is required is described in the note in the following excerpt.

“After the successful migration of your source SAP system to the new database from a different vendor, you would have the following status with your SLT replication:

– the SLT triggers will not be migrated, so they will not exist in your source system

– the SLT logging tables will be migrated but their structure (the data type of field IUUC_SEQUENCE) will not correspond to the new database.”

Now how does this relate to a system copy/refresh procedure where the database and OS are probably the same? Think broadly and you will realize that in a migration or system copy scenario you are transferring content from one database to another, and a homogeneous copy is effectively a subset of a heterogeneous copy. The key difference is that in a homogeneous copy using database-specific procedures, the triggers are copied along with the tables.

SAP therefore provided four non-standard programs in note 1933605:

  1. ZMM_DEL_LOG_TAB – “The report will delete the old (obsolete) logging tables. Also will reset ‘active trigger’ flag in SLT, as the triggers in your source system are not active any longer.”
  2. ZMM_CREATE_LOGTAB_TRIGG – “The report will recreate the logging tables and the triggers. Also will set the corresponding flags into the SLT system.”
  3. ZMM_REDEFINE_OBJECTS – “The report will redefine and regenerate the replication migration objects, and the DDIC change of the logging tables will be taken into account. Additionally the report will try to keep the old IDs of the migration objects in order that the latency information will be kept and the setup of any parallel replication will be kept.”
  4. ZMM_RECREATE_OBJECTS – “The report is only for the tables for which the migration objects couldn’t be redefined with the report ZMM_REDEFINE_OBJECTS. In “LTRC transaction -> Data Transfer Monitor” the “generated” and “defined” flag are not set and the flag “failed” is set.” A sketch for checking these flags directly in the underlying table follows this list.
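
For orientation, the flags that ZMM_RECREATE_OBJECTS depends on are stored in table DMC_MT_TABLES on the SLT server (the same table referenced in a comment further down). The following is only a hypothetical sketch of how you could list the tables of a mass transfer ID that are still flagged as failed; the field names (ID, TABNAME, FAILED, GENERATED) are taken from that comment and should be verified against your DMIS release:

   * Hypothetical helper - not part of note 1933605.
   * Lists the tables of a mass transfer ID whose migration objects are
   * flagged as failed in DMC_MT_TABLES, i.e. candidates for
   * ZMM_RECREATE_OBJECTS. Verify the field names in your DMIS release.
   REPORT zslt_list_failed_objects.

   PARAMETERS p_mt_id TYPE dmc_mt_tables-id OBLIGATORY.

   DATA: lt_tables TYPE STANDARD TABLE OF dmc_mt_tables,
         ls_table  TYPE dmc_mt_tables.

   SELECT * FROM dmc_mt_tables
     INTO TABLE lt_tables
     WHERE id     = p_mt_id
       AND failed = 'X'.

   IF lt_tables IS INITIAL.
     WRITE: / 'No failed migration objects for this mass transfer ID.'.
   ELSE.
     LOOP AT lt_tables INTO ls_table.
       WRITE: / ls_table-tabname, 'generated flag:', ls_table-generated.
     ENDLOOP.
   ENDIF.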


So how do we get rid of the triggers after a homogeneous copy? SAP provides the standard transaction IUUC_REMOTE, which can be used to delete triggers as well as logging tables in a source system.


Now, to put everything together in procedure format, we get the following:

  1. The system copy is complete, all post-refresh activities are complete, the proper RFCs have been set, and the master job for the target MT_ID is stopped in the SLT server.
  2. Run IUUC_REMOTE in the source system to delete all triggers. Delete all logging tables as well if you have a shared SLT server.
  3. If you have a shared SLT server, do not run ZMM_DEL_LOG_TAB, as this will destroy your source system replication. If you have separate SLT servers you can run ZMM_DEL_LOG_TAB, but it will not do much because the MT_ID of the source will not exist in the target SLT server. In short, this report is only required if you perform a migration and continue replication with the same MT_ID through the same SLT server.
  4. Run ZMM_CREATE_LOGTAB_TRIGG in the SLT system with the target MT_ID; this should recreate the triggers and logging tables and set the corresponding flags.
  5. Run ZMM_REDEFINE_OBJECTS in the SLT system with the target MT_ID; this should redefine all internal SLT objects against the recreated triggers and logging tables.
  6. Run ZMM_RECREATE_OBJECTS in the SLT system only if ZMM_REDEFINE_OBJECTS fails for certain objects/tables.
  7. If no issues occur, start the master job for the target MT_ID; replication should continue without erasing the table contents in HANA.

The above steps can be adapted to almost any system copy/refresh scenario involving SLT replication. The only drawback of this method is that there can be data in the source system which does not exist in the target HANA database. However, you can simply reload just that specific table from HANA studio and everything should be in sync. At least you don’t have to reload everything, right?
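
If you suspect that a specific table has drifted, one simple, hypothetical spot check is to compare row counts between the source system and the HANA sidecar before deciding what to reload. The sketch below assumes a secondary database connection to HANA maintained in DBCO; the connection name, schema and table used here are placeholders only:

   * Hypothetical consistency spot check - not part of note 1933605.
   * Compares the row count of one replicated table in the source system
   * with its copy in HANA via a DBCO secondary connection.
   * Connection name, schema and table below are placeholders.
   REPORT zslt_compare_rowcount.

   CONSTANTS: gc_dbcon  TYPE dbcon-con_name VALUE 'HANA_SIDECAR',
              gc_schema TYPE string         VALUE 'PR1_SCHEMA',
              gc_table  TYPE string         VALUE 'MARA'.

   DATA: lv_src_count  TYPE i,
         lv_hana_count TYPE i,
         lv_sql        TYPE string,
         lv_msg        TYPE string,
         lo_con        TYPE REF TO cl_sql_connection,
         lo_stmt       TYPE REF TO cl_sql_statement,
         lo_result     TYPE REF TO cl_sql_result_set,
         lr_count      TYPE REF TO data,
         lx_sql        TYPE REF TO cx_sql_exception.

   * Row count of the table in the source (ABAP) system
   SELECT COUNT(*) FROM mara INTO lv_src_count.

   * Row count of the replicated copy in HANA via the DBCO connection
   TRY.
       lo_con  = cl_sql_connection=>get_connection( gc_dbcon ).
       lo_stmt = lo_con->create_statement( ).
       CONCATENATE 'SELECT COUNT(*) FROM "' gc_schema '"."' gc_table '"'
         INTO lv_sql.
       lo_result = lo_stmt->execute_query( lv_sql ).
       GET REFERENCE OF lv_hana_count INTO lr_count.
       lo_result->set_param( lr_count ).
       lo_result->next( ).
       lo_result->close( ).
       lo_con->close( ).
     CATCH cx_sql_exception INTO lx_sql.
       lv_msg = lx_sql->get_text( ).
       WRITE: / 'HANA query failed:', lv_msg.
       RETURN.
   ENDTRY.

   IF lv_src_count = lv_hana_count.
     WRITE: / 'Row counts match:', lv_src_count.
   ELSE.
     WRITE: / 'Out of sync - source:', lv_src_count, 'HANA:', lv_hana_count.
   ENDIF.

A mismatch would point to the table you may need to reload selectively from HANA studio, rather than redoing the whole initial load.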

Maybe in the future SAP will provide a standard procedure, but for now this approach can suffice.

——–Note from SAP——————

The described approach is possible (thank you, Sharan), but there is now also a guide available:

Resume Replication without initial load after System Refresh: https://scn.sap.com/docs/DOC-68026


      24 Comments
      Former Member

      We had ABAP syntax errors with the programs after downloading them from the note. When we checked with SAP, they said it is not supported and that we had to delete the existing configuration when we do the system refresh.

      Former Member

      Guys,

      I had the same issue and opened an OSS message with SAP, also reporting the syntax issues. They corrected the syntax errors and re-uploaded the programs to OSS note 1933605, so the new versions are now available in that note.

      We need some time to run them; I will let you know if I see any issues after we use them.

       

      Thanks


      Former Member

      Venkat

       

      I checked the note; it dates back to 30.10.2013, and a new version is available. Can you check and confirm whether you created the programs from the note?

       

      Mahesh Shetty

      Former Member

      Following is the response from SAP just 2 hours ago.

       

      “I've checked and I've seen now that for two of the four reports attached to note 1933605, there are indeed syntax errors if the recent support packages of DMIS 2011 are used. I have now reworked the coding of these reports (ZMM_CREATE_LOGTAB_TRIGG and ZMM_DEL_LOG_TAB). So if you now download the new versions, no syntax errors should occur any more. So you should be able to proceed according to the guideline given in this note.”

       

       

      thanks

      Venkat

      Former Member

      Thanks Venkat. We are going to download the new programs and see if the syntax errors have been fixed.

      Former Member

      Please let us know if you see any issues; my refresh will not happen until next week as our target system is still being built.

       

      Thx

      Venkat

      Former Member

      Venkat

       

      The programs installed successfully without any syntax errors. We have a refresh coming up next week for one of the test systems and we will use the programs to see if they work as expected. This is going to be a great benefit if they work, as we won't have to reload the initial data for all the tables. We have some tables with 1.4 billion entries that take almost 2-3 days for the initial load.

       

      Also, I am worried about some data loss if we use this configuration. These programs, or SLT itself, should be smart enough to understand where the data replication was stopped and continue loading the remaining data from that point. I am still not familiar with the complete SLT architecture and am trying to learn every day.

       

      I will keep you posted with my updates next week.

       

      Mahesh Shetty

      Former Member

      Hi Mahesh,

       

      We also want to use this procedure, so I'd just like to get confirmation: did it work fine in your case or not?

      Former Member

      Gagan

       

      Nope, it didn't work properly. We had to delete the whole configuration and load the data back into the HANA DB. This will not work for regular system refreshes. I am still waiting for SAP to give us a good technique for handling system refreshes.

       

      Mahesh Shetty

      Former Member

      Hi Mahesh,

       

      I tried this procedure. We were able to load the programs without any syntax errors, but it did not work well in our scenario either.

       

      Regards,

       

      Guriq

      Saritha Koroth

      Hi Mahesh,

       

      Any idea on how SLT configurations can be dealt with during system refreshes?

       

      Regards,

      Saritha.

      Laurent THIBERT

      Hello,

       

      I am thinking of using the report ZMM_REDEFINE_OBJECTS.

       

      I believe there are missing statements, such as the lines I am planning to add (the AND tabname IN s_table lines below):

       

      * reset the generated flag
         UPDATE dmc_mt_tables
         SET generated  = '-'
             in_process = '-'
             failed     = '-'
             loaded     = '-'
         WHERE id = p_mt_id
           AND tabname IN s_table.

       

      ...

      ...

       

      * reset the failed flag
         UPDATE dmc_mt_tables
         SET generated  = '-'
             in_process = '-'
             failed     = '-'
             loaded     = '-'
         WHERE id = p_mt_id AND failed = 'X'
           AND tabname IN s_table.


      Any thoughts? Any feedback from experience?


      Regards,

      Laurent

      kishore chillamcherla

      Note 1933605 and its Z reports are relevant only when you migrate the system to new hardware or a new OS, and the PDF attached to the note will work only if your source and target both get refreshed, i.e. your source ECC and HANA systems should be refreshed at the same time.

      Saritha Koroth

      Tobias,

       

      If we have refreshes being carried out for a landscape like the one below:

      ECC prod -> SLT prod -> HANA prod

      ECC qual -> SLT qual -> HANA quality

      If we refresh from ECC prod to ECC quality and from HANA prod to HANA quality, what will be the impact on the SLT configurations here? Will they continue to run, or would we be required to re-create the configurations?

      Saritha Koroth

      Thanks, Amar, for your prompt response.

      Can you clarify one thing, please:

      In such setups, do we have to maintain a common configuration name (which also becomes our schema name in HANA) across all landscapes?

      Is there any best practice document for such landscapes?

       

      Regards,

      Saritha

      Former Member

      Hello Saritha,

       

      There is no best practice as such for these refresh approaches, but what you can do is create a schema mapping in HANA, so that even if the HANA schema name changes you only need to change the logical pointer of the schema in the mapping and everything will work as is.

       

      Amar Ghuge

      Former Member (Blog Post Author)

      Why do you need to do a schema mapping?

      In SLT and ECC, change DBCO to point to the schema that you want to use in the refreshed HANA system.

      Ensure the RFC from SLT to ECC works and you should be all set in terms of communication.

      You would still need to follow the guide provided in the link to try and avoid the initial load.

       

      Cheers,

      Sharan

      Former Member (Blog Post Author)

      It's not necessary to maintain the same schema name in all environments, but when you refresh prod to QAS, the QAS schema essentially gets the same name as prod.

       

      Just update DBCO with the schema name and HANA connection and you should be able to communicate with HANA.

       

      But in terms of maintaining replication, you need to update the table rs_replication_components with the proper host and SID info of QAS as mentioned in the guide, along with the rest of the reset steps.

       

      Cheers,

      Sharan

      Saritha Koroth

      Thank you Amar and Sharan! I will keep this in mind while setting up the SLT configurations.

       

      Regards,

      Saritha

      Former Member

      Hello,

       

      We followed the SCN document http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10c14106-564e-3310-cf86-be80a6c27996?QuickLink=index&overridelayout=true&60550448941665

      but it still does not work and we still have to perform an initial load.

      Is there any additional step that needs to be performed for this to work?

       

       

      Architecture Setup:

      Production:
      Source system SID: PRD
      SLT system SID: PT1
      Target HANA sidecar system SID: PR1

      Quality:
      Source system SID: QRD
      SLT system SID: QT1
      Target HANA sidecar system SID: QR1

       

      In this case QRD gets refreshed from PRD and QR1 gets refreshed from PR1. The backups are taken at around the same time to keep them as close as possible, but not at exactly the same point in time.

      The MT ID is different between source and target.

      Former Member (Blog Post Author)

      I don't think it should matter if the MT ID is different, because per the steps you are deleting the triggers and logging tables created by the previous MT ID and creating new ones for the new MT ID.

       

      I think the key requirement listed in the guide is: "If the data in the source system and the data in the target system is in sync for the relevant tables".

       

      Have you tried stopping the replication jobs prior to taking a backup? Although I am not sure how feasible this approach would be for a live production system.

       

      Another requisite in the guide - "If the source and target systems are part of a system landscape where data is being replicated by SAP LT Replication Server, then you must ensure that all delta data is replicated. In the SAP LT Replication Server system, you can check the status of any unprocessed records by using the expert function View Unprocessed Logging Table Records in the SAP LT Replication Server Cockpit (transaction LTRC)"

       

      Based on this, if you have any records left in the logging tables, the data is automatically considered out of sync (since you delete the logging tables after the refresh).
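
      If you want to spot-check this manually in the source system before the refresh, a minimal, hypothetical helper could simply count the remaining rows in one generated logging table (supply the generated table name yourself); this is only a sketch, not a replacement for the expert function in LTRC:

      * Hypothetical helper: count the remaining records in one SLT
      * logging table in the source system. The logging table name is
      * generated per replicated table - supply it as a parameter.
      REPORT zslt_count_logtab_records.

      PARAMETERS p_logtab TYPE tabname OBLIGATORY.

      DATA lv_count TYPE i.

      SELECT COUNT(*) INTO lv_count FROM (p_logtab).

      WRITE: / p_logtab, 'still contains', lv_count, 'records.'.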

      Former Member

      http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10c14106-564e-3310-cf86-be80a6c27996?QuickLink=index&overridelayout=true&60550448941665

       

      Hi Amar,

       

      Is the link opening for you all? I am getting an Internal Server Error when I click on it.

      Rama Shankar

      Nice Blog - thanks Guys!