A requirement may come up to add new fields to an LO Cockpit extractor that is up and running in a production environment, i.e. the extractor is delivering daily deltas from the SAP R/3 system to the BW system. Since this change has to be made in the R/3 production system, there is always a risk that the daily deltas of the LO Cockpit extractor will be disturbed. If the delta mechanism is disturbed (the delta queue is broken), there is no other way than re-initializing that extractor. However, such a re-initialization is not easy in terms of time and resources, and no organization would be willing to provide that much downtime for the live reporting based on that extractor.

As we all know, initialization of an LO extractor is a critical, resource-intensive and time-consuming task. The prerequisites for filling the setup tables are: lock the users from transactional updates in the R/3 system and stop all batch jobs that update the base tables of the extractor. Then the setup jobs have to be scheduled with suitable date ranges / document number ranges.

We came across such a scenario, where there was a requirement to add 3 new fields to the existing LO Cockpit extractor 2LIS_12_VCITM. Initialization for this extractor had been done a year earlier and the data volume was high. We adopted a step-by-step approach to minimize the risk of the delta queue getting broken or disturbed. Hopefully this step-by-step procedure will help all of us who have to work through similar scenarios.

Step-by-Step Procedure:

1. Carry out changes in the LO Cockpit extractor in the SAP R/3 Dev system.
Add the new fields to the extractor as per the requirement. These new fields might already be present in the standard supporting structures that you get when you execute "Maintain DataSource" for the extractor in LBWE. If all required fields are present in those supporting structures, just add them using the arrow buttons provided; there is no need to write user exit code to populate them. However, if these fields (or some of the required fields) are not present in the supporting structures, you have to use an append structure and user exit code. The user exit code is needed to populate the newly added fields: you write ABAP code in the user exit under CMOD, in include ZXRSAU01 (a hedged sketch of such an exit is shown after the procedure below). All the above changes will ask you for a transport request; assign an appropriate development class/package and collect all these objects in one transport request.

2. Carry out changes in the BW Dev system for the objects related to this change.
Carry out all necessary changes in the BW Dev system for the objects related to this change (InfoSource, transfer rules, ODS, InfoCubes, queries and workbooks). Assign an appropriate development class/package and collect all these objects in a transport request.

3. Test the changes in the QA systems.
Test the new changes in the SAP R/3 and BW QA systems. Make corrections if needed and include them in follow-up transports.

4. Stop the V3 batch jobs for this extractor.
The V3 batch jobs for this extractor are scheduled to run periodically (hourly, daily, etc.). Ask the R/3 system administrator to put this job schedule on hold or cancel it.

5. Lock out users and batch jobs on the R/3 side and stop the process chain schedule on BW.
In order to avoid changes in the database tables feeding this extractor, and hence any risk of data loss, ask the R/3 system administrator to lock out the users. The batch job schedule also needs to be put on hold or cancelled. Ask the system administrator to clear any pending queues for this extractor in SMQ1/SMQ2; pending or errored V3 updates in SM58 should also be processed.
On the BW production system, the process chain related to the delta InfoPackage for this extractor should be stopped or put on hold.

6. Drain the delta queue to zero for this extractor.
Execute the delta InfoPackage from BW and load the data into the ODS and InfoCubes. Keep executing the delta InfoPackage until you get 0 records with a green light for the request on the BW side. You should also see 0 LUW entries in RSA7 for this extractor on the R/3 side.

7. Import the R/3 transports into the R/3 production system.
In this step we import the R/3 transport request related to this extractor; this also includes the user exit code. Please ensure that there is no syntax error in include ZXRSAU01 and that it is active. Also ensure that objects such as the append structure are active after the transport.

8. Replicate the DataSource in the BW system.
On the BW production system, replicate the extractor (DataSource).

9. Import the BW transport into the BW production system.
In this step we import the BW transport related to this change into the BW production system.

10. Run the program to activate the transfer rules.
Execute program RS_TRANSTRU_ACTIVATE_ALL, enter the InfoSource and the source system name, and execute. This makes sure that the transfer rules for this InfoSource are active.

11. Execute the V3 job manually on the R/3 side.
Go to LBWE and click on Job Control for the application area of this extractor (for 2LIS_12_VCITM it is application 12). Execute the job immediately; it should finish with no errors.

12. Execute the delta InfoPackage from the BW system.
Run the delta InfoPackage from the BW system. Since there has been no data update, this extraction request should be green with zero records (a successful delta extraction).

13. Restore the schedules on the R/3 and BW systems.
Ask the system administrator to resume the V3 update job schedule and the batch job schedule and to unlock the users. On the BW side, restore the process chain schedule.

From the next day onwards (or as per the load frequency), you should receive the deltas for this extractor, with data populated for the new fields as well.
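For reference, here is a minimal sketch of the kind of user exit code mentioned in step 1. It assumes that the extract structure of 2LIS_12_VCITM is MC12VC0ITM, that the three new fields are appended as ZZFIELD1/ZZFIELD2/ZZFIELD3, and that their values can be read from the delivery item table LIPS; these names are illustrative only and have to be adapted to your actual append structure and to the real source of the values.

* Sketch for include ZXRSAU01 (enhancement RSAP0001, EXIT_SAPLRSAP_001).
* ZZFIELD1/2/3 and the lookup on LIPS are assumptions - adapt as needed.
DATA: l_s_item TYPE mc12vc0itm,
      l_tabix  TYPE sy-tabix.

CASE i_datasource.
  WHEN '2LIS_12_VCITM'.
    LOOP AT c_t_data INTO l_s_item.
      l_tabix = sy-tabix.
*     Illustrative lookup; replace with the real derivation of the new fields
      SELECT SINGLE zzfield1 zzfield2 zzfield3
        FROM lips
        INTO (l_s_item-zzfield1, l_s_item-zzfield2, l_s_item-zzfield3)
        WHERE vbeln = l_s_item-vbeln
          AND posnr = l_s_item-posnr.
      IF sy-subrc = 0.
        MODIFY c_t_data FROM l_s_item INDEX l_tabix.
      ENDIF.
    ENDLOOP.
ENDCASE.

Before transporting, it is a good idea to check the exit in the Dev system with the extractor checker (transaction RSA3).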

15 Comments


  1. Rohini Bansal
    Pradip,

    That was a very clear explanation. Good blog.

    But I have a question to ask you: what method do you suggest to populate the values of the new fields for the old records which are already loaded into the InfoCube/ODS?

    I hope I’m clear with my question.

    Regards,
    Rohini

    1. Pradip Patil Post author
      Rohini

      This is a really good question.
      To populate the values of the new fields for old records, here is what I did:

      I created a generic extractor consisting of the new fields plus the key fields of the ODS (in order to maintain uniqueness of records and compatibility with the ODS), and created update rules from it into the existing ODS. I then loaded the data using this extractor, and the values of the new fields were populated for the old records.

      Hope this answers your question.

      Regards
      Pradip

      1. Rohini Bansal
        Thanks Pradip for the quick reply.

        My next question may sound silly, but I still don't want to stop myself from asking.

        At what stage would you perform the step to populate the values for the new fields? I mean, would you populate the values of the new fields first, or only after the first delta (after the changes)?

        Regards,
        Rohini

        1. Pradip Patil Post author
          That's not a problem, Rohini!
          In my opinion, one should populate the values of the new fields for old records AFTER the first delta.
          When you add new fields, your primary focus should be on running the delta and ensuring it is intact.
          Populating the values of the new fields for old records is definitely not that critical, so it can wait for some more time.
          Hope this helps you.

          Regards
          Pradip  

      2. Pradeep Gupta

        Hi Pradip,

        Just wanted to know, why can't we do repair full loads of the old data to populate the corresponding new fields that were added? Being a DSO, it would just overwrite the key figures.

        Just a thought.

        Pradeep.

  2. Hi Pradip,
    nice detailed explanation, even if we run the risk of some redundancy on this topic (among other things, it was already described in the earlier weblog LOGISTIC COCKPIT – WHEN YOU NEED MORE – First option: enhance it !, in the "Enhance it, but mind the queue !" section).
    Just an unbiased suggestion: maybe it would also be good to explain some "underground" concepts, such as the hash solution (refer to OSS Notes 835466 'Using the repair mode of the hash solution' or 834897 'Avoiding inconsistent data with extract structure'), which has never been discussed before in forum or weblog activity...
    Anyway, I hope you keep up the good work and improve on it further in the future!

    Bye,
    Roberto

  3. ragu ram
    Hi Pradip,
    It's really a good blog.
    I would like to thank you, on behalf of SDN members, for such a wonderful step-by-step explanation.
    Regards
    ragu
  4. Xiang Ji
    Dear Pradip,

    Thanks for the blog first!

    Somebody suggested manually changing the field "qdeep" to "0" in function "trfc_get_queue_info" to cheat SAP, so that old deltas can be kept in the cube. Can I ask your opinion?

    Regards,

    Jonathan Ji

  5. uday shankar
    Hi Pradip,
    It is quite late to ask a question, but I saw your blog only recently, which is why I am asking it now. To one of the questions you replied that we can get values of the new fields for the old records by creating a generic DataSource including the new fields and the key fields of the ODS.
    My questions are:-

    1. By which method should the generic DataSource be created (view/table/function module)? If by using a table, then which table should be used?

    2. When data is loaded into the ODS using the generic DataSource, will the ODS then have double records, i.e. the previous ones along with the new ones? Should we load the data from the generic structure into a new ODS?

    Thanks,
    Uday Shankar.

    1. Pradip Patil Post author
      Hello Uday

      1. You can create the generic DataSource using a view, a table or a function module, no problem (a rough sketch of a function-module based DataSource follows this reply).
      You need to ensure that the keys of the ODS are present in the DataSource extract structure, so that the values are assigned to the old records by matching on the ODS key fields.
      2. Since, in the update rules of the ODS, you map the key fields of the ODS to the key fields from the generic DataSource, the data will be overwritten. So the values for the new fields are added to the existing (old) records in the ODS; no double records are created.

      Hope this helps.

      Regards
      Pradip

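      For illustration only, here is a skeleton of a function-module based generic DataSource for such a back-fill, modelled on the standard template RSAX_BIW_GET_DATA_SIMPLE. The function name, the extract structure ZBW_VCITM_NEW (ODS keys VBELN/POSNR plus the new fields) and the selection from LIPS are assumptions; in many cases a simple database view over the base table is sufficient instead.

FUNCTION z_biw_get_vcitm_newfields.
*"----------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
*"     VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*"     VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"  TABLES
*"      I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"      I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"      E_T_DATA STRUCTURE ZBW_VCITM_NEW OPTIONAL
*"  EXCEPTIONS
*"      NO_MORE_DATA
*"      ERROR_PASSED_TO_MESS_HANDLER
*"----------------------------------------------------------------------

  STATICS: s_cursor TYPE cursor,
           s_opened TYPE c.

  IF i_initflag = 'X'.
*   Initialization call of the extraction API - nothing to prepare here
  ELSE.
    IF s_opened IS INITIAL.
*     First data call: open a cursor on the assumed source table LIPS
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT vbeln posnr zzfield1 zzfield2 zzfield3 FROM lips.
      s_opened = 'X'.
    ENDIF.
*   Hand the data back to BW in packages of I_MAXSIZE records
    FETCH NEXT CURSOR s_cursor
      APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
      PACKAGE SIZE i_maxsize.
    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
  ENDIF.

ENDFUNCTION.

      As sketched, this DataSource supports only full loads, which is sufficient for a one-time back-fill of the old records.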
      1. Sharath A
        Dear Pradip,
        First of all, an excellent blog; second, retaining the deltas is a very good point…

        Keep blogging…

        Cheers….

