
So at this point we were 6 months into the project: my team had completed the PoC, Dev and QAS upgrades on all the applications, working with the new hosting provider as they conducted the data centre transition.

During this time we faced many issues performing upgrades and Unicode conversions using Parallel Export/Import on VMware systems (a subject for another day). As a result, I had a nervous client who was not convinced we could pull this off within the downtime window. This upgrade (designated TC1) was to be the proof of the pudding: the last 6 months of work and preparation would be in full visibility, and it had to succeed within the window, or we would need very good reasons why it did not.

As my team got closer to the upgrade, we refined the plan more and more. I had worked with my project manager on a previous project and knew he was an absolute whizz at MS Project, but he outdid himself this time. The project plan we had was based on ASAP, with nested plans and cascading dependencies throughout, which made it very easy to update the top level and see the knock-on effects of slippage. Finally, I conducted walk-throughs with people to make sure the timings were reasonable and that no one was staying up too late.

As we had 4 days of downtime, I had a technical time budget of 60 hours to do all the upgrade work across all the applications, so we worked out a shift system that gave the 6 people on the upgrade team rest periods during the run.

It is important that the responsibility be shared among the team members, but ultimately there has to be a single person responsible for each application. That person has to be the expert in that particular system during the upgrade; if there is an issue, they have to be involved in the troubleshooting. Although they do not get the final decision (that’s the team lead’s responsibility), they are pretty close to it.

The table below outlines the known issues that we had going into the trial cutover and the measures we had developed to try to overcome them.

Apart from the known technical issues, we also had non-technical issues.

In every upgrade it is vitally important to be able to gauge the performance of the upgrade so that you can answer questions like: are we on course, do I need to reschedule people, what do I report to the project, when can I sleep? To help with this, SAP have provided an excellent analysis file, called UPGANA.xml, which you will find in the HTDOC\ directory under DIR_PUT. As you can see from the picture below, it has much of the top-level information.

It also has all the timings for the upgrade phases, which is vital for keeping an eye on your general upgrade performance. You should use the log files for specific timings during the process, but I find this file works well for project managers 🙂
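
If you want those timings without clicking through the file, they can also be pulled out programmatically. Below is a minimal Python sketch; the tag and attribute names are my assumptions about the file’s shape, so inspect your own UPGANA.xml and adjust them to match.

import xml.etree.ElementTree as ET

def phase_timings(path):
    # Yield (phase name, duration) pairs from an UPGANA.xml file.
    # "phase", "name" and "duration" are assumed names -- check your file.
    tree = ET.parse(path)
    for phase in tree.getroot().iter("phase"):
        yield phase.get("name", "?"), phase.findtext("duration", "?")

if __name__ == "__main__":
    for name, duration in phase_timings("UPGANA.XML"):
        print(f"{name:40s} {duration}")

Sorting that output by duration gives you exactly the kind of long-running phases used in the comparison below.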

The table below shows 5 of the longest-running phases in the Uptime and Downtime parts of the PoC and TC1 upgrades, for comparison.

As you can see, the PoC was a much slower upgrade, and the risk-averse among you are probably wondering how I managed to keep my client on side with timings like those of the PoC (I’m not telling). The TC1 times show that the PoC was massively underpowered in terms of CPU: this can be seen in the difference in the activation times, which are CPU-bound, and the similarity of the Import phase, which is I/O-bound.

In terms of overall runtime, the table below summarises the main phases as I have them recorded.

As you can see from the table above, we blew our time budget massively, which was concerning for everyone, and we had a lot of explaining to do. However, we had captured a lot of good data on why we had the issues, what we did to resolve them, and how we could mitigate them for TC2.

This is shown (as usual) in the table below.


As I said above, we blew the transaction log several times during this Unicode conversion, and it is important that I explain why. When running a Unicode conversion, the exports are usually fine, as they are read operations. Imports are write operations (DML) and will be captured in the transaction/online logs unless they are flagged as part of a bulk load (Oracle, DB2, SQL Server); there are SAP Notes that enable you to do this. Similarly, if you are deleting the records of an entire table, you use the TRUNCATE TABLE command, which is non-logged as well. So far so good:

TRUNCATE TABLE BKPF

The issue with transaction/online logs raises its head when you are deleting records because a Unicode import process for a table you have split has failed and you have restarted it. The restart issues this type of command:

DELETE FROM BKPF
WHERE (
  "GJAHR" <= '1009' AND "BELNR" < '006919999' AND "BUKRS" = 'XX01' AND "MANDT" = '100'
)

This is a logged operation, and depending on the size of the record set it could blow your transaction log. It is unlikely to do this on its own; it is more likely to happen as a group of repeated packages performing deletes.
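
To make the log pressure concrete, below is a minimal Python sketch of the same delete issued in committed batches rather than as one monolithic statement. It is purely illustrative: the generic DB-API connection and the ROWNUM-style row limit are assumptions (TOP or FETCH FIRST are the equivalents elsewhere), and R3Load does not actually work this way. The point is that each commit lets the database reuse log space, whereas one big transaction, or 50 concurrent ones, pins the log until it completes.

def delete_in_batches(conn, batch_size=10000):
    # Re-delete a failed package's rows in committed chunks so the
    # transaction/online log can be reused between commits. A single
    # monolithic DELETE holds one open transaction, and the log cannot
    # be truncated past an open transaction.
    sql = ("DELETE FROM BKPF "
           "WHERE \"GJAHR\" <= '1009' AND \"BELNR\" < '006919999' "
           "AND \"BUKRS\" = 'XX01' AND \"MANDT\" = '100' "
           f"AND ROWNUM <= {batch_size}")  # Oracle-style limit, assumed
    cur = conn.cursor()
    while True:
        cur.execute(sql)
        conn.commit()                      # frees log space for reuse
        if cur.rowcount < batch_size:      # last, partial batch done
            break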

The final issue we had was not being smart about how we restarted the failed Unicode processes. Because we did not manually restart failed processes until the end of each server’s Unicode run, when we did restart them we had over 50 Unicode processes all trying to delete from the database at the same time. As shown above, this is an easy way to blow your transaction/online log; it caused us a great deal of pain and contributed to the database crash, but we learnt a great deal from it and applied those lessons to TC2.
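
In hindsight, the fix is simply to throttle the restarts rather than releasing them all at once. Here is a minimal sketch, assuming a hypothetical restart_package() wrapper around whatever re-invokes R3Load for a failed package in your landscape:

import threading

gate = threading.Semaphore(4)  # at most 4 deletes in flight; tune to your log

def restart_package(package):
    # Site-specific: re-invoke R3Load (or your restart script) here.
    print(f"restarting {package}")

def throttled_restart(package):
    with gate:  # blocks until one of the 4 slots is free
        restart_package(package)

failed = ["BKPF-1", "BKPF-2", "CDCLS-1"]  # illustrative package names
threads = [threading.Thread(target=throttled_restart, args=(p,)) for p in failed]
for t in threads:
    t.start()
for t in threads:
    t.join()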

So we finally made it through TC1, collected a wealth of important data (which we’ll analyse in the next post) and ultimately re-affirmed that we were on the right path. Next up was Trial Cutover 2; all we had to do was make it through Christmas alive!


8 Comments


  1. Bala Prabahar
    Chris,

    What I follow is “Slow and steady wins the race”. I normally start both the export and import processes with a low degree of parallelism, monitor CPU utilization, and gradually increase the parallelism based on it.
    Not sure I understand the “Deletion” step while performing the import for a Unicode conversion. Wouldn’t the import run on a new system?

    Regards,
    Bala

    1. Bala Prabahar
      Import will generate a large volume of transaction logs (in SQL Server) not because of deletes but due to “INSERTS”.
      The quantity of transaction logs can be minimized in SQL Server by following note correction 1241751.
      Thanks.
      1. Chris Kernaghan Post author
        Bala,

        The experience of this particular Unicode conversion was that the deletes blew the transaction logs, but in order to keep the post reasonably database-agnostic I simply referred to the SAP Notes as requirements for database performance tuning. This covers SAP Notes such as 1241751 (SQL Server), 936441 (Oracle), and 454173 & 822251 (DB2).

        We did have the trace flags from Note 1241751 turned on, which is why it was the deletes that blew the transaction logs.

        Chris

    2. Chris Kernaghan Post author
      Bala,

      I prefer “It’s a marathon, not a sprint” 🙂

      As regards the parallelisation, either approach works. I prefer to know the limits of the hardware and give myself some headroom which I can tap into if I need to.

      Understanding the consequences of failed packages is important because of the way the R3Load process runs the deletion of previously imported records. In an ideal world no packages would fail, but for reasons best known to R3Load, some do. So I felt it was important to talk about this rather than gloss over it.

      Thanks for reading

      Chris

  2. Tom Cenens
    Hello Chris

    Nice blog series; the topic is very interesting for technical #sapadmin community members.

    I still remember the first day I started as a consultant: I attended a unit knowledge session where they were talking about CUUC, and all those technical terms really didn’t make any sense to me, as I had just finished school and didn’t know much about SAP.

    I was wondering if you use any particular information sources that are noteworthy, for example the SAP Press upgrade book, the SAP migration/upgrade course, or other sources?

    I know the author of the SAP Press upgrade book is known for handling upgrades within a limited time frame, optimizing the whole procedure as much as possible, so I would think the book is interesting, but I haven’t read it yet (ordering right now hehe).

    Kind regards

    Tom

    1. Chris Kernaghan Post author
      Tom,

      To be honest, I started doing SAP upgrades by doing: I read the manual, then the SAP Notes, for 5 days, and then started with a proof-of-concept upgrade. From there I took a screenshot of every screen and documented every switch or fix I implemented. This was the beginning of my run book, which was then refined in Dev, though I still kept the SAP manual and Notes handy. Once we got to QAS, I only used my own manual for the upgrade.
      I have a rule that anyone who brings their SAP Notes and manual with them to the PRD upgrade will get a stern talking-to. You should be running from your own manual; the SAP Notes and manual are fine if things go wrong, but when running the process normally, your manual should be your bible.

      If you are wondering where to go for specific information on how to run an SAP upgrade project, look at the ASAP roadmaps in Solution Manager; these are comprehensive libraries of documents and processes which a good upgrade project should reference.
      For information on the actual upgrade, its phases, etc., the SAP Press books are excellent resources, written by the people developing the upgrade tools, so there is no better source of detailed information. Are they the best source of information on how to run an upgrade? I am not so sure; every upgrade is different to some degree, but these books will provide solid background material.

      Thanks for reading and glad you are finding it interesting.

      Chris

      1. Tom Cenens
        Hello Chris

        I have to admit I haven’t read the book or done the course, but I don’t find it to be a prerequisite either. I’m not stating one should; I was curious to know whether you had or not.

        I did a CUUC last summer at one of our customers and it worked out fine. The database wasn’t as large as the one you mention in your blog, so I didn’t have the same kind of timing issues.

        I always go for very detailed documentation in such cases, and indeed, by the time you do PRD you should have proper documentation so that you don’t need any notes or manuals.

        Regarding the SAP Press book authors, one of them specializes in doing SAP upgrades; in fact it is something like 90% of his business, so he should be a very good source of information, depending on whether he gives away all his tips and tricks, of course.

        Kind regards

        Tom

