Analyzing SAP background job data using SAP Lumira
Background jobs can impact system performance, so it is useful for an administrator to get an overview of which jobs are running on a given managed SAP system.
A lesser-known tool in SAP is BACKGROUND_JOB_ANALYSIS. It is available through transaction ST13 and can deliver interesting results when analyzing your background jobs.
The most interesting feature of the tool is the Gantt chart it produces when you look for jobs that run for over five minutes. If we look at a 24-hour time slice, for example, we can see how the long running jobs are distributed.
Hovering over one of those blocks with the mouse cursor reveals the background job name, and a table on the same result page also displays statistical data on these jobs. The > blocks represent jobs that run longer than five minutes. The * at the bottom next to SJ are short running jobs that run at the same moment in time.
We can derive from the chart above that from 00:00:01 to around 05:00:00 the system has some long(er) running background jobs. This graph isn't really shocking, but I can imagine you can find SAP systems where the graph looks a little more packed. It could be interesting to look at how those long running jobs can be distributed better over time. Around 04:03, for example, you can see a couple of long running jobs running in parallel. They could potentially bottleneck each other, so it could be interesting to schedule one of them at another point in time.
In the job options (two screenshots back) you can select All (only listview). This gives you an overview like the one in the screenshot above. This data can then be exported to Excel. What I did was take the data of five working days (05.08.2013 – 09.08.2013) and combine it into one Excel sheet.
Then I fired up SAP Lumira, clicked the shiny "new document" button and browsed for my Excel sheet. In the screenshot above you can see an extract of the data after it was acquired by SAP Lumira. Note that I did adjust the Excel sheet to some extent: I threw out a number of columns I was not planning to use, and I added a column "StartHour" that derives the hour in "hh" format from the StartTime column, because I wanted to see the spread of the number of jobs on a 24-hour axis.
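As a side note, the StartHour derivation doesn't have to be done by hand in Excel; it can be scripted as well. Here is a minimal sketch in Python, assuming the export has a StartTime column in hh:mm:ss format (the column names and the sample row are illustrative, not taken from the real export):

```python
from datetime import datetime

# Derive a "StartHour" ("hh") column from the exported StartTime column.
# Column names (JobName, StartTime) are assumptions based on the ST13 export.
def add_start_hour(rows, time_format="%H:%M:%S"):
    out = []
    for row in rows:
        hour = datetime.strptime(row["StartTime"], time_format).strftime("%H")
        out.append({**row, "StartHour": hour})
    return out

rows = [{"JobName": "SAP_COLLECTOR_FOR_PERFMONITOR", "StartTime": "04:03:12"}]
print(add_start_hour(rows)[0]["StartHour"])  # prints "04"
```

Keeping StartHour as a zero-padded string ("04" rather than 4) makes it sort correctly on a 24-hour axis.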
How to create a bar chart of the top five most frequently running jobs
After pulling the data into SAP Lumira, I navigated to the Visualize view and ended up with 10 attributes that I can use, as you can see in the screenshot above.
So now I want to create a bar chart of the five jobs that run most often. To do that, I first have to count the number of occurrences of each job in my dataset.
Most tools feature a right-click menu, so that's exactly what I did: right-click the JobName attribute. Looking at the available options (not all options are shown on the screenshot), the option that made the most sense to me was "Create a measure", so I clicked it.
Once I did that, I noticed a new measure under the header “Measures”.
The button next to the newly created measure caught my attention, so I went to check out what options are underneath. By default it was set to Count(Distinct), and my gut feeling said I should switch this to Count(All). I was going to verify the data against the Excel sheet anyhow, to be sure I was picking the right option.
So now, in the right pane, I chose the "bar chart" type of graph, and it was time to define the X axis and the Y axis.
On the X Axis, I want to see the jobs and on the Y Axis I want to see how often they run. I dragged the measure that I had just created onto the Y Axis 1 field and the JobName onto the X Axis field and voila, I had a graph. Now you can see that my graph is rather “busy”. I have way too many small background jobs in which I’m not really interested so I have to narrow down the result. I would like to take a look at the top 5.
Again, by just looking around I saw a small downwards pointing arrow in the measure that I dragged onto the Y Axis 1 field so I clicked it to see what options are available. Rank Values sounded like the thing I was looking for so I clicked it to check it out.
By default Top was filled in along with the value 3. I simply changed 3 into 5 and clicked OK.
Et voilà, here is my graph of the five most frequently running jobs (over a period of five working days).
Using the same techniques as above, I created a few other graphs as you can find below:
Here you have an example of a combination of different measures (duration and frequency): a map graph that shows how often a job runs (the size of the map piece) and how long it runs (a darker colour means the job runs longer).
This combination is interesting because the jobs that have a large, dark-coloured map piece are the ones you would definitely want to look into from a performance perspective; they are potential candidates for optimization.
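The same frequency/duration combination can be computed directly from the export. A sketch, assuming each exported row carries a job name and a duration in seconds (both the field layout and the sample values are assumptions for illustration):

```python
from collections import defaultdict

# Aggregate run count and total duration per job. Jobs that score high on
# both are the large, dark map pieces: the optimization candidates.
def aggregate(rows):
    stats = defaultdict(lambda: {"runs": 0, "total_duration": 0})
    for job, duration in rows:
        stats[job]["runs"] += 1
        stats[job]["total_duration"] += duration
    return dict(stats)

rows = [("JOB_A", 600), ("JOB_A", 720), ("JOB_B", 30)]
print(aggregate(rows)["JOB_A"])  # prints {'runs': 2, 'total_duration': 1320}
```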
There are lots of interesting graphs you can build using this data. For example, the map graph above adds in the user-id that scheduled the job in SAP. Note that I changed the real user-id to ENDUSER1 😉 for AI_SDK_SP_AUTO_IN_PROCESS.
This can help you identify who scheduled a long running or frequently running (or both) job. It can also show you duplicates. You can see that SMBI_INVALIDATE_OLAP_CACHE now has three blocks, so it runs under three user-ids. This is worth looking at to see whether this is normal, or whether the job is running multiple times, doing the same thing, unnecessarily.
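Spotting jobs scheduled under multiple user-ids, as with SMBI_INVALIDATE_OLAP_CACHE above, amounts to grouping by job name and collecting the distinct scheduling users. A sketch (the user-ids and row layout are made up for illustration):

```python
from collections import defaultdict

# Collect the distinct scheduling user-ids per job name and report only
# jobs scheduled by more than one user: potential duplicate schedules.
def users_per_job(rows):
    users = defaultdict(set)
    for job, user in rows:
        users[job].add(user)
    return {job: sorted(u) for job, u in users.items() if len(u) > 1}

rows = [
    ("SMBI_INVALIDATE_OLAP_CACHE", "ENDUSER1"),
    ("SMBI_INVALIDATE_OLAP_CACHE", "ENDUSER2"),
    ("RDDIMPDP", "DDIC"),
    ("SMBI_INVALIDATE_OLAP_CACHE", "ENDUSER3"),
]
print(users_per_job(rows))
```

Jobs with a single scheduler drop out of the result, so anything that remains deserves a closer look.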
I can come up with more examples but I’m sure you would like to try out SAP Lumira yourself. The above examples are easy to reproduce so go and have a look at your background jobs 🙂 .
Nice one, Tom! Never thought of using Lumira on system stuff myself, but it might actually persuade IT/SAP managers to go for Lumira.
Hi Raf
Thanks. I wanted to try out something related to my role so I ended up using this dataset as an example. I definitely think SAP Lumira is pretty easy to use and good for quick analysis of datasets like this.
Best regards
Tom
Hi Tom,
Nice to see such a visualization tool being used in the sysadmin domain. It's not the first place I'd look, but hey, anything can be visualized, right? 😎
It's a good example for sure!
Cheers, Fred
Hi Fred
Thanks.
These kinds of visualizations (on all sorts of statistical data) can certainly help us gain insight. I can see a place for SAP Lumira in my list of programs 😉 .
SMBI_INVALIDATE_OLAP_CACHE was running double by the way with the same variant 🙂 so I improved the managed SAP system of the example by removing one of the released jobs.
Best regards
Tom
That's really interesting Tom, thanks for sharing,
I found Lumira to be a perfect solution for analyzing data from SAP systems quickly and easily. I'm currently trying it out with Solution Manager, and I think combining Solman and Lumira could be a great way to analyze that huge amount of Solution Manager data, better than current tools like BW reporting.
Regards,
Luis
Hi Luis
Thanks.
I agree with you that SAP Lumira can be interesting to use against SAP Solution Manager because it can provide a quick / easy way to visualize data.
For certain visualizations you will need to adjust your dataset before importing it into SAP Lumira but that's not a big problem in my opinion.
I'm looking forward to seeing more examples of SAP Lumira being used against #sapadmin or #solman data.
Best regards
Tom
Tom,
great work and actually I think you might have tapped on something very interesting here. With this tool you can easily add quite some value to the Admin / SAP Basis team. Could it make sense to further integrate System information which might be coming from the Hypervisor etc to get a complete picture?
I believe proving the admin a tool like this one could be a great way to get their buy in!
Well done!
Hi Carsten
Thanks.
It could definitely be interesting to add in CPU / Mem usage (and all kinds of other data) for the above scenario. Of course the system I used isn't really in trouble so it would be more interesting to analyze a system that is running worse as the analysis would deliver greater value in that case in terms of insights & possible actions that flow from them.
In the end it comes down to asking the right set of questions and visualizing those to make it easier to interpret the answers on those questions.
If I have some spare time I'll have a look at other #sapadmin #solman related data. It could even be interesting to pull in data from Solution Manager Technical Monitoring, for example, to cross-pollinate data that is normally not crossed.
Best regards
Tom
Hi Tom,
it is very nice to know that we can use this tool for basic stuff beyond BI and HANA.
Thanks for the sharing.
Regards;
Jansi
Hi Jansi
Thanks for the comment. Yes, it can definitely have its place. I think it can also go beyond basic, but I kept it basic in this example. I've seen some complex examples floating around already, so you should definitely have a look at some of those to see what the tool is capable of.
Best regards
Tom
Hi Tom,
Very good effort. I'm very new to Lumira. Can you please explain how we can share the Lumira dataset to the cloud? When I click the share button, it asks for things like URL, application, source, etc. Please help me fix my problem.
Regards,
John.
Hi John
You can only publish to one of those sources if you have an account (depending on which share option you use) on the system or application you want to share it to.
Best regards
Tom
check http://scn.sap.com/docs/DOC-41354
It's a nice tool, thanks for sharing Tom. I'll surely use it.
Best Regards,
Rama
Hi Rama
Thanks for commenting!
Best regards
Tom
Hi Tom,
Good Day!
Nice analysis! I appreciate.
Keep sharing new things. Have a nice day!
Regards,
Hari Suseelan
Hi Hari Suseelan
Thanks for your comment, have a nice day.
Best regards
Tom
Hi Tom,
You are welcome. Thanks.
Regards,
Hari Suseelan
Great job Tom! Thanks for sharing. Interesting use of SAP Lumira.
Cheers,
Raquel
Hi Raquel
Thanks 😉 .
Best regards
Tom
Hi Tom,
It reminded me of another free tool, sapometer, for analyzing background tasks, with similar visualization but more convenient for getting results, I think.