We had an interesting project challenge recently. We needed to stress test our BW system prior to the first go-live. We also wanted a way to easily run a large number of queries every day, in order to get some realistic BW statistics data and establish which were the slowest queries.
We tried a number of the usual suspects first. Initially we thought we might be able to use RSRT to schedule query refreshes, but couldn't find a way to do this without quite a lot of programming. Then we tried BEx Analyzer workbooks containing multiple queries, but these refreshed serially rather than in parallel and therefore didn't stress test our system properly. We considered using VBA to open multiple connections from BEx, but decided the effort would be significant. Finally we tried putting a large number of queries into a web template and accessing that. This was promising, but it looked like the queries were being run using just one dialog process, and the method also needed a lot of manual effort to set up and would have been awkward to maintain.
Then I remembered some SAP tools which I'd used before for testing web template performance (iemon and httpmon), but again these could only handle one thread at a time. That got me thinking about a download manager I use a lot called Reget (www.reget.net). It's a great piece of low-cost shareware, normally used for automating downloads from the internet and making them faster by creating multiple threads. The tool has some great features: it lets you specify a username and password for a site, automatically rename and organise downloads, and it offers SSL support and more.
It took me just twenty minutes to get a prototype working. First I obtained a list of queries for the relevant InfoCubes and ODS objects from table RSRREPDIR in the BW system. I pasted these into a text editor (Textpad, from www.textpad.com), where it was simple to generate a long series of URLs using regular expressions and search and replace. The final URLs needed to be in the format:
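The search-and-replace step can also be sketched as a few lines of Python. This is a minimal sketch, assuming the classic BW 3.x Web API pattern (CMD=LDOC with a query technical name); the hostname, port and query names below are placeholders, not values from our system:

```python
# Sketch: turn a list of query technical names (as exported from RSRREPDIR)
# into one BEx Web API URL per line. The host/port and the CMD=LDOC pattern
# are assumptions based on the BW 3.x Web API; substitute your own values.

BASE = "http://bwhost.example.com:8080/SAP/BW/BEx"  # hypothetical host

def build_url(query_name: str) -> str:
    """Build a runnable query URL for one technical query name."""
    return f"{BASE}?SAP-LANGUAGE=EN&CMD=LDOC&QUERY={query_name}"

def build_url_list(query_names):
    """One URL per query, ready to save as a text file for importing."""
    return [build_url(q) for q in query_names]

if __name__ == "__main__":
    # e.g. technical names pasted from RSRREPDIR
    for url in build_url_list(["ZSALES_Q001", "ZSALES_Q002"]):
        print(url)
```

The resulting text file is exactly what gets imported in the next step.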
These were imported into Reget using the Import URL list command:
I immediately paused this list of downloads and changed some settings to make things work better.
Firstly, in the site manager tab I set the username and password that I wanted the tool to use.
I could have limited the number of connections here on a site by site basis, but decided to use a global parameter instead.
Then I set the option to automatically rename downloads that have the same name.
Then I set the global properties for number of active downloads to 16 to match the number of dialog processes on the system.
Then I just hit the green button to start downloading, and hey presto: a maxed-out, very busy system (my username was CM8888, in case you hadn't guessed).
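What Reget is doing at this point can be approximated in a short script: a pool of worker threads, each pulling one query URL with HTTP basic authentication. This is a minimal standard-library sketch, not Reget itself; the credentials are placeholders, and the worker count of 16 mirrors the number of dialog processes:

```python
import base64
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

def fetch(url, user, password, timeout=300):
    """Fetch one query URL with HTTP basic auth; return (url, status or error)."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = Request(url, headers={"Authorization": f"Basic {token}"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except Exception as exc:  # one failed query shouldn't stop the run
        return url, exc

def run_parallel(tasks, workers=16):
    """Run callables concurrently, like Reget's 16 active downloads;
    results come back in submission order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(t) for t in tasks]
        return [f.result() for f in futures]
```

Each element of `tasks` would be something like `lambda u=url: fetch(u, "CM8888", "secret")`, one per generated URL.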
There are some obvious shortcomings which I'm working on. First, any queries with user-entry variables won't work; I'm hoping that by using the Web API it will be possible to hard-code values for these into the URLs. Second, only the top-level query is run – it's not possible to simulate any form of drill-down except via the URL, so again one for the Web API. There is also an issue with NetWeaver 2004s, which uses a different logon mechanism; I'm hoping either to pass a username and password in the URL or to use single sign-on. If anyone has any ideas, please let me know.
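For the variable problem, the BW 3.x Web API does document URL parameters for supplying variable values (the `var_name_n` / `var_value_ext_n` pairs). A sketch of extending the URL generator, where the variable name and value shown are placeholders and the parameter names should be verified against your release:

```python
from urllib.parse import urlencode

def build_url_with_vars(base, query_name, variables):
    """Append BW Web API variable parameters (var_name_n / var_value_ext_n)
    so a query with entry variables can run without its selection screen.
    Parameter names follow the BW 3.x Web API; verify against your release.

    variables: list of (variable_technical_name, external_value) pairs.
    """
    params = {"SAP-LANGUAGE": "EN", "CMD": "LDOC", "QUERY": query_name}
    for n, (name, value) in enumerate(variables, start=1):
        params[f"var_name_{n}"] = name
        params[f"var_value_ext_{n}"] = value
    return f"{base}?{urlencode(params)}"
```

For example, `build_url_with_vars(base, "ZSALES_Q001", [("0P_FYEAR", "2006")])` would produce a URL that fills the fiscal-year variable (both names here are illustrative).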
So, for minimal cost, we now have a repeatable way of running the same queries day after day, and of running them in parallel to stress test our systems. It's also really easy to run the same queries on the development, test and production systems just by changing the hostname in the URL. The Reget history tab also gives a pretty good log of when the series of runs started and finished.
I know there are some top-end tools out there from companies like Mercury, but this meets about 90% of our requirements for now.