
To get past our BW 7.x upgrade and move on to more productive work, we need to get a portal working, which means a Java stack.  While we have an SAP Java "stack" running with SCM, BW is quite different.  After a few load tests, I realized I need to collect more data than in previous upgrade and conversion projects.

"More data" doesn't just mean more runs of the same tests I've run in the past, it means different tests.  A few years back, when comparing hardware and software platforms, I researched what tests for Java environments exist.  Since I don't write it myself, I depended on published benchmarks, available test suites, and other historic information.  An ideal test would be easily and quickly run, have repeatable results, run on a wide range of platforms, and correlate to common applications.  No sense running a graphics intensive suite if that isn't what our business will be using Java for.  It would also be great to view online lists of test results, especially where years of data exist.

Two of the tests I've continued to run are the Linpack test and the SciMark suite.  You may have other favorites -- I'd be interested in hearing about them.  I run other tests besides Linpack and SciMark, such as calculations of pi and prime numbers, but I'll focus on just these two.
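
For the prime-number runs, a sieve of Eratosthenes is the usual basis for that kind of timing test.  A minimal sketch, purely my own illustration:

  public class PrimeCount {
      public static void main(String[] args) {
          int limit = 10000000;                 // count primes below ten million
          boolean[] composite = new boolean[limit];
          long start = System.currentTimeMillis();
          int count = 0;
          for (int i = 2; i < limit; i++) {
              if (!composite[i]) {
                  count++;
                  // mark multiples of this prime, starting at i*i
                  for (long j = (long) i * i; j < limit; j += i) {
                      composite[(int) j] = true;
                  }
              }
          }
          long elapsed = System.currentTimeMillis() - start;
          System.out.println(count + " primes below " + limit + " in " + elapsed + " ms");
      }
  }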

The Linpack test

It is available online:
http://www.netlib.org/benchmark/linpackjava/
http://www.netlib.org/benchmark/linpackjava/timings_list.html

This test measures floating point performance, or more likely, how well the delivered math libraries behave.  The main difference seen in my recent tests is the better result from the "J9" Java runtime.
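
To show where a Linpack-style MFLOPS figure comes from, here is a minimal sketch (my own illustration, not the Netlib code) that times a daxpy-style loop, the kind of inner loop Linpack exercises:

  public class DaxpyTiming {
      public static void main(String[] args) {
          int n = 1000000;       // vector length
          int reps = 100;        // repeat enough to get a measurable interval
          double[] x = new double[n];
          double[] y = new double[n];
          double a = 3.14159;
          for (int i = 0; i < n; i++) { x[i] = i * 0.5; y[i] = i * 0.25; }

          long start = System.currentTimeMillis();
          for (int r = 0; r < reps; r++) {
              for (int i = 0; i < n; i++) {
                  y[i] = y[i] + a * x[i];   // daxpy: one multiply plus one add
              }
          }
          long elapsedMs = System.currentTimeMillis() - start;

          // two floating point operations per element per repetition
          double mflops = (2.0 * n * reps) / (elapsedMs / 1000.0) / 1.0e6;
          System.out.println("checksum " + y[n - 1]);   // keep the JIT honest
          System.out.println("elapsed ms: " + elapsedMs + "  ~MFLOPS: " + mflops);
      }
  }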

The SciMark test


http://math.nist.gov/scimark2/

This test suite measures several math operations, including Fast Fourier Transforms.  They might not correlate well with what SAP Java code does, but I'm using this more to compare the quality of the Java implementation than to compare various hardware platforms.  I need to ensure that new architectures don't quietly degrade performance, and this is one way to find the outliers.

The test generates a "composite" score, which I've deleted here to focus on the individual tests.

The FFT runs of the Java libraries based on 1.4.2 are nearly identical; there is a 10% gain with the 1.5.0 version I tried.  As SAP doesn't support that release, the result isn't too useful on its own.  On the other hand, finding improved performance on an unsupported version generally lets me open an issue with a demonstrable test case, since the supported version clearly has room for improvement.
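
When comparing numbers across 1.4.2, 1.5.0 and the J9 runtime, it helps to record exactly which VM produced each score.  A small sketch using standard system properties:

  public class WhichJvm {
      public static void main(String[] args) {
          // standard properties every Java runtime defines; log these next to benchmark scores
          System.out.println("java.version    = " + System.getProperty("java.version"));
          System.out.println("java.vm.name    = " + System.getProperty("java.vm.name"));
          System.out.println("java.vm.vendor  = " + System.getProperty("java.vm.vendor"));
          System.out.println("java.vm.version = " + System.getProperty("java.vm.version"));
          System.out.println("os.arch         = " + System.getProperty("os.arch"));
      }
  }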

The SOR ("Jacobi Successive Over-relaxation") test is also flat across all Java versions I tested.
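
For reference, the SOR kernel times a stencil update along these lines; this is a simplified sketch, not the SciMark source:

  public class SorSketch {
      // one relaxation sweep over an n x n grid with relaxation factor omega
      static void sweep(double[][] g, double omega) {
          int n = g.length;
          double c = 1.0 - omega;
          for (int i = 1; i < n - 1; i++) {
              for (int j = 1; j < n - 1; j++) {
                  g[i][j] = c * g[i][j]
                          + omega * 0.25 * (g[i-1][j] + g[i+1][j] + g[i][j-1] + g[i][j+1]);
              }
          }
      }

      public static void main(String[] args) {
          int n = 100;
          double[][] grid = new double[n][n];
          for (int j = 0; j < n; j++) grid[0][j] = 1.0;   // a fixed boundary value
          for (int k = 0; k < 1000; k++) sweep(grid, 1.25);
          System.out.println("center value after 1000 sweeps: " + grid[n / 2][n / 2]);
      }
  }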

The Monte Carlo test shows the biggest improvement in an unsupported 1.5.0 version.
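
The Monte Carlo kernel estimates pi by throwing random darts at a quarter circle.  A simplified sketch of the idea (the real kernel differs in detail):

  import java.util.Random;

  public class MonteCarloPi {
      public static void main(String[] args) {
          int samples = 10000000;           // ten million darts
          Random rng = new Random(42);      // fixed seed for repeatable runs
          int inside = 0;
          for (int i = 0; i < samples; i++) {
              double x = rng.nextDouble();
              double y = rng.nextDouble();
              if (x * x + y * y <= 1.0) inside++;   // landed inside the quarter circle
          }
          // the ratio of darts inside approximates pi/4
          System.out.println("pi estimate: " + 4.0 * inside / samples);
      }
  }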

The Sparse Multiplication test degraded under both the supported J9 option and 1.5.0.  Prior versions had better results in both 32-bit and 64-bit flavors.  I can't explain why this test degraded the way it did.
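
The sparse kernel multiplies a compressed-row matrix by a vector.  A simplified sketch of that storage scheme and loop, illustrative only:

  public class SparseMatVec {
      // y = A * x with A in compressed sparse row (CSR) form:
      // val[] holds the nonzeros, col[] their column indices, and
      // rowStart[i]..rowStart[i+1]-1 is the slice of val/col for row i
      static void multiply(double[] val, int[] col, int[] rowStart, double[] x, double[] y) {
          int rows = rowStart.length - 1;
          for (int i = 0; i < rows; i++) {
              double sum = 0.0;
              for (int k = rowStart[i]; k < rowStart[i + 1]; k++) {
                  sum += val[k] * x[col[k]];
              }
              y[i] = sum;
          }
      }

      public static void main(String[] args) {
          // tiny 3x3 example: [[2,0,1],[0,3,0],[4,0,5]]
          double[] val = {2, 1, 3, 4, 5};
          int[] col = {0, 2, 1, 0, 2};
          int[] rowStart = {0, 2, 3, 5};
          double[] x = {1, 1, 1};
          double[] y = new double[3];
          multiply(val, col, rowStart, x, y);
          System.out.println(y[0] + " " + y[1] + " " + y[2]);  // expect 3.0 3.0 9.0
      }
  }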

The last test, LU (matrix factorization), was the only one where the recent versions, especially the SAP-supported J9 option, showed a clear improvement.
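
For context, LU factorization with partial pivoting looks roughly like this.  A compact textbook sketch, not the SciMark code:

  public class LuSketch {
      // in-place LU factorization with partial pivoting; returns the row permutation.
      // afterwards the strict lower triangle holds L (unit diagonal implied)
      // and the upper triangle holds U
      static int[] factor(double[][] a) {
          int n = a.length;
          int[] perm = new int[n];
          for (int i = 0; i < n; i++) perm[i] = i;
          for (int k = 0; k < n; k++) {
              int p = k;                                 // find the pivot row
              for (int i = k + 1; i < n; i++) {
                  if (Math.abs(a[i][k]) > Math.abs(a[p][k])) p = i;
              }
              if (p != k) {                              // swap rows k and p
                  double[] tmp = a[k]; a[k] = a[p]; a[p] = tmp;
                  int t = perm[k]; perm[k] = perm[p]; perm[p] = t;
              }
              for (int i = k + 1; i < n; i++) {
                  a[i][k] /= a[k][k];                    // multiplier, stored in L
                  for (int j = k + 1; j < n; j++) {
                      a[i][j] -= a[i][k] * a[k][j];      // update the trailing block
                  }
              }
          }
          return perm;
      }

      public static void main(String[] args) {
          double[][] a = { {4, 3}, {6, 3} };
          factor(a);
          // expect a row swap, then L21 = 4/6 and U = [[6,3],[0,1]]
          System.out.println(a[0][0] + " " + a[0][1] + " / " + a[1][0] + " " + a[1][1]);
      }
  }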

Notes

723909 - Java VM settings for J2EE 6.40/7.0
1044330 - Java parameterization for BI systems

Wiki pages

Common Java issues

Parameter settings (new draft)

Next steps:

I plan to run more tests as more hardware and Java releases become available. I will also run the SciMark large memory models to gather more data.

Other test suites would be great for even more data!