Performance Tips and Tricks when beginning the IdM Journey
Having worked on IdM topics for some time, I have gathered some basic activities that can help the performance of the system if you are new to IdM.
The first good source of information is the ‘SAP Netweaver Identity Management Solution Operation Guide’ found at this link
Section 4.6 of this guide covers Analysing Statement Execution, which helps if you need to identify any long-running SQL statements in the system. Via the IdM administration UI, such statements can be traced based on a minimum runtime threshold, as detailed below.
For more detailed analysis of SQL statements see Per Krabsetsves’ excellent blog at this link
Section ‘5.6.7 Rebuilding database indexes’ advises
With heavy usage of the system, the database indexes will become fragmented, which may reduce performance. For further information regarding fragmented indexes and rebuilding the indexes, please refer to the documentation for your database system.
Most often, if your system is suffering from this issue you will see system-wide performance problems. The UI, jobs and tasks will all perform more slowly than normal, or, in the most severe cases, the system could come to a standstill. Keeping the indexes refreshed is essential, as much of the processing in the IdM application occurs at database level.
Performance Issues in the UI
If you have a reference attribute assigned to a UI task and the task takes a long time to open, there may be an issue in loading all the reference attribute values. In the MMC there is the option ‘List Entries on Load’, which can speed up the loading of the UI task until the root cause of the performance issue can be determined. When this checkbox is unchecked, the attribute values are not loaded automatically in the UI; instead, the user must search for them after the UI opens.
Secondly, complex access controls on UI tasks are a common cause of performance degradation in the UI. Check the SQL statement used in the access control and see if it can be refined in any way to make it faster.
Using NOLOCK on MSSQL Queries
If you need to read a large dataset from an MSSQL database, then the (nolock) hint should be used in the SQL statement, e.g.
select * from idmv_link_ext with (nolock) where ……….
In addition, remember that storing such data in the IdM database involves making numerous updates; therefore, if the read from the source database takes X time, it does not follow that the update to IdM will take the same time. The same holds for other databases.
Long-Running Jobs: ‘Cookie Does Not Match’
If a job runs for a long time it may abort with the error message “Cookie does not match” which means that the Identity Center
does not have this job in its list of active jobs. When the runtime starts running an action task, it will “check out” the job from the Identity Center.
While the job is running, the runtime will periodically signal the Identity Center that it’s still active and running. At this point the Identity Center can
return a status code to force the job to stop running. In this case the runtime will do a controlled exit. Such an issue may arise, for example, when you are running an initial load from an ABAP system that has many ABAP roles and profiles. It can take some time to read all of these into IdM, and this action can time out.
There are several timeouts for running jobs, including the “Start timeout” and the “Idle timeout”. The values are configured in the MMC, on the
Identity Center node, in the “Options” panel:
The “Start timeout” is the maximum time allowed from when the job starts running until it has processed the first entry.
The “Idle timeout” is the maximum time allowed between each time the runtime signals the Identity Center that it’s still active and running.
To try to resolve this, the settings below should be considered:
1. The Idle timeout has to be large enough to handle the maximum system load, when there can be a large number of queued jobs.
2. The Start timeout has to be larger than the time it takes to initialize the connector plus the time it takes to process the first entry.
3. The Execution timeout has to be larger than the maximum time spent processing one entry.
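The interplay between the runtime's heartbeat and these timeouts can be sketched with a small toy model. This is purely illustrative (the class, method names and values are my own assumptions, not the actual Identity Center implementation):

```java
// Toy model of the runtime heartbeat vs. timeout checks (assumption:
// simplified logic for illustration, not the actual Identity Center code).
public class TimeoutDemo {

    // The job is considered dead if no heartbeat arrived within the idle timeout.
    static boolean idleTimedOut(long lastHeartbeatMs, long nowMs, long idleTimeoutMs) {
        return nowMs - lastHeartbeatMs > idleTimeoutMs;
    }

    // The job is considered stuck if it never processed its first entry
    // within the start timeout.
    static boolean startTimedOut(long jobStartMs, long firstEntryMs, long startTimeoutMs) {
        return firstEntryMs - jobStartMs > startTimeoutMs;
    }

    public static void main(String[] args) {
        long idleTimeout = 10 * 60 * 1000;  // e.g. 10 minutes

        // Heartbeat arrived 5 minutes ago: job still considered alive.
        System.out.println(idleTimedOut(0, 5 * 60 * 1000, idleTimeout));   // false

        // No heartbeat for 15 minutes: the Identity Center would return a
        // status code forcing the runtime to do a controlled exit.
        System.out.println(idleTimedOut(0, 15 * 60 * 1000, idleTimeout));  // true
    }
}
```

The practical takeaway is the same as the list above: the timeouts must be sized for the slowest realistic case (a large initial ABAP load, a heavily queued system), otherwise a healthy but slow job is indistinguishable from a dead one.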
These are just some small steps that can fix issues that cause severe impact on the running of the IdM application in your organization. I’d welcome any other hints other people have learned over the years of working with IdM 😉
Best of Luck
Thanks for the knowledge sharing. I have another suggestion to the IDM dev team.
When we build up a SQL query in a script, usually we put a static value (e.g. mskey) as the search criterion. Is it possible to add new functions to DSE so that we can use a placeholder instead of the static values?
select * from idmv_entry_simple where mcmskey = 1234
select * from idmv_entry_simple where mcmskey = ?
We can reduce the SQL parsing time this way. It is a standard feature available in competitors' products.
Which use case do you have for this?
For a case where we need to use a SQL query (via the uSelect function) in a custom script to read something from the IC database. For instance, the query is like
var sSQL = "select * from idmv_entry_simple where mcmskey = "
then the SQL statement is amended by appending the mskey value to the end. Once the SQL is sent to the DB engine, it needs to be parsed, which costs time. If the mskey changes, the SQL needs to be parsed again. On the other hand, if the SQL uses bind variables, it needs to be parsed only once and will stay in DB memory. The query execution will be faster the next time.
Currently we do not have a DSE function to use prepared SQL statements, which I think we could have in the future.
I see what you mean, but honestly, I do not think that you will get a performance benefit if these are prepared statements. All the scripting load and framework initialisation consumes more time than parsing a simple SQL String.
But, as you are able to use Java in the JS Scripts, just try out by yourself 😉
I guess that is possible for conditionals/switches and similar where there already is a fixed syntax for %MSKEY%, %AUDITID% etc. Those values could be pre-declared as variables @MSKEY, @AUDITID, ... and used in queries issued as prepared statements. I'd guess the challenge is to create a syntax that is parsable enough to transform to each DB supported by IDM (MSSQL, ORA, HANA, DB2..). The ? thing is perhaps an ADO/OLEDB based solution?
If you're worried about the performance of custom queries in conditionals and switches you can make your own procedures and use those. You could also try to use something like "declare @mskey int; set @mskey="+mskey+"; EXEC sp_executesql N'
select displayname from .... where mskey = @PARM',N'@PARM INT',@MSKEY" in uSelects in scripts. I probably got the syntax wrong and I'm not sure if it will work. It's been too long since I used IDM or wrote much SQL at this point 🙂
MS SQL Server does a pretty decent job with the default simple parameterization option and plan generation/reuse, but it can be fun to see the effect of setting the "forced parameterization" option (this can be done and reverted with the database online). In some cases the difference is amazing. It does have its own weaknesses, though.
Thanks for your feedback. But I suspect when you do this in condition/switch tasks
select * from idmv_entry_simple where mcmskey = %MSKEY%
the IDM engine will convert %MSKEY% into a static value first in IDM and then send the static SQL to the SQL engine to parse. In case we have thousands of distinct mskey values, the DB needs to parse thousands of times, which reduces DB performance.
What I suggest is to have a function similar to PreparedStatement in the Java language.
PreparedStatement (Java Platform SE 7 )
Once the statement is prepared, the SQL engine doesn't need to parse it next time, no matter how many variables we have.
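The parse-once behaviour described here can be illustrated with a minimal sketch. This is a toy simulation of a database plan cache in plain Java (the class and its counters are my own invention, not IdM or JDBC internals); it only shows why identical SQL text matters for plan reuse:

```java
import java.util.HashMap;
import java.util.Map;

public class BindVariableDemo {
    // Toy "plan cache": mimics how a DB reuses an execution plan when the
    // statement text is identical. Not a real database component.
    static Map<String, Integer> planCache = new HashMap<>();
    static int parses = 0;

    static void execute(String sql) {
        if (!planCache.containsKey(sql)) {  // cache miss -> the DB must parse
            parses++;
            planCache.put(sql, 1);
        }
    }

    public static void main(String[] args) {
        // Concatenated literals: every distinct mskey yields new SQL text,
        // so each statement is parsed separately.
        for (int mskey = 1; mskey <= 1000; mskey++) {
            execute("select * from idmv_entry_simple where mcmskey = " + mskey);
        }
        int concatenatedParses = parses;

        // Bind variable: the SQL text never changes, so it is parsed once;
        // with a real JDBC PreparedStatement only the bound value would vary,
        // e.g. ps.setInt(1, mskey).
        parses = 0;
        for (int mskey = 1; mskey <= 1000; mskey++) {
            execute("select * from idmv_entry_simple where mcmskey = ?");
        }
        System.out.println(concatenatedParses + " vs " + parses); // prints "1000 vs 1"
    }
}
```

The same trade-off applies regardless of the client language: a thousand distinct mskey values cost a thousand parses with concatenation, but a single parse with a bind variable.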
The benefit of prepared statement can be found here.
Using SQL bind variables for application performance and security
Generally I am quite happy with SAP IdM performance. But it would be better to have this feature when the competitors provide the same. 🙂
I believe we're saying the same thing; at least that's what I implied with "and used in queries issued as prepared statements" in my text 🙂
I'm not sure why custom SQLs are run as they are; it's been a year since the IdM dev. office I worked in closed, and probably years before that when those decisions were made. The IDM implementation used, and still uses I suppose, prepared statements throughout the Java implementation. It's good practice and does wonders for performance and SQL injection problems. Implementing parsing and prepared statement generation for the user-definable SQLs might be a good idea and an important feature request.
One thing does come to mind. The provision queue may not be ordered. Using conditionals as an example: you get a list of different conditional tasks to execute, and they are not sorted by task ID but by audit ID, so that the oldest tasks execute first. You could end up setting up a prepared statement and executing it once before you need to set up a new one, then go back to setting up the previous one again. Change the order and you change the processing priority. Is setup time worse than parse time? Does it put less load on the DB? Do you implement a mechanism to keep these generated prepared statements available to work around this? But these are problems for the new IDM dev. team to solve 🙂
Thank you very much for sharing your knowledge here. 🙂
I have below question to you and all other experts.
I have IDM UI performance issues for the MX_PERSON entry type due to the MX_ASSIGNMENT attribute.
By keeping the 'List Entries on Load' checkbox disabled, I can open the related UI tasks very quickly, but
there are cases where we need to see the data of this attribute in the UI task.
Now, if the 'List Entries on Load' checkbox is disabled, we need to load the attribute values after opening the UI task, and here again it takes a lot of time to load the data.
We have a Role1 (Position) -> Role2 (Security Set) -> Role2 (Parent Role) -> Privileges mapping, and this complex structure could also cause performance issues (this concept cannot be changed).
But when I try to find out which SQL queries etc. run on loading the MX_ASSIGNMENT attribute for the MX_PERSON entry type, I don't see anything coming up when I do Trace and Statement Execution using the IDM Admin UI.
What could be the reason for MX_ASSIGNMENT taking a lot of time to load in the IDM UI, and how can this be improved?
Looking forward to the experts' advice here.
Thank you and Kind Regards,
In terms of the statement execution trace, it may be the case that you need to increase the threshold value to try to identify the statement(s). Another useful tool is the SQL trace in the SQL monitor on the AS Java server hosting the IdM web UI. This will capture the SQL statements run from the UI against the IdM database and may help in the analysis.
In such cases, always check for SAP Notes, as there have been performance improvements made to the display of assignments.
Thank you very much for your reply.
1. The statement execution analysis/trace option does not display any SQL statements when searching MX_ASSIGNMENT values in the IDM UI for the user.
Why should the threshold value be increased for this?
If I set it to a lower value (1000 ms or even 1 ms), it should show me all SQL statements running for this activity.
2. Regarding your point "Also another useful tool is the SQL trace on the SQL monitor on the AS Java server hosting the IdM web UI."
Is there any documentation or link to help me explore this tool? This analysis would help me a lot to narrow down our performance issues.
3. I checked and found SAP Note 2129188 - Bad Performance When Opening UI Task Having MX_ASSIGNMENT attribute,
but this seems to be applicable to IDM 7.2 SP9.
I am on IDM 7.2 SP10 now, and this note and older notes do not seem to be applicable here.
We were able to improve the performance of the MX_ASSIGNMENT-related UI task (with the 'List Entries on Load' checkbox enabled) significantly with some DB configuration (a statistical view), which is explained in the SAP OSS message raised to SAP for review.
Hi Pradeep. If you can, could you also share the actions and results here in this post as we cannot see your OSS message to SAP?
OSS message is 623487/2015 .