
I’m writing this blog to share the approach I use when someone comes to me with an Interactive Form that shows performance issues. The examples are based on two forms I investigated last month and on the experience I have gained over the years.

The first thing I do when I get a form with a performance issue is to open it several times in a row with Adobe Reader (more than twice, to make sure it is really an issue and not a one-time effect). Adobe Reader (like any other application) needs to be loaded into memory by the operating system when it is used for the first time (for example, after a reboot). If a noticeable delay can still be observed during subsequent openings, there might be an issue with the form that needs further investigation.

Interactive Forms support two scripting languages (FormCalc and JavaScript); the form might contain a potentially large data set and algorithms processing this data. As with any other programming language, algorithms can become inefficient when the data set grows large. Since there is no profiling tool available for Interactive Forms, we need to take a closer look at the form and its scripts. And I have to admit that a lot of experience with form design and scripting is needed to see what’s going on.

One of the forms had a large data file. Data is contained in XML format inside the PDF; it is expensive to process this data when the form is opened, and it increases the file size of the PDF. In this case it was quite obvious that there was a lot of data (1 MB) that was not used by the form, which actually had only six form fields. So I always look at the size of the XML file that contains the data. In such a case the recommendation is to reduce the amount of data contained in the PDF: include only required dictionary types and deactivate attributes that are not needed, for example when using transaction SFP. Ask yourself: why is this contained in the form, or why does this show up in the Data View?

By the way, when the file size is an issue I usually check the images used to see what their resolution and file size are. Another thing to consider is how many fonts are used in the form. It sounds strange, but I have really seen forms that were using several fonts (>7). Embedded fonts increase the file size. Why is the file size important from a performance perspective? Sometimes the binary PDF form is sent back to the server instead of just the data. This happens when there is a signature field on the form or when attachments are used. Sending the PDF form back and forth can happen in Web Dynpro and hence also when using the ISR framework.

Going forward, let’s assume the file size is not a problem.

For investigating performance issues I pretty much always use Adobe LiveCycle Designer and the preview available in Designer. In Designer I use the form template (the .xdp file), a data file and a schema file (if available). You can get them as attachments to a PDF, for example when you turn on tracing with the highest level in transaction SFP. After opening the PDF in Adobe Reader, just save the attachments of the PDF as separate files on your hard drive. In cases where there is no schema file, I do the following: I copy the name of the data connection (you can do this in the connection properties; right-click on the connection in the Data View), then I delete the data connection (make sure to uncheck the option to remove data bindings for the deleted data connection, otherwise this will not work), and as a last step I create a new data connection with the same name (that’s why I copied the name) and choose “Sample XML Data” as the type of the connection. Some information might be lost because an XML schema is no longer used, but it is good enough to work with. I have not run into issues with this approach in the last years.

When looking at the form itself, the first thing I do is check the Data View to see how many fields are actually bound, either via data binding (you can see an icon for that in the Data View) or via complex binding (there is also an icon for that). This gives you a first impression. There are two cases where you cannot see whether data is actually used: implicit data binding and access to data via scripting. These cases need to be double-checked. First, check whether implicit binding is used. For implicit binding you need to select form fields on the form, go to the Binding tab of the Object palette and see if the Default Binding is “Normal“. You need to repeat this until you can judge whether most of the data fields are actually bound to form fields. Here you could again come to the conclusion that there is too much in the data that is actually not used (see the recommendations above). But even if the form is complex, I still need to check the scripting, as described later in this blog, before coming to a conclusion.

Speaking of the number of pages and complexity: this is relative, since we also need to consider the complexity of what is contained on each page. You can easily imagine that a form with 10 pages and one form field on each page is not a complex form. But a 10-page form with nested tables and lots of scripting would be considered complex. The numbers I mention in this blog are not hard limits but guidance. This is an area where you get a better feeling the more forms you have created or analyzed. In other words, if you are new this may sound like magic to you, but if you are experienced you get a good understanding of whether a form is complex or not in minutes (I don’t want to say seconds).

The second case is more difficult to spot. I look for scripts that make use of variants of the following sample method calls (these are the most prominent examples of calls that might raise concern; there are more):

  • xfa.form.resolveNode("xfa.datasets.data. …");
  • xfa.resolveNodes("xfa.record.UI_ATTRIBUTES.DATA[*]")
  • itemNodes = itemNode.nodes;

Where to start searching is a good question. Start with these events, in this order: initialize, layout:ready, form:ready, preOpen and calculate. It is useful to give the Script Editor more real estate so that more than one line of code can be seen at a time. The drop-down list “Show”, where you select which event to display, indicates whether an event has a script by adding a “*”, for example “preOpen*” instead of just “preOpen”. Another useful option is to select “Events with Scripts” to browse all contained scripts faster.

The above-mentioned calls reference data and return a set of nodes or a single node. They become really interesting if they use “[*]” as part of the SOM expression. SOM stands for Scripting Object Model; a SOM expression is a string description that identifies one or more nodes that need to be found in the data model. These SOM expressions need to be interpreted and then executed, which can take time (especially when they contain “[*]”). After finding such expressions, I double-check within the Data View whether a binding is shown for these nodes (the little icon to the right of the nodes in the Data View). This may or may not be the case, and it is absolutely no problem if nothing is indicated. This check serves two purposes: a) to see whether the data is actually used, and b) to point out areas where we may find performance issues. Let’s continue to investigate the latter.
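To make this concrete, here is a minimal sketch of what such calls return. The data structure (ITEMS/ITEM) is made up for illustration:

// Hypothetical data record: <ITEMS><ITEM>…</ITEM><ITEM>…</ITEM></ITEMS>
// resolveNode returns a single node; resolveNodes with the "[*]" wildcard
// searches all matching siblings and returns a node list.
var firstItem = xfa.resolveNode("xfa.record.ITEMS.ITEM[0]");
var allItems = xfa.resolveNodes("xfa.record.ITEMS.ITEM[*]");
xfa.host.messageBox("Found " + allItems.length + " items, first value: " + firstItem.value);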

Access to data usually comes in two variants. The first one looks like the following:

for (var i = 0; i < someLength; i++) {
    xfa.form.resolveNode("xfa.datasets.data. …"); // the SOM expression is interpreted and resolved on every iteration
}

Sometimes the string parameter containing the SOM expression is assembled inside the for-loop, so the expression is resolved in every iteration. Scripts like this are also indicators that the issue is in this spot.
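A made-up sketch of this pattern, together with the variant that resolves the whole set once (usually the cheaper option, assuming the business logic allows it):

// Anti-pattern: the SOM string is assembled and resolved on every iteration,
// so the expensive lookup runs someLength times.
for (var i = 0; i < someLength; i++) {
    var node = xfa.resolveNode("xfa.record.ITEMS.ITEM[" + i + "]");
    // ... use node.value ...
}
// Usually cheaper: resolve the whole node set once, then iterate the result.
var items = xfa.resolveNodes("xfa.record.ITEMS.ITEM[*]");
for (var j = 0; j < items.length; j++) {
    // ... use items.item(j).value ...
}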

The next thing is that people often iterate over the returned set of nodes. Check whether the script uses something like the following, as it is a good indicator of potential issues. Please note the use of the word “potential”, since not every script like this is an issue! There might be a good reason to do something like this.

for (var i = 0; i < itemNodes.length; i++) {
    // do something with itemNodes.item(i)
}

It really depends on how large the data set is and how many times the script is executed. So this needs to be tested (see below).

Another point to consider is how many different form fields use such a script. For example, such scripting might be used in the initialize event or the preOpen event of a Dropdown-List. The latter caused a delay when opening the Dropdown-List, which felt very unnatural. And yes, I have seen forms with more than 70 Dropdown-Lists, and in rare cases multiple Dropdown-Lists sat in a single table row that was repeated over multiple pages during rendering. In that case there were a lot of Dropdown-Lists.
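One mitigation I consider in such cases is to fill the list once in the initialize event instead of on every preOpen. This is not always applicable, for example when the list content depends on other fields. A minimal sketch, with made-up data names, placed in the initialize event of the Dropdown-List:

// initialize runs once per field instance; preOpen runs every time the
// user opens the list. The data structure COUNTRY_LIST/ENTRY is hypothetical.
var entries = xfa.resolveNodes("xfa.record.COUNTRY_LIST.ENTRY[*]");
this.clearItems();
for (var i = 0; i < entries.length; i++) {
    this.addItem(entries.item(i).value);
}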

This brings me to the next thing I want to highlight: tables. A table often looks fairly simple in the form design, but during rendering a table row can be instantiated very often. If a table is large and contains a script in the initialize event, this script can be executed a large number of times. This is when I check the Hierarchy of the form to look for suspicious nested tables/subforms. A complex, deeply nested hierarchy impacts the rendering time since calculations get more complex. In this case I check for unnecessary subforms/nested tables.
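When the per-row initialize script repeats the same data lookup, one option is to move the lookup into a script object so it runs once and its result is shared by all row instances. A sketch of that idea; the script object name (util) and the data structure are made up:

// Script object "util" at form level. Its variables are initialized once,
// so the resolved node list is shared by all table-row instances.
var cachedItems = null;
function getItems() {
    if (cachedItems == null) {
        cachedItems = xfa.resolveNodes("xfa.record.ITEMS.ITEM[*]");
    }
    return cachedItems;
}
// A row's initialize event then calls util.getItems() (assuming the script
// object is in scope) instead of resolving the expression itself.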

Based on my experience I look at the following things on a form (in this order): Dropdown-Lists, tables (or nested subforms), the form hierarchy and text fields. Some form designers do funny things with text strings and use inefficient scripting for it. That is not a problem if it is done once, but it might be if it is done a thousand times or more (e.g. formatting a text string in JavaScript for hundreds of thousands of fields in a print form).
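As an illustration of the kind of string handling I mean (a made-up helper, not taken from any specific form): building a long string by repeated concatenation creates one intermediate string per iteration, while collecting the parts and joining them once is usually cheaper when the code runs thousands of times.

// Concatenation in a loop: one intermediate string per iteration.
function joinSlow(values) {
    var result = "";
    for (var i = 0; i < values.length; i++) {
        if (i > 0) { result += "; "; }
        result += values[i];
    }
    return result;
}
// Collect the parts in an array and join once instead.
function joinFast(values) {
    return values.join("; ");
}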

Now, I have discussed where I look for performance issues and what indicates a potential issue. The remaining question is: how do I determine whether there really is a performance issue? To figure this out I comment out one script at a time (I want to know which script causes the issue) or create a copy of the form and delete the script. In rare cases I modify the data file to include more Dropdown-List items or make the table bigger by copying data rows. I copy the data file and quickly change the data file to be used in the form properties (found under the Edit menu in Designer). I do this to address a typical issue with non-linear algorithms: they are somewhat okay if the data set is small (enough) but show pretty bad behavior for large data sets. Whenever possible, I try to use the Preview offered by Designer. In my experience this speeds up investigations dramatically. This approach is fine for investigation and testing purposes, but the final test should always use rendering with ADS on the SAP NetWeaver stack (i.e. test the full scenario). The reason is that Designer contains form-rendering code, but this code might not be the same version as used by ADS (often older or not including the latest bug fixes). That’s why it is called Preview and nothing else. The rendering in Designer takes place when you switch from the Design view to the Preview.

Once an issue is identified as a performance problem, it is necessary to figure out how to resolve it. Optimizing the script code is the first thing to look at. For example, if the script in question makes multiple similar calls to resolveNode or resolveNodes, one thing to consider is whether the results can be cached in a variable. Is inefficient JavaScript used? Sometimes the business logic forces a loop over a data set. Can the data set be reduced when the form is generated on the server? This really depends on the use case and can only be decided case by case. Now is the time when we need to understand what the form is doing, to allow us to find other solutions.
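A minimal sketch of that caching idea, with made-up field names: instead of interpreting three full SOM expressions, resolve the common parent once and reuse it.

// Before: three separate lookups, each interpreting a full SOM expression.
var name = xfa.resolveNode("xfa.record.CUSTOMER.NAME").value;
var street = xfa.resolveNode("xfa.record.CUSTOMER.STREET").value;
var city = xfa.resolveNode("xfa.record.CUSTOMER.CITY").value;
// After: resolve the common parent once, then access its children directly.
var customer = xfa.resolveNode("xfa.record.CUSTOMER");
var name2 = customer.NAME.value;
var street2 = customer.STREET.value;
var city2 = customer.CITY.value;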

Last but not least, I want to stress that, as with any other technology, it is also the experience and skills that count. In my seven years working on Interactive Forms I have seen many people assume that creating forms is easy and think that they can create forms without training or reading documentation. I can’t count how many times someone told me that this was the first piece of JavaScript code they ever wrote. I have seen many forms where it was clear that the form was created using a trial-and-error approach and, sadly, was never cleaned up after it worked but only enhanced to do more.

That leads to my final recommendation: use an iterative approach to develop the form and test performance at each stage. This includes working with real-world data, e.g. a realistic number of items in a Dropdown-List. Then it is not necessary to search for the script or change that causes the performance issue, since it was the last change.

P.S.: There is a new SAP Note 1666920 that collects links to performance-relevant information.
