Since SAP NetWeaver 2004 SP Stack 15, the portal has included a navigation cache, which improves performance when the navigation service needs to retrieve navigation nodes for a user. The first time a user logs in, the nodes are retrieved from each navigation connector. The next time that user logs in, or any user with the same combination of roles, the nodes are retrieved from the cache. In SP Stack 20 (out in April) and in 2004s SP Stack 11 (out in February), the caching has been redesigned to improve performance, particularly in large organizations where there are many users with varied combinations of roles. The default roles connector implements the new caching mechanism, and any custom navigation connector can be implemented to take advantage of it. In addition, there are administration tools for configuring the cache of each connector. More on this next week.

How the New Caching Works

With the old cache, the navigation service asked each connector for a single discriminator and then asked for that connector's initial nodes (entry points) for the current user. It then combined all the discriminators into a single discriminator, and stored the collection of all the initial nodes from all the connectors under this combined discriminator. The new caching mechanism caches each connector's nodes separately, and each entry point can be cached separately. That means that if one user has Roles A, B, and C, each of those entry points is cached separately. When another user logs in with Role A, that entry point is retrieved from the cache, even if the user does not have the exact same combination of entry points. To achieve this, each connector is automatically wrapped in a cache connector when it is registered with the navigation service. The cache connector takes care of checking for cached nodes. When the navigation service needs to get the current user's nodes from a specific connector, the following occurs:

  1. The cache connector queries the original connector for a list of discriminators — each one represents an entry point.
  2. For each discriminator, the following occurs:
    1. If a node for the discriminator exists in the cache, it is retrieved.
    2. If the entry point is not cached, the cache connector calls the original connector's getInitialNodes() method, passing the discriminator; the entry point is retrieved and placed in the cache under that discriminator.
  3. All the entry points are collected and returned to the navigation service.
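The steps above can be sketched in plain Java. This is a minimal sketch, not the portal API: NavigationConnector, CacheConnector, getDiscriminators(), and getInitialNode() are hypothetical, simplified stand-ins for the actual connector interfaces, and the cache is just an in-process map.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical, simplified stand-in for a navigation connector.
interface NavigationConnector {
    List<String> getDiscriminators(String user);  // one discriminator per entry point
    String getInitialNode(String discriminator);  // loads that entry point's node
}

// Wrapper registered with the navigation service in place of the raw connector.
class CacheConnector implements NavigationConnector {
    private final NavigationConnector original;
    private final Map<String, String> cache = new HashMap<>();

    CacheConnector(NavigationConnector original) {
        this.original = original;
    }

    public List<String> getDiscriminators(String user) {
        // Step 1: ask the original connector which entry points the user has.
        return original.getDiscriminators(user);
    }

    public String getInitialNode(String discriminator) {
        // Step 2: serve from the cache; on a miss, load from the original
        // connector and store the result under the discriminator.
        return cache.computeIfAbsent(discriminator, original::getInitialNode);
    }

    public List<String> getEntryPoints(String user) {
        // Step 3: collect all entry points and return them.
        List<String> nodes = new ArrayList<>();
        for (String d : getDiscriminators(user)) {
            nodes.add(getInitialNode(d));
        }
        return nodes;
    }
}

public class CacheConnectorDemo {
    static int loads = 0;  // counts how often the original connector is hit

    public static void main(String[] args) {
        NavigationConnector roles = new NavigationConnector() {
            public List<String> getDiscriminators(String user) {
                return user.equals("alice") ? List.of("A", "B", "C") : List.of("A", "D");
            }
            public String getInitialNode(String d) {
                loads++;
                return "node-" + d;
            }
        };

        CacheConnector cached = new CacheConnector(roles);
        System.out.println(cached.getEntryPoints("alice")); // loads A, B, C
        System.out.println(cached.getEntryPoints("bob"));   // reuses A, loads only D
        System.out.println("loads=" + loads);
    }
}
```

Note how the second user shares Role A with the first, so only Role D has to be loaded; this per-entry-point reuse is the point of the redesign.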

Benefits of the Navigation Cache

Let’s take a second to try to quantify the benefits of the navigation cache.

Navigation Cache vs. No Cache

Load tests showed significantly improved performance when using the new navigation cache vs. not using any navigation cache. The following improvements were found:

  • CPU consumption during logon was reduced by 50%.
  • Logon response time was reduced by about 60%. In other words, if it normally took a second to log on, it took 0.4 seconds when the navigation cache was used.
  • Memory consumption was lower by 20%. This refers to heap memory. In addition, heap memory consumption was more even, since a large amount of heap memory was no longer needed each time a user logged in to calculate navigation information, as that information could be taken from the cache.

New Cache vs. Old Cache

Basic tests of the old and new cache showed that the old cache was slightly more efficient. This is because the old cache took a set of already merged entry points (and their navigation nodes) and placed them in the cache; every time a request came in that needed the navigation nodes, they were simply taken from the cache. The new cache places each entry point into the cache before merging them, so each time a user's entry points (and navigation nodes) are needed, the merging of the nodes still has to take place. It is this merging that costs you in terms of performance.

But in a large system landscape with thousands of users, this slight performance hit is more than offset by the more efficient way entry points (and their navigation nodes) are stored. Instead of caching every combination of roles, each role is cached separately, and only once. For example, if there are users with Roles A and B, with Roles A, B, and C, and with Roles A, B, C, and D, the old cache stores each of these combinations and their navigation nodes separately, meaning Roles A and B are each cached three times and Role C is cached twice. In a landscape with many combinations of roles, chances are that the cache will fill up and roles will have to be reloaded into the cache, a costly operation. With the new cache, each role is cached just once, and it is less likely that roles will be evicted from the cache.
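The storage difference is easy to quantify. The following sketch counts, for the example combinations above, how many role entries each scheme keeps: the old cache stores one merged entry per combination (so every role in a combination is stored again), while the new cache stores each unique role once. The role sets here are illustrative, not from any measurement.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class CacheFootprint {
    public static void main(String[] args) {
        // Role combinations seen at logon (example from the text).
        List<List<String>> combos = List.of(
            List.of("A", "B"),
            List.of("A", "B", "C"),
            List.of("A", "B", "C", "D"));

        // Old cache: one merged entry per combination, so each role in a
        // combination is stored again. Count role copies across all combos.
        int oldCopies = combos.stream().mapToInt(List::size).sum();

        // New cache: each role (entry point) is stored exactly once.
        Set<String> unique = new LinkedHashSet<>();
        combos.forEach(unique::addAll);

        System.out.println("old cache stores " + oldCopies + " role copies");
        System.out.println("new cache stores " + unique.size() + " roles");
    }
}
```

With just three combinations of four roles, the old scheme stores nine role copies against four in the new scheme, and the gap widens as the number of combinations grows.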

Cache Storage: Memory vs. Database

Administrators have the option of storing cache information in the portal database or only in memory. Either way, the navigation cache is kept in memory for quick access. When the database option is selected, the cache is also stored in the database, so that it is still available after a restart of the server.
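Conceptually, the database option behaves like a write-through cache. The sketch below illustrates the idea only; the class and the Map standing in for the portal database are hypothetical, and the real implementation's persistence mechanism is not documented here.

```java
import java.util.HashMap;
import java.util.Map;

// Illustration of the two storage modes: reads are always served from memory;
// with the database option, writes also go to a persistent store, so a fresh
// instance (after a restart) can warm itself from it. The "database" here is
// just a Map passed in from outside.
public class WriteThroughCache {
    private final Map<String, String> memory = new HashMap<>();
    private final Map<String, String> database; // null = memory-only mode

    WriteThroughCache(Map<String, String> database) {
        this.database = database;
        if (database != null) {
            memory.putAll(database); // warm the memory cache after a restart
        }
    }

    void put(String discriminator, String node) {
        memory.put(discriminator, node);
        if (database != null) {
            database.put(discriminator, node); // write through to persistence
        }
    }

    String get(String discriminator) {
        return memory.get(discriminator); // reads never touch the database
    }

    public static void main(String[] args) {
        Map<String, String> db = new HashMap<>();
        WriteThroughCache before = new WriteThroughCache(db);
        before.put("A", "node-A");

        // Simulate a restart: a new instance warms itself from the database.
        WriteThroughCache after = new WriteThroughCache(db);
        System.out.println("after restart: " + after.get("A"));
    }
}
```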




  1. Former Member
    Hello Daniel,

    thanks for your detailed information about the new navigation cache.

    Is it possible to invalidate the navigation cache via API?

    It would be great to remove a navigation node (role) from the cache by code.

    Is that possible?

    Best regards,

    1. Detlev Beutner
      Hi Michael,
      This works somewhat like this:

      // Get the registered connector (it is wrapped in a cache connector)
      // and cast it to the cache-management interface.
      INavigationConnector connector = navigationServiceInstance().getConnector(connectorName);
      ICacheConnectorManagement cacheConnector = (ICacheConnectorManagement) connector;

      // Fire a cluster event that clears the selected cache entries.
      cacheConnector.notifyConnectorEventAndWait(clusterNodeId, ConnectorCacheClusterEvents.CLEAR_SELECTED_ENTRIES, eventDataMap);

      For an example, see the decompiled method clearSelectedCacheEntries in class CacheAdministratorController (and the code around it).
      Hope it helps, Detlev

