Google Chrome’s developer tools (popularly called the dev tools) are probably the best way to debug and analyze the details of your client-side web application. They are also very useful for measuring and analyzing your app’s performance. I’ve written a SAP Lumira Data Access extension to acquire some interesting data that can be logged using the dev tools, so that it can be visualized in SAP Lumira. If you’re interested in understanding what I did and why, then read on!
Recently, I was asked to work on a performance improvement project at work. One of the areas identified for improvement was client-side load times in the browser. As a first step, I started monitoring the network calls made by the application using the “Network” tab in the dev tools. I figured out some obvious optimizations that could be done, but I had no way to keep track of my progress. That’s when I accidentally right-clicked on one of the items in the Network tab, and there it was: “Save as HAR with content” 🙂 . On opening this file, I realized that the dev tools let me export the content of the Network tab into the HAR format, which is nothing but well-structured JSON. That’s when I decided I would write a DA extension to parse these logs and visualize the data in Lumira. This extension gave me certain advantages over the dev tools:
- I could load up the data of any session, whenever I liked and analyze/visualize the results
- I could compare two or more logs at the same time (the most important use case)
- I could use the data available to compute my own dimensions/measures within my extension
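To make the HAR format concrete, here is a minimal sketch in Python of the kind of document the dev tools export. The field values below are invented for illustration, and a real export carries many more fields per entry:

```python
import json

# A minimal HAR-shaped document, mimicking what "Save as HAR with content"
# produces. The values here are made up; real exports are much richer.
sample_har = {
    "log": {
        "version": "1.2",
        "entries": [
            {
                "startedDateTime": "2016-01-01T10:00:00.000Z",
                "time": 120.5,  # total time for this call, in milliseconds
                "request": {"method": "GET", "url": "https://example.com/app.js"},
                "response": {"status": 200, "content": {"size": 51200}},
            }
        ],
    }
}

# Every network call recorded in the Network tab appears under log.entries.
for entry in sample_har["log"]["entries"]:
    print(entry["request"]["url"], entry["time"])  # https://example.com/app.js 120.5
```

Since the whole thing is plain JSON, `json.load` is all it takes to get at the data programmatically.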
That’s the motivation and need behind the extension. I won’t dwell on the implementation in depth, as it’s essentially a simple log parser, but I’ll give a brief idea of what is happening. I take the following parameters as input to my extension:
- Dataset name: Lumira needs a dataset name during acquisition
- Logs folder: the folder where all the network logs are stored
- Metadata: the path to the metadata related to the column names (I plan to remove this parameter in future commits)
The logs are then parsed one by one, creating a row in the output dataset for every item found under the entries key. One important point to note is that the log file name is used as the primary key, so that rows from different sessions can be told apart when analyzing the data.
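As a sketch of that parsing step (this is not the extension’s actual code; the function and column names are my own illustration of the idea):

```python
import json
import os

def parse_har_folder(folder):
    """Parse every HAR log in `folder` into flat rows.

    One row is produced per item under the entries key, and the log's
    file name is carried on every row as the key, so that rows from
    different sessions can be told apart (and compared) later.
    """
    rows = []
    for name in sorted(os.listdir(folder)):
        if not name.endswith(".har"):
            continue
        with open(os.path.join(folder, name), encoding="utf-8") as f:
            har = json.load(f)
        for entry in har["log"]["entries"]:
            rows.append({
                "log_file": name,  # the file name acts as the key
                "url": entry["request"]["url"],
                "status": entry["response"]["status"],
                "time_ms": entry["time"],
            })
    return rows

# Demo with a synthetic log (real logs come from the dev tools export):
import tempfile

tmp = tempfile.mkdtemp()
sample = {"log": {"entries": [{"request": {"url": "https://example.com/"},
                               "response": {"status": 200},
                               "time": 42.0}]}}
with open(os.path.join(tmp, "wifi.har"), "w", encoding="utf-8") as f:
    json.dump(sample, f)

rows = parse_har_folder(tmp)
print(rows[0]["log_file"], rows[0]["time_ms"])  # wifi.har 42.0
```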
Let me take you through a simple use case to make the usage of the extension clear, along with the kind of insights that can be derived. The use case is that I’m trying to understand how a website performs at different network speeds. I’m comparing WiFi vs 3G, which can be thought of as laptop vs mobile performance. First, install the extension as per the instructions at the GitHub repository. I’ve taken two network logs of a famous social website: one over normal WiFi, and one using the throttling feature of the dev tools.
The logs are uploaded here as well. Let us acquire these logs using the DA extension.
Once the acquisition is done, let us start with a very naive comparison: the total download times of the two sessions.
As expected, the 3G download times are higher. We can go a step deeper and analyze each and every network call if required.
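The naive comparison simply adds up the per-call times in each session. A sketch of that computation, using stand-in numbers rather than real log data (overlapping calls are counted in full here, a simplification I come back to below):

```python
# Naive comparison: sum the per-call times of each session.
# Parallel/overlapping calls are each counted in full, so this
# overstates the wall-clock time of a real page load.
def total_download_ms(entries):
    return sum(e["time"] for e in entries)

# Stand-in entries with only the "time" field (milliseconds):
wifi_entries = [{"time": 80.0}, {"time": 40.0}]
threeg_entries = [{"time": 400.0}, {"time": 250.0}]

print(total_download_ms(wifi_entries))    # 120.0
print(total_download_ms(threeg_entries))  # 650.0
```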
If you’re still not won over by my extension 🙁 , here are the various ways I feel the extension can provide insights into your web app’s network activity:
- As the web app keeps growing, more and more changes get added. If you are keeping an eye on network activity, you could capture logs periodically and compare them in SAP Lumira to see what is changing
- Network performance on various download speeds can be compared
- In depth analysis of every network call is possible, using the wide set of visualizations available
- The Lumira Server for Teams and Lumira Server for BI Platform support DA extensions. You can install the extension and share your analysis with your colleagues as well
A few improvements I have planned, along with the extension’s current limitations:
- I plan to introduce dynamic creation of metadata by taking user input. This would remove any need for code changes to add columns
- A Visualization extension can be written that can help visualize the logs data as seen in the dev tools
- Currently, the download times don’t take parallel loading into consideration. This should be fairly straightforward to implement using an interval-merging algorithm
- Any addition of columns requires a change in code.
- Metadata needs to be supplied as a separate .txt file
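For the parallel-loading point, the interval-merging idea could look like the sketch below. The start/end values are illustrative; in a real log they would be derived from each entry’s startedDateTime and time fields:

```python
def merged_duration_ms(intervals):
    """Wall-clock time covered by possibly overlapping (start_ms, end_ms)
    intervals: sort by start, merge overlaps, sum the merged lengths."""
    if not intervals:
        return 0.0
    intervals = sorted(intervals)
    total = 0.0
    cur_start, cur_end = intervals[0]
    for start, end in intervals[1:]:
        if start <= cur_end:   # overlaps the current merged interval: extend it
            cur_end = max(cur_end, end)
        else:                  # disjoint: close out the current interval
            total += cur_end - cur_start
            cur_start, cur_end = start, end
    total += cur_end - cur_start
    return total

# Two parallel calls (0-100 ms and 50-150 ms) plus a later one (200-250 ms)
# add up to 470 ms naively, but only 200 ms of actual wall-clock time:
print(merged_duration_ms([(0, 100), (50, 150), (200, 250)]))  # 200.0
```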
The source code is up on GitHub. I’ve tried my best to write extensible code, so go ahead and fork it 😉 . Feature requests and bugs can be filed as issues on the repo itself. This is my first blog post and my first open source contribution! I’m really excited about it. Let me know what you guys think!