Matthew_Shaw
Product and Topic Expert
 

Having recently published an article and sample solution that downloads the SAP Analytics Cloud Activities Log, I thought I'd make a few adjustments to enable the same for the BTP Audit Logs.

So, that's what I've done and this sample solution is available now.

The BTP Audit Logs posed an extra challenge: dynamically parsing the JSON within the 'message' and turning each of the name/value pairs into a column of the generated .csv file. Whilst this solution doesn't load the data into a database, and so may fall short for some, I suspect the feature of turning the nested JSON into columns will be of particular interest to many.


Nested JSON message is dynamically parsed and each name/value pair turned into a column


My solution uses nodejs and Postman. If you’re happy to use nodejs (which is available on the SAP BTP platform too) and the Postman library ‘newman’, then this sample solution is ideal.
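
If you've not used newman before: it lets a node.js script run an exported Postman collection as a library. Here is a minimal sketch of the idea (the file names below are placeholders, not the names used by the sample):

// Minimal sketch: run an exported Postman collection from node.js via newman.
// The collection/environment file names are placeholders for illustration only.
const newman = require('newman');

newman.run({
    collection: require('./btp_audit_log_collection.json'),   // exported Postman collection
    environment: require('./btp_audit_log_environment.json'), // exported Postman environment
    reporters: 'cli'
}, function (err, summary) {
    if (err || summary.run.failures.length) {
        console.error('Audit log download failed');
        process.exitCode = 1;      // signal failure to any calling scheduler
    } else {
        console.log('Audit log download completed');
    }
});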

 

The benefits of this sample are:



  • Command line interface

    • enabling automation

    • removing human effort and error





  • Files created

    • by time period, rather than by size

    • which are truly complete without duplicates in different files

    • with a consistent time zone of your choice eliminating a missing or duplicate hour

    • in a way that enables 3rd party tools to easily read them

    • with a column for each element of the nested JSON message allowing you to easily access any element within it





  • Design and error management

    • enabling a known result




Adopt and consume


I’ve done my very best to remove as many barriers as I can, which means you can immediately adopt and consume the Audit Log API:

  • All the thinking and best practices have been done for you

  • No need to develop or write any code

  • No need to understand how the API works

  • No need to fathom various complexities

    • How to parse the nested JSON (and for NEO, how to extract JSON from the message)

    • daylight saving time zones, error handling, very rare API errors etc.



  • Detailed step-by-step user guide to get it all working within about 2 hours

  • Shared best practices and known issues to ensure best possible success


 

My meticulous attention to detail should mean you are probably going to avoid a number of rare errors and you may even resolve some exceptionally rare issues with the manual download too! I hadn’t realised how complicated downloading a bunch of logs really was until I dived into the detail and thought about it for weeks on end!

You’ll find an article below on this solution, but I’ve also added a load of other related topics. I’ve compiled a list of best practices covering many areas including the sample script itself, the audit log in general, a bunch of FAQs and, for the developer, an overview of how the solution actually works.

Feedback is important


Administrators just like you would very much value feedback from other customers on this solution. Please find the time to post a comment to this blog if you adopt it, and hit the like button! As a suggestion, please use this template:
We have this number of audit logs: xx
We use this sample with this number of SAP BTP subaccounts: xx
We use the option(s): ‘all’, ‘lastfullmonth’, etc.
We use time zone hour/mins: xx hours / 0 mins
We use a scheduler to run this sample script: yes/no
We use the dynamically generated columns derived from the JSON: yes/no
We saved this much time using this sample, rather than developing it ourselves: xx weeks/days
We rate this sample out of 10 (1=low, 10=high): xx
Our feedback for improvement is: xx

Your feedback (and likes) will also help me determine if I should carry on investing my time to create such content.

Feel free to follow this blog for updates.

Simple demo


A simple demo of the command-line interface is shown here. There are 12 other command-line argument options available.


Downloading the audit logs. There are 12 other command-line options



The .csv file generated as viewed in Microsoft Excel. A column is created for each value pair in the message JSON



Resources

















  • Latest article: Version 1.0 – February 2023

    • Microsoft PowerPoint: Preview Slides | Download Slides

  • [Installation and Configuration] user guide: Version 0.8 – February 2023

    • .pdf: Download | Preview

  • Sample Solution (code): Version 0.8 – February 2023

    • GitHub: zip download | Change log

 

Content


 

Summary of benefits





  • Command line interface

    • enabling automation

    • removing human effort and error



  • Files created

    • by time period, rather than by size

    • which are truly complete without duplicates in different files

    • with a consistent time zone of your choice eliminating a missing or duplicate hour

    • in a way that enables 3rd party tools to easily read them

    • with a column for each element of the nested JSON message allowing you to easily access any element within it



  • Design and error management

    • enabling a known result



  • Immediately adopt and consume the BTP Audit Log API

    • All the thinking and best practices have been done for you

    • No need to develop or write any code

    • No need to understand how the API works

    • No need to fathom various complexities

      • How to parse the nested JSON (and for NEO, how to extract JSON from the message)

      • daylight saving time zones, error handling, very rare API errors etc.

    • Detailed step-by-step user guide to get it all working within about 2 hours

    • Shared best practices and known issues to ensure best possible success




Why download audit logs?


 




  • Audit logs are typically required for compliance and internal cross-charging needs

    • Thus, there’s often a need to download these logs as old logs get automatically purged by SAP

    • There’s no easy way to query the logs, and for NEO there is no audit log viewer



  • Managing the logs can be problematic

    • Manual effort required to download them

    • Download often spans multiple files

    • The time zone is the local time zone and changes with daylight saving, adding extra complexity to ensuring you have a complete set of logs, without duplicates across multiple files and without a missing ‘daylight saving hour’

    • Unable to easily access values inside the nested JSON message



  • Leads to the following requirements

    • Automated download

    • Easy way to determine if you have a complete set of logs

    • Known result is critical

    • Time zone support

    • Both .json and .csv file creation

    • Dynamic generation of columns for the .csv, one column for each attribute within the JSON message

      • The columns must not shrink over time, just because certain attributes are not currently populated





  • Different organisations have slightly different needs

    • Some require daily logs, others weekly or monthly






Introducing a sample script to download audit logs





  • Sample script to download the Audit Log

  • Command line that downloads the logs and generates dynamically named files, based on time (day, week, month)

  • Uses SAP BTP REST API

  • The sample comprises:

    • JavaScript (.js) file

    • Postman collection (for downloading the audit log)

    • Postman environment (to define your SAP BTP Audit and OAuth client details)

      • the .js calls Postman ‘as a library’

      • this, in turn, requires nodejs









  • Solution is provided ‘as is’ without any official support from SAP

  • It is thus a community-driven solution


Logic of .csv & .json filenames





  • oldestday logs are for the day containing the oldest log entry

    • The logs for this day will be incomplete

    • Logs earlier in that day are not present because they have been deleted, either manually or automatically by SAP

    • Filenames will have _tail.csv _tail.json



  • fullday logs are periods where:

    • 1) there is a log entry in the preceding day

    • 2) today is not the same day as the log entry itself

    • Filenames will have _full.csv _full.json



  • currentday logs must also be incomplete because the day is not yet over

    • Filenames will have _head.csv _head.json



  • This logic applies to periods ‘weeks’ and ‘months’ in a similar way as for the ‘day’

  • Only fullday, fullweek and fullmonth logs are ‘complete’ and contain no duplicates

    • Files will not be created if there aren’t any activities for that period

    • This means you could have ‘missing’ files, but this is because there are no logs for that period
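
To make the naming rule concrete, here is a much-simplified sketch of the classification. It is illustrative only (the real script also checks for a log entry in the preceding day, and handles weeks and months in the same way):

// Illustrative only: choose the filename suffix for a given calendar day.
// 'day', 'oldestLogDay' and 'today' are plain date strings like '2023-02-14'.
function suffixForDay(day, oldestLogDay, today) {
    if (day === oldestLogDay) return '_tail'; // incomplete: earlier logs already purged
    if (day === today)        return '_head'; // incomplete: the day isn't over yet
    return '_full';                           // complete: a full day's worth of logs
}

console.log(suffixForDay('2023-02-01', '2023-02-01', '2023-02-14')); // '_tail'
console.log(suffixForDay('2023-02-10', '2023-02-01', '2023-02-14')); // '_full'
console.log(suffixForDay('2023-02-14', '2023-02-01', '2023-02-14')); // '_head'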




Demo



Command line options overview


4 main options, each option provides day, week and month variants

  • Oldest: oldestday oldestweek oldestmonth

    • Creates a single _tail.csv/.json for the oldest (incomplete) period



  • Eachfull: eachfullday eachfullweek eachfullmonth

    • Creates a file _full.csv/.json for each and every ‘full’ period



  • Lastfull: lastfullday lastfullweek lastfullmonth

    • Creates a single file _full.csv/.json for the last ‘full’ period

    • Ideal to run daily, weekly or monthly



  • Current: currentday currentweek currentmonth

    • Creates a single file _head.csv/.json for the current (incomplete) period
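
For illustration, picking up such an option inside a node.js script could be as simple as reading process.argv (a sketch only; the sample's actual argument handling and defaults may differ):

// Illustrative: read the command-line option, falling back to a default.
const option = process.argv[2] || 'lastfullday'; // e.g. 'all', 'eachfullweek', 'currentmonth'

console.log(`Running with option '${option}'`);
// usage: node nodejs_4202_download_btp_audit_logs.js lastfullmonth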





day



week



month



Error management





  • The sample script manages a good number of errors including but not limited to

    • session access token timeouts (http 401 status codes)

    • x-csrf-token timeouts (http 403 status codes)

    • server too busy (http 429 status codes) and will wait and automatically retry until successful

    • internal server error (http 500 status codes) and will wait 20 seconds and automatically retry 6 more times*

      • * not applicable for NEO, only applicable for Cloud Foundry





  • Script is designed to handle these errors at any point in the workflow providing maximum stability


 

  • Unhandled errors could be the result of

    • an error on the part of SAP BTP

    • OAuth authentication issues

    • API availability

    • network or connection issues



  • Any unhandled error will prevent all logs from being written out, meaning failures will not corrupt existing files; it is therefore safe to repeatedly attempt to write out previously downloaded log files

  • Errors returned from the API are written to the console helping you to understand the root cause


 

  • The sample script will return an:

    • exit code of 0, if there were no unhandled errors. Everything went as expected

    • exit code of 1, if there was any unhandled error that occurred during the download of the audit logs

      • No .csv/.json log files will be created, which means no files will be ‘half written’

      • If checking for this exit code, wait about an hour before running the script again; this gives SAP BTP time to automatically rectify and recover from any issues on its part
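
If you schedule the script yourself, a small wrapper can check the exit code and retry later. A sketch using node's child_process, assuming the script name from the installation section and the one-hour wait suggested above:

// Sketch: run the sample script once and retry after an hour if it fails.
const { execFile } = require('child_process');

function runDownload(attempt = 1) {
    execFile('node', ['nodejs_4202_download_btp_audit_logs.js', 'lastfullday'], (err, stdout, stderr) => {
        console.log(stdout);
        if (err && attempt === 1) {
            // exit code 1: nothing was written, so it is safe to simply try again
            console.error('Download failed, retrying in 1 hour');
            setTimeout(() => runDownload(2), 60 * 60 * 1000);
        } else if (err) {
            console.error('Download failed twice, giving up:', stderr);
            process.exitCode = 1;
        }
    });
}

runDownload();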






Getting a known result



  • Extension to the options: lastfullday | lastfullweek | lastfullmonth

    • Has an additional [periods] option, which means a file is created for each of the last [periods] periods

      • lastfullday 5

        • will create _full.csv/.json files for each of the last 5 full days



      • lastfullweek 3

        • will create _full.csv /.json files for each of the last 3 full weeks







  • This feature solves the problem where the download failed for any reason

    • SAP BTP Cloud Services in maintenance mode

    • SAP BTP Cloud Services technical issue, API unavailable

    • On-premise issue running the sample script



  • You have 3 options

    • 1) Use this [periods] option

    • 2) Or check the exit code and repeat the download until exit code is 0

    • 3) Or combine the above two



  • If not checking the exit code, use these [periods]:

    • Run daily: lastfullday 4

    • Run weekly: lastfullweek 3

    • Run monthly: lastfullmonth 1



  • This way, any failure will be recovered by the next, hopefully successful run

  • It would require a significant and repeated failure to miss any logs
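
To illustrate what [periods] means in terms of dates, here is a sketch that lists the last N full days relative to ‘now’. It uses UTC for simplicity, whereas the sample works in the fixed time zone you configure:

// Illustrative: the last N full days (yesterday backwards); with 'lastfullday N'
// each of these days would be written out as a _full file.
function lastFullDays(n, now = new Date()) {
    const days = [];
    for (let i = 1; i <= n; i++) {
        const d = new Date(now);
        d.setUTCDate(d.getUTCDate() - i);
        days.push(d.toISOString().slice(0, 10)); // 'YYYY-MM-DD'
    }
    return days;
}

console.log(lastFullDays(5)); // the five complete days before today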



lastfullday



lastfullweek



lastfullmonth


 

Time zone support





  • The time zone of the browser that is used to download the logs determines the time zone of the logs’ timestamps *

    • However, the browser’s time zone typically changes over the year due to daylight saving

    • This dynamic change causes complications easily resulting in

      • A missing hour of logs

      • An hour of duplicate logs

      • Confusing timestamps for logs before the time zone change, but downloaded after the time zone changed



    • (* Audit log downloads via the web browser user interface are not supported for NEO)





 

  • This sample script resolves these problems by using a fixed time zone of your choice

  • It means

    • you get a known result, although you need to decide which time zone you want for the entire year

    • daylight saving changes have no impact

    • all the dynamically created filenames are based off the time zone of your choice

    • It doesn’t matter where in the world you run the script, if you maintain the same time zone setting, the script will always create files with the same content



  • The sample script can, like the manual download method*, use the local (often dynamically changing) time zone, but this isn’t recommended for the reasons mentioned


All audit logs





  • If you don’t want a file per period, then use the ‘all’ option

    • This downloads all the logs available and creates a single file

    • There is no restriction on the logs downloaded or written out; it is everything

      • Filename contains the start and end datetime stamp

      • Option (see later) to use a fixed filename






.csv Generation: Nested JSON is expanded to expose individual attributes



AuditLog Viewer (available on CF only)




  • The ‘message’ content contains nested JSON *

  • For ‘csv’ file generation, each attribute of this nested JSON is expanded into a column

  • The columns are dynamically derived and ordered

  • Result is a very neat and tidy .csv file that can be opened in Microsoft Excel, or accessed for further processing by a 3rd party

    • * exception for NEO – see later





Message JSON is expanded into columns
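
Conceptually the expansion is a recursive ‘flatten’ of the message JSON, with nested attribute names joined into a single column name. A minimal sketch of the idea (not the sample's actual code):

// Minimal sketch: flatten nested JSON into dot-separated column names.
function flatten(obj, prefix = '', out = {}) {
    for (const [key, value] of Object.entries(obj)) {
        const column = prefix ? `${prefix}.${key}` : key;
        if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
            flatten(value, column, out);   // recurse into nested objects (the 'parents')
        } else {
            out[column] = value;           // leaf value becomes a .csv cell
        }
    }
    return out;
}

const message = { object: { type: 'account', id: { name: 'admin' } }, success: true };
console.log(flatten(message));
// { 'object.type': 'account', 'object.id.name': 'admin', success: true }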


 

.csv Generation: column_headers.json stores column headers





  • Once the logs are downloaded, the message JSON is parsed to identify all nested attributes

  • All attributes are stored in a ‘column_headers.json’ file

  • These dynamically generated attributes are sorted alphabetically making the .csv easier to read


 

  • This file means:

    • The number of columns generated in the .csv files never shrinks

      • Prevents a problem where future downloaded logs no longer contain previously identified attributes

      • In turn, this means 3rd party tools won’t complain of missing columns



    • If generating files from multiple BTP subaccounts, the .csv files will be consistent across all subaccounts. All .csv files will contain the same column headers, and in the same order

      • This is ideal if combining audit logs from multiple subaccounts



    • You can optionally customise the headers, either by re-ordering or removing headers you don’t want or need



  • The sample script provides options to disable the automatic update and re-ordering of dynamically generated columns

  • Deleting the column_headers.json is safe if you’re OK with losing previously identified JSON attributes

  • The number of attributes identified will be highly dependent on the BTP services you make use of
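
The idea behind the file can be sketched as a simple ‘merge and persist’ of column names, so the set of columns only ever grows (the file name is from the blog; the code itself is illustrative, not the sample's):

// Illustrative: keep a growing, alphabetically sorted list of column headers on disk.
const fs = require('fs');
const HEADER_FILE = 'column_headers.json';

function updateColumnHeaders(newColumns) {
    let known = [];
    if (fs.existsSync(HEADER_FILE)) {
        known = JSON.parse(fs.readFileSync(HEADER_FILE, 'utf8'));
    }
    // union of previously known and newly discovered columns – it never shrinks
    const merged = Array.from(new Set([...known, ...newColumns])).sort();
    fs.writeFileSync(HEADER_FILE, JSON.stringify(merged, null, 2));
    return merged;
}

console.log(updateColumnHeaders(['message', 'object.type', 'success']));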


.csv Generation: Parent header columns automatically removed





  • All the JSON attributes are identified, including ‘parents’ since the JSON is nested

  • This means the number of dynamically created columns will easily exceed 100

  • However, the parent columns are typically not needed since their children contain all their attributes

  • The sample script thus automatically removes all parents (except for the original ‘message’, column K here)

    • This significantly reduces the number of columns making the file more manageable

    • Keeping the main original ‘message’ means that any attributes not captured, because the option to ‘automatically update’ the columns has been disabled, are not lost. You also get to keep the raw data



  • The script provides options to disable the ‘automatic removal of all parents’

    • The screenshot of this file was taken with the ‘automatic removal of all parents’ option disabled, just so you can see an example of parents in both the .csv (within Microsoft Excel) and column_headers.json






.csv Generation: Exception for NEO



  • The ‘Message’ property (J) for Audit Logs on NEO is different from the ‘message’ property on Cloud Foundry:

    • The value as a whole is not valid JSON as it is on Cloud Foundry; instead, it contains valid JSON embedded amongst the text

    • By default, an attempt is made to extract all the valid JSON and store this in a new column ‘message’ (column K)

    • This extracted JSON is then processed, just as described earlier, into a column for each attribute (from column K into L, M, N, …)







  • Since the original Message (J) is poorly formed, during the extraction of valid JSON an attempt is made to identify the ‘name’ for the JSON ‘value’

  • In the example ‘Auditlog Retrieval API’ is found before the valid JSON and this is used as the ‘name’ for that JSON ‘value’

  • The name is taken from the text preceding the JSON, between the first two quotes (“) found, with spaces replaced by underscores (_) since a name can’t contain spaces

    • The “%void” is not selected since it contains a special character (%), though a very simple modification to the regular expression in the script would allow it



  • This is only applicable for NEO Audit Logs and this extra feature can easily be disabled if required. Consider this feature ‘experimental’





  • If the .json file is generated, then it will also have this extra column in it, meaning it has both a ‘Message’ (J) and a ‘message’ (K), which is handy for any subsequent JSON processing
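
A greatly simplified sketch of the kind of extraction involved, for illustration only (the sample's actual regular expressions and edge-case handling differ):

// Greatly simplified sketch: pull the first {...} block out of a NEO 'Message'
// string and guess a name for it from the quoted text that precedes it.
function extractNeoJson(messageText) {
    const jsonStart = messageText.indexOf('{');
    const jsonEnd   = messageText.lastIndexOf('}');
    if (jsonStart < 0 || jsonEnd <= jsonStart) return null;

    let value;
    try {
        value = JSON.parse(messageText.slice(jsonStart, jsonEnd + 1));
    } catch (e) {
        return null; // no valid JSON embedded in the text
    }

    // name = text between the first two quotes before the JSON, spaces -> underscores;
    // the character class deliberately excludes specials such as '%'
    const nameMatch = messageText.slice(0, jsonStart).match(/"([A-Za-z0-9 ]+)"/);
    const name = nameMatch ? nameMatch[1].replace(/ /g, '_') : 'value';

    return { [name]: value };
}

const sample = 'Read attempt for "Auditlog Retrieval API" with result {"status":"ok"}';
console.log(extractNeoJson(sample)); // { Auditlog_Retrieval_API: { status: 'ok' } }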


.csv File structure





  • The .csv files are not actually comma (,) separated since the data contains commas too

    • Using a comma would make the file inconsistent

    • Instead, a hash (#) character is used, which solves this problem



  • So the file can easily be opened in Microsoft Excel, a line ‘sep=#’ is added before the header

    • This instructs Microsoft Excel to open and read the file as you’d expect without any extra effort to specify how the file is structured

    • Just a double-click is needed to open the file and it will open and display just as you expect



  • Options are provided to change the file delimiter and to remove the ‘sep=’ line if required
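
For illustration, the resulting file layout can be sketched as follows (illustrative code, not the sample's; the header and row values are made up):

// Illustrative: write a '#'-delimited .csv with a 'sep=#' hint line for Excel.
const fs = require('fs');

function writeCsv(filename, headers, rows, delimiter = '#') {
    const lines = [`sep=${delimiter}`, headers.join(delimiter)];
    for (const row of rows) {
        // values may contain commas, which is why ',' is not used as the delimiter
        lines.push(headers.map(h => row[h] ?? '').join(delimiter));
    }
    fs.writeFileSync(filename, lines.join('\r\n'));
}

writeCsv('audit_log_sample.csv',
         ['time', 'user', 'message'],
         [{ time: '2023-02-01 10:15:00', user: 'admin', message: 'Read attempt, result: ok' }]);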


.csv/.json File structure: time format





  • Both .json and .csv files are created with a consistent time format, in the fixed time zone of your choice

  • Consistent for both NEO and Cloud Foundry


 



  • The Audit Log Viewer creates .json files with the time in a format that includes the relative time zone

  • The API (for both NEO and Cloud Foundry) returns the time in a format that includes the time zone, but it’s always UTC (+0)

    • This sample script uses the API to download the logs



  • The problems related to time zones are resolved by using a fixed time zone of your choice; the time is written adjusted accordingly, but with the time zone information removed (the T and Z parts)

  • The time value within the ‘message’ is not adjusted and remains unchanged (applicable for Cloud Foundry, as NEO does not have an extra ‘time’ within the ‘Message’)

  • For NEO Audit Logs the API returns the ‘time’ in a different format compared to Cloud Foundry. NEO has dots (.) instead of colons (:) between hours, minutes and seconds. This sample script normalises the time format so that any JavaScript ‘toDate()’ function works without further modification. This means that any post-processing of either the .json or .csv is likely to be easier

  • This consistency is needed to help consolidate logs across subaccounts and across Cloud Foundry and NEO platforms
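
A sketch of the kind of normalisation described above: NEO's dots become colons, the fixed offset is applied, and the T and Z markers are removed (illustrative only; the sample's handling is more thorough):

// Illustrative: normalise an API timestamp (always UTC) into the chosen fixed time zone.
// NEO returns e.g. '2023-02-01T10.15.00.000Z', Cloud Foundry '2023-02-01T10:15:00.000Z'.
function normaliseTime(apiTime, offsetHours = 0, offsetMins = 0) {
    // NEO uses dots between hours/minutes/seconds; swap them for colons first
    const isoLike = apiTime.replace(/T(\d{2})\.(\d{2})\.(\d{2})/, 'T$1:$2:$3');

    const utc = new Date(isoLike);
    const shifted = new Date(utc.getTime() + (offsetHours * 60 + offsetMins) * 60 * 1000);

    // drop the 'T' and 'Z' so the value reads as a plain timestamp in the chosen zone
    return shifted.toISOString().replace('T', ' ').replace('Z', '');
}

console.log(normaliseTime('2023-02-01T10.15.00.000Z', 1, 0)); // '2023-02-01 11:15:00.000'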


.csv/.json File structure: time order





  • The Cloud Foundry Audit Log Viewer creates .json logs in descending time order, newest to oldest

  • This sample script creates .csv and .json files sorted in ascending time order, oldest to newest


Designed for customisation






  • Change the default option and default [periods] option

    • Saving you the need to provide command line arguments



  • Change the folder & base filename to your preference

  • Change the friendly name of the SAP BTP Sub Account Service

    • This friendly name is used to form the filename



  • Duplicate these for each BTP Subaccount you have

    • Update the ‘environment’ reference within the JavaScript, which points to the exported environment.json, accordingly

    • Specify the Audit Log Retention Duration in Days

    • Specify the time zone hours and minutes in the environment too







  • If you’d prefer a non-dynamic filename (handy for scheduling jobs that read the file) then enable that here, with the filename of your choice

  • Generate json and/or csv files (both or just 1 is fine)

  • The ‘csv’ file delimiter

  • Add the Microsoft Excel ‘sep=’ line or not

  • Dynamically parse the JSON and generate columns automatically

  • Option to remove the ‘parent’ columns

  • Option to always keep the original ‘message’ column

  • Nicely order the dynamically created columns

  • Name of the file to store columns headers

  • If on NEO, try to identify the ‘name’ part in the message text for JSON values


Related recommendations and best practices





  • Use the sample script to download logs on a regular basis

  • Validate the sample script is working as expected

  • For others to benefit, share your experience and ideas via comments & likes to the blog





  • OAuth Client management

    • Keep the OAuth password top secret and change it on a regular basis (like all other OAuth passwords). Change the password every 90 days

    • The OAuth password is stored in the environment file, so access to this file must be restricted

    • For the NEO platform: grant only the API and scopes necessary




Installation & configuration


Follow the detailed step-by-step user guide but at a high level:

  • Download & install (30 mins)


  • Configure (30 mins)

    • Inside nodejs:

      • install newman (Postman libraries) with npm install -g newman

      • so the script can create files, link filesystem libraries with npm link newman



    • As an admin user, create a new OAuth Client inside BTP subaccount

    • With Postman app:

      • configure the Postman environment, including time zone, and validate the OAuth setup is correct

      • export the validated Postman environment



    • Configure the .js sample script and validate file references



  • Run the sample inside nodejs with node nodejs_4202_download_btp_audit_logs.js (60 mins)

    • Investigate and validate various options available




Frequently asked questions


Q: Is there an OData API to the audit log and can I use SAP Analytics Cloud to query the audit log


A:

  • No, not directly. The sample script uses a REST API to download the audit logs; there is no OData API that SAP Analytics Cloud or Data Warehouse Cloud can consume

  • This sample script creates .csv files that are perfectly formed, meaning the original nested JSON ‘message’ turns into a column for each attribute, making them ideal to be read by other tools or services

  • Once the audit logs have been created, you’ll need to load them into a database unless you’re happy to use a client tool like Microsoft Excel

  • There are many options available to you that can read the .csv; these include SAP Analytics Cloud with its Cloud Agent, or a replication flow in SAP Data Intelligence, to load the logs into SAP HANA Cloud. If needed, edit the .js file and set use_fixed_filename=true so the agent can pick up the same filename each time

  • Another alternative is to use SAP Cloud Integration and load it directly into SAP HANA Cloud (related blog and blog)


 

Q: Where can I find official information about the audit log APIs


A:

 

Q: Is the Audit Log API available on the SAP NEO and Cloud Foundry platforms


A:

  • Yes, and this sample script works with both


 

Q: Is it possible to integrate other applications with the BTP Audit Log, for example can Cloud Application Programming (CAP) write logs to the Audit Log


A:

  • No, only SAP BTP Services designed to write to the Audit Log can write logs to it. There is no API or means to write custom logs to the Audit Log


 

Q: What is the Audit Log retention duration


A:

  • For Cloud Foundry, the Audit Log Retention is 90 days and there is no option to change this (see official documentation)

  • For NEO, the Audit Log Retention is 201 days and there is an option to change this (see official documentation and information for how to change the duration)


 

Q: What are the API endpoints for the BTP AuditLog and how do I get started


A:

  • Please refer to an earlier question/answer for the official API reference documentation

  • However, more practical step-by-step instructions are provided in the user guide for this sample

  • The user guide will take you through how to obtain all the necessary endpoints and how to use pre-built Postman Collections, enabling you to easily get started with the API. All the thinking has been done for you; for example, the Postman Collection includes all the necessary code to get an access token, fetch all the ‘pages’ of logs and handle errors. It’s an ideal starting point for any developer. Please follow the user guide and use this sample; since the Postman Collection provides a fully functioning solution, it will most likely answer almost all your questions. The troubleshooting section of the user guide provides details for which Postman Collection to run and how to run it


 

Q: Are the ‘columns’ of the Audit Logs consistent between NEO and Cloud Foundry



A:

  • No, the ‘columns’ are different, though some are similar in nature

  • This sample script can create .csv files that have the same column headers regardless of the platform hosting the Audit Log service; however, the sample isn’t really designed for this. For example, it would only be the ‘message’ column that would be populated for all audit logs, since this is the only column name that is shared. And this ‘message’ column on NEO is dynamically created by the sample when it extracts valid JSON from within the original ‘Message’

  • However, the sample script normalises the time format and the time zone across the NEO and Cloud Foundry platforms, which is a tiny step towards solving this problem


 

Q: For the logs written to the .json/.csv files, are they filtered in anyway


A:

  • No. The audit logs are not filtered in any way, except by time

  • This is unlike the Auditlog viewer available on Cloud Foundry, where the logs are limited to 5000 entries

  • For NEO, the API does support filtering on the ‘Category’. Please refer to the user guide for this sample to enable filtering on Category if you’d like the sample to generate files pre-filtered by Category – it’s very easy, just uncomment a line of code and export the Postman Collection

  • If you are using Cloud Foundry then the API doesn’t allow for the logs to be filtered in any other way, except by time. So you always get all the logs


 

Q: Is there a way to specify which audit events should be logged


A:

  • No. SAP decides which events are audited and captured in the audit log

  • The easiest way to determine if an event is logged or not is to perform the action and inspect the log

  • For Cloud Foundry more information on what is captured is available in the documentation


 

Q: How do the .json contents of the files compare between the manual download and this command-line download


A:

  • The contents of the file are almost identical

  • However, the .json files created by this sample script are different when compared to the manually downloaded option:

    • The logs are in reverse order compared to the manual download option. They are in ascending time order, oldest to newest

    • The time is in a fixed time zone of your choice and the time zone information is removed (the T and Z parts)

    • The time format between hours, minutes and seconds is without dots (.) but with colons (:), making any subsequent processing easier (it’s only NEO that uses dots); this means the time format is consistent across NEO and Cloud Foundry

    • For the NEO environment, if the sample script option ‘NEO_try_to_find_JSON_attribute_name_if_missing’ is enabled, then an extra JSON name/value pair will be added. The name will be ‘message’ and the value will contain the valid JSON that was found in the ‘Message’ value. This makes any subsequent processing of the JSON easier




 

Q: Is the sample solution secure


A:

  • The audit logs are downloaded over a secure HTTPS connection

  • Once the .json/.csv files are generated, it is for your organisation to secure them. The files contain personal data and this requires extra care to ensure compliance with local data privacy laws

  • The OAuth client id and secret (username/password) are stored as clear text in the environment .json file and so access to this file should be restricted


 

Q: When using the sample script to download the logs, why is the content of files different after I change the time zone setting


A:

  • Because entries in the logs will fall into different periods (days, weeks, months) depending upon the time zone of your choice. As you change the time zone setting, some of the log entries will fall into different periods of time and this will change the contents of the files created

  • It makes no difference where you run the script; the local time zone is ignored and only the time zone defined in the Postman Environment is used (see the user guide for more details)


 

For the developer - How the sample script works



Overview







        • The Postman environment is loaded into the JavaScript node.js container

        • The Postman collection is then called as a library function, passing into it an optional filter on time, ‘time_from’, to limit the volume of logs. The filter is passed within a request parameter to the SAP API. This reduces the number of logs and pages that need to be returned, but it also ensures all the required logs are returned, as the default is 30 days





        • The Postman collection makes requests to get an access token

        • Then the logs are downloaded, page by page, until they are all downloaded





        • After each page request, the data stream response is processed changing the timestamp to the desired time zone

        • If on NEO, an attempt is made to extract valid JSON from the ‘Message’ and any valid JSON is then added into a new ‘message’ property (column)





        • Once all the data has been downloaded successfully:

        • If the column_headers.json exists, it is read

        • The ‘message’ JSON is parsed to identify all attributes and it updates the column_headers

        • ‘Parents’ of the nested JSON are removed and the column headers are sorted before storing them into column_headers.json

        • The json & csv file(s) are created by splitting the logs based upon the desired period (day, week, month)

        • The success or failure is written to the console (errors can be piped) and the exit code is set








 





      • The Postman Collection manages all sessions, timeouts and most errors resolving and recovering occasional ‘wobbles’ when using the API. This improves overall stability and reliability. Unrecoverable errors are returned to the JavaScript via the ‘console’ event and then passed back to the main console allowing you to see any underlying unhandled API errors
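
To show how that surfaces in code, here is a sketch of listening to newman's 'console' and 'done' events so that messages logged by the Collection's scripts reach the main node.js console (placeholder file names again):

// Illustrative: surface console output from the Postman Collection scripts,
// so any unhandled API errors reported there reach the main node.js console.
const newman = require('newman');

newman.run({
    collection: require('./btp_audit_log_collection.json'),   // placeholder name
    environment: require('./btp_audit_log_environment.json')  // placeholder name
}).on('console', (err, args) => {
    if (!err) console.log(...args.messages);                   // messages logged by the scripts
}).on('done', (err, summary) => {
    process.exitCode = (err || summary.run.failures.length) ? 1 : 0;
});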






 