SAP Process Integration Test Tool (PIT) – What is new with SP 23?
As usual, shortly after a new SP release, here is your regular update on recent Process Integration Test Tool features coming with SP 23. You can expect a nice extension of the scheduled data extraction, better support for housekeeping and administration of the PIT system, and a first step towards Integration Flow – PIT integration.
With the previous SP 22 we introduced a feature that allows you to schedule a one-time test dataset extraction at a pre-defined time in the future. But what if you want to extract data on a regular basis, maybe every day or every month? This is now possible with a new SP 23 feature.
Open the ‘Add Dataset’ wizard in the ‘Test Data’ tab of the test case editor. By the way, the wizard was redesigned to provide a better user experience for other extraction use cases as well.
Provide a test dataset name and select a configuration object, as you already know from the past. Navigate with ‘Next’ to the second wizard page. Select the option ‘Periodic extraction at scheduled date’ in order to extract multiple test datasets over a period of time. Define the date and time at which the extraction job shall be executed using the schedule configuration.
In the next step you must define the relative time range from which the messages will be extracted. If desired, you can configure additional message filter options.
With the above settings you can, for example, define that the extraction runs on a daily basis in the evening and considers the messages of the whole day.
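The interplay of schedule and relative time range can be pictured as a simple window calculation. The following is a minimal sketch, not PIT's actual implementation; the function name and parameters are illustrative only:

```python
from datetime import datetime, timedelta

def extraction_window(run_time: datetime, range_hours: int):
    """Compute the (start, end) interval of messages a periodic
    extraction considers, looking back range_hours from the run time.
    Illustrative only -- not PIT's real logic."""
    return run_time - timedelta(hours=range_hours), run_time

# A job scheduled daily at 22:00 with a 24-hour relative range covers
# the messages of the whole previous day up to the moment it runs.
start, end = extraction_window(datetime(2023, 5, 2, 22, 0), 24)
```

Because the range is relative to each run, every execution of the scheduled job automatically picks up a fresh slice of messages.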
As soon as you finish the wizard, a scheduled task of type ‘Periodic Extraction’ is created and becomes visible in the ‘Scheduled Tasks’ view.
The ‘Test Data’ tab in the test case editor shows a new periodic extraction object, symbolized with a double-envelope icon. A periodic extraction acts as a bracket around the periodically extracted test datasets. Over time, each newly extracted test dataset becomes visible underneath the corresponding periodic extraction node. The name of each test dataset consists of the periodic extraction name specified during creation, followed by the date of its extraction.
Double-click a periodic extraction object to open the periodic extraction viewer which shows details of the object, its schedule settings, and the list of extracted test datasets.
Additionally, each periodic extraction object has a dataset pointer which acts as a placeholder for a test dataset. A dataset pointer can be selected in a test case run configuration instead of a real test dataset. So far, only the dataset pointer of type LATEST is supported; it provides the most recent data that has been extracted in a periodic extraction. The LATEST dataset pointer is automatically updated by the runtime after each new test dataset extraction.
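Conceptually, resolving the LATEST pointer just means picking the dataset with the most recent extraction date from the bracket. A minimal sketch under that assumption (the data structure and names are hypothetical, not PIT's internals):

```python
from datetime import date

def resolve_latest(datasets):
    """Resolve a LATEST dataset pointer: return the test dataset with
    the most recent extraction date. Illustrative sketch only."""
    if not datasets:
        raise LookupError("no test dataset has been extracted yet")
    return max(datasets, key=lambda d: d["extracted_on"])

# Datasets named after the periodic extraction plus their extraction date:
datasets = [
    {"name": "DailyOrders 2023-05-01", "extracted_on": date(2023, 5, 1)},
    {"name": "DailyOrders 2023-05-02", "extracted_on": date(2023, 5, 2)},
]
latest = resolve_latest(datasets)
```

A run configuration referencing LATEST therefore never needs to be touched when new data arrives.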
A periodic extraction opens up new possibilities for extracting and tailoring your test datasets. In the runtime, messages are typically available only for a short time, often only for one day after they have been processed successfully. You must monitor the data closely and extract it in time if you want to capture different message flavors.
Now you can schedule a periodic extraction and fetch the data regularly from the system, so that messages are available in PIT (where they are not deleted automatically). At a later point in time, you go through your extracted test datasets and merge a selected subset of the extracted messages into one or more new test datasets. For example, you can create a test dataset containing messages triggering different routing conditions, or a test dataset containing only huge messages. Afterwards the original test datasets can be deleted, leaving only your hand-selected and tailored test datasets.
Also, the combination of scheduled periodic extraction and scheduled run configuration with LATEST pointer opens a completely new way of testing by always using fresh data for your test.
Automatic Cleanup Using Retention Policy
Another new feature available with SP 23 helps you to keep your PIT system tidy and to clean up obsolete data.
If you have many tests which are executed regularly over a longer period, you will notice a continuous growth of the database space used for execution data. While some test data is essential and must be kept for years, e.g. for governance reasons, other data may become obsolete after a short period of time and can be removed without any loss. Of course, it is also possible to delete results manually if they are not needed anymore. But the cleanup task is often, let’s say, very unpopular, and so obsolete data accumulates in the system over time.
A retention policy allows you to specify how long the execution data must be kept before it can be deleted automatically. A retention policy is assigned to a scheduled task and applies to all execution and verification tasks produced by it. If a run configuration is executed without a scheduled task, the results get the default retention policy assigned.
The actual deletion of data is executed as soon as an administrator schedules the PIT Delete Job in the NWA Job Scheduler. Alternatively, re-execute the updated CTC template ‘PIT Test Tool’, which would schedule the delete job as well. We recommend scheduling the PIT Delete Job on a regular basis.
Once the PIT Delete Job is running, it checks if there are tasks whose retention time has already expired according to the assigned policy. If so, the job automatically deletes the task including all its messages, error messages, comparison results, etc. The job log in NWA as well as the audit log show details such as how many tasks of which type were deleted.
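The expiry check the delete job performs can be sketched as a small predicate. This is an assumption-laden illustration, not the actual PIT code; in particular, modeling "unlimited retention" as `None` is my own simplification:

```python
from datetime import date, timedelta
from typing import Optional

def is_expired(finished_on: date, retention_days: Optional[int], today: date) -> bool:
    """True if a task's retention time has expired under its policy.
    retention_days=None models a policy that retains data for an
    unlimited time frame (nothing is ever deleted automatically)."""
    if retention_days is None:
        return False
    return finished_on + timedelta(days=retention_days) < today

# A task finished on 2023-01-01 under a 90-day policy is expired by June.
expired = is_expired(date(2023, 1, 1), 90, date(2023, 6, 1))
```

Tasks under the delivered ‘Default Policy’ would correspond to the unlimited case and are never picked up by the delete job.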
You can manage retention policies in the Retention Policy Overview (go to menu Process Integration -> Retention Policy Overview). Right after delivery, only the retention policy ‘Default Policy’ exists in the system; it is configured to retain all data for an unlimited time frame, which means data will not be deleted automatically. The default policy applies in all cases where no retention policy is explicitly assigned.
You can create new retention policies and edit existing ones. Only users with the action edit_local_retention_policies assigned are allowed to manage retention policies. By default, the action is assigned only to the role SAP_PIT_ADMINISTRATOR.
Create a new retention policy in the Retention Policy Overview using the ‘New Retention Policy’ button. Specify a name and how the data shall be handled – either define a time frame for which the data is retained, or decide that data may never be deleted.
When you create a new scheduled task, you can assign a retention policy to it. The assigned policy will apply to all tasks produced by this schedule.
Alternatively, you can assign a retention policy to an existing scheduled task in the Scheduled Tasks view. Select the task and choose ‘Change Retention Policy Assignment’.
Keep in mind that retention policies are evaluated only during the automatic deletion of data. Manual deletion of execution and verification tasks is allowed at any time, independent of the retention time frame.
Sometimes you might not want to delete your results but rather have the contrary requirement to protect them from being deleted, either automatically or manually. You now have the possibility to protect single jobs from deletion. The protection applies to automatic cleanup as well as to manual deletion. As long as a job is protected, it may not be deleted. The protection is also valid along the hierarchy – if a verification job is protected, the corresponding execution job cannot be deleted either.
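The hierarchy rule can be illustrated with a small check: a job is deletable only if neither it nor any dependent job is protected. Again a hypothetical sketch with invented field names, not PIT's implementation:

```python
def can_delete(job, all_jobs):
    """A job may be deleted only if neither the job itself nor any job
    depending on it (e.g. a verification job that references an
    execution job) is protected. Illustrative sketch only."""
    if job["protected"]:
        return False
    dependents = [j for j in all_jobs if j.get("parent") == job["id"]]
    return all(not d["protected"] for d in dependents)

jobs = [
    {"id": "exec-1", "protected": False},                     # execution job
    {"id": "ver-1", "parent": "exec-1", "protected": True},   # protected verification job
    {"id": "exec-2", "protected": False},                     # unrelated, unprotected job
]
```

Here `exec-1` cannot be deleted even though it is not protected itself, because its verification job is.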
You can protect a single job by selecting ‘Protect’ from the context menu in the job browser or in the action log.
You can also protect all (future) executions of a scheduled task if you enable protection in the scheduled task during creation.
Protection for an already existing, not yet finished scheduled task can also be configured from the Scheduled Tasks view. Select ‘Change Retention Policy Assignment’ from the context menu and enable the checkbox ‘Mark all future job executions as protected’ in the dialog.
Retention policy and protection come with a fine-granular permission concept, which can be briefly summarized as follows: only an administrator is allowed to create and edit retention policies, each user is allowed to assign retention policies to their own scheduled tasks, and each user is allowed to enable or disable protection for their own jobs and scheduled tasks. Please consult the documentation if you want to know more about permissions.
Create Test Case from Integration Flow
Last but not least, an interesting SP 23 feature is a new wizard which allows you to create a PIT Test Case directly out of an Integration Flow in the SAP Integration Designer. A prerequisite is that a connection to a PIT system is already configured in your NWDS and that the system is maintained as a test system in PIT.
In the SAP Integration Designer, select an Integration Flow in the PI Explorer and choose ‘Create Test Case’ from the context menu.
On the first wizard page, select the PIT system you want to connect to from the list of available systems and provide a user name and password (the user must have PIT permissions assigned).
Now you only need to enter a test case name and choose the corresponding pattern. Press ‘Finish’ to create a new Test Case in PIT. Afterwards you can switch to the Process Integration Test perspective and continue with the test configuration, like adding further configuration objects, extracting data, etc.
Please keep in mind that the test system representing your Integration Flow must already exist in PIT. The matching is done based on the system ID only.
More information about PIT can be found on the SAP Help Portal and in other blog posts of the PIT series: