Housekeeping Jobs: SAP BPC NW
Performance is a key factor in SAP BPC. For each transaction we perform in the system, many tables are updated with logs for that task.
Over a period of time these tables become very large and consume a lot of space; apart from consuming space, they also have a negative impact on performance.
Hence we need to archive or delete the log data after a certain period.
This post documents all housekeeping jobs in one place.
1. BPC Statistics House Keeping:
Always switch off the parameter BPC_STATISTICS in the SPRO settings after a BPC performance statistics trace.
Execute the program UJ0_STATISTICS_DELETE via transaction code SA38/SE38 to delete obsolete statistics, or schedule it as a background job as per note 1648137.
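Besides SM36, the cleanup job can also be created programmatically. A minimal sketch, assuming a selection variant for UJ0_STATISTICS_DELETE has already been saved in SE38 (the variant name BPC_STAT_DEL is hypothetical):

```abap
DATA: lv_jobname  TYPE btcjob VALUE 'BPC_STAT_CLEANUP',
      lv_jobcount TYPE btcjobcnt.

* Open a background job definition
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* Add the statistics deletion report as a job step,
* using a previously saved selection variant (name is an assumption)
SUBMIT uj0_statistics_delete
  USING SELECTION-SET 'BPC_STAT_DEL'
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

* Release the job: start tonight at 02:00 and repeat daily
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    sdlstrtdt = sy-datum
    sdlstrttm = '020000'
    prddays   = 1.
```

In practice most teams simply define the same job interactively in SM36; the sketch is useful when the scheduling itself has to be automated.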
2. UJBR Backup and Restore:
As a best practice, we should take a backup with the UJBR transaction code once a week.
We can schedule this job weekly, or if you have more than one environment and want to run backup jobs for all of them, you can create a process chain with the program UJT_BACKUP_RESTORE_UI.
By selecting the “Execute Backup” radio button we can take a full backup of the environment. In the event of data loss for any reason, we can restore the environment by selecting the “Execute Restore” button.
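For a weekly schedule without a process chain, the backup report can also be put into a periodic background job. A minimal sketch, assuming a variant for UJT_BACKUP_RESTORE_UI has been saved with “Execute Backup” and the environment pre-selected (the variant name WEEKLY_BKP is hypothetical):

```abap
DATA: lv_jobname  TYPE btcjob VALUE 'BPC_WEEKLY_BACKUP',
      lv_jobcount TYPE btcjobcnt.

* Open a background job definition
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* Add the backup report as a job step; the variant must have
* "Execute Backup" and the environment pre-selected (assumed name)
SUBMIT ujt_backup_restore_ui
  USING SELECTION-SET 'WEEKLY_BKP'
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

* Release the job: start tonight at 23:00 and repeat every week
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    sdlstrtdt = sy-datum
    sdlstrttm = '230000'
    prdweeks  = 1.
```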
3. Transaction Data Exports:
With UJBR we take a full backup weekly, but usually we work mostly on one category (Plan/Forecast). Often we need to restore a particular category's data for a month or two for some selections, but UJBR restore has no option to restore a particular set of data.
If we take exports and save them on the server, we can import data based on our selections.
With the “Export Transaction Data to File” data manager package (DMP) we can export the data.
We can import the data with one of the below two DMPs, per our requirement:
a. Import Transaction Data (Aggregate Overwrite Mode)
b. Import Transaction Data (Last Overwrite Mode)
4. Lite Optimization:
The Lite Optimization process moves transaction data from the F fact table to the E fact table, apart from other activities (rollup, statistics update, closing the open requests).
It should be scheduled every night during off-business hours; it improves query performance.
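The nightly run can be set up in SM36, or the lite optimize process chain can be kicked off from a small wrapper report via the standard RSPC API, as sketched below. The chain name used here is the delivered template; if you work with a copy, take its technical name from transaction RSPC:

```abap
DATA lv_logid TYPE rspc_logid.

* Start the lite optimize process chain; the technical name below
* is the delivered template - verify your chain's name in RSPC
CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain = '/CPMB/LIGHT_OPTIMIZE'
  IMPORTING
    e_logid = lv_logid.

WRITE: / 'Process chain started, log id:', lv_logid.
```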
5. Zero Elimination:
We can switch on zero elimination in the /CPMB/LIGHT_OPTIMIZE process chain.
Alternatively, we can check the “With Zero Elimination” check box in the cube's Manage tab in the BW system.
The first option applies to all the models in a system, but the second option is for a specific cube (model).
If zero elimination is not switched on for any reason and you want to eliminate zero records from the system, you may use the program RSCDS_NULLELIM.
6. Audit Tables House Keeping:
In most cases we have audit logs enabled for administration activity and user activity.
These tables do not purge automatically based on the purge frequency we have given.
We need to schedule the DMP “BPC: Archive Data” (/CPMB/ARCHIVE_DATA) regularly; based on the purge frequency given in the audit functionality, audit data moves from the audit data table to the archive table.
From the archive table we can delete the data with the program UJU_DELETE_AUDIT_DATA.
For administration activity logs we need to use the DMP “BPC: Archive Activity” (/CPMB/ARCHIVE_ACTIVITY).
This DMP moves data from UJU_AUDACTDET to UJU_AUDACTDET_A and from UJU_AUDACTHDR to UJU_AUDACTHDR_A, based on the selection given in the DMP. We can delete the data from UJU_AUDACTDET_A and UJU_AUDACTHDR_A with the help of the SE14 functionality.
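SE14 clears a table completely; the same full clean-out of the two archive tables can be sketched as a small throwaway report. Note there is no selection at all here, so every archived record is removed; keep an offline copy first if the history is still needed (the report name Z_CLEAR_BPC_AUDIT_ARCHIVE is hypothetical):

```abap
REPORT z_clear_bpc_audit_archive.

* Delete ALL archived administration-activity audit records.
* No WHERE clause: both archive tables are emptied completely.
DELETE FROM uju_audactdet_a.
DELETE FROM uju_audacthdr_a.
COMMIT WORK.

WRITE: / 'Archive tables UJU_AUDACTDET_A / UJU_AUDACTHDR_A cleared.'.
```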
7. Comments and Journals House Keeping:
If comments are enabled and journal entries are used, you may use the DMPs “BPC: Clear Comments” (/CPMB/CLEARCOMMENTS) and “BPC: Clear Journal Tables” (/CPMB/CLEAR_JOURNALS).
8. Application Logs House Keeping:
With the help of the program SBAL_DELETE or transaction SLG2 we can delete application logs which are older than one year, or per our requirement.
9. UJF_DOC, UJF_DOC_CLUSTER and UJD_STATUS Tables:
The UJF_DOC table contains transformation files, conversion files, script logic and other documents, as well as the flat files generated by export jobs, the files uploaded for import jobs, and the logs generated by DMP executions.
a. We can delete the unwanted files and reports from the Data Manager/EPM tabs in Excel.
b. The UJF_DOC_CLUSTER and UJD_STATUS tables contain DMP execution log details. The programs UJF_FILE_SERVICE_DLT_DM_FILES and UJF_FILE_SERVICE_CLEAN_LOGS can be used to delete the data from these tables.
c. Even if you select “Script Logic logs” for the UJF_FILE_SERVICE_CLEAN_LOGS program, the logs with the suffix “.lgx” under the folder /root/webfolders/<environment id>/adminapp/<model id> are not deleted.
We need to implement note 2581931 (“Add feature for cleaning script logic logs”) to fix this problem.
d. We can also delete the entries manually with the UJFS transaction code.
10. Work Status Tables:
Sometimes we will have entries in the system for obsolete transaction data. (For example, you locked data for the year 2010 and later deleted the transaction data, but the work status table still contains the entries for 2010.)
Implement note 2053697 and run the program UJW_WS_TEST. Be careful: if you don't give a selection here, all work status entries will be deleted.
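Because an empty selection wipes every work status entry, a cautious way to run the report is to always go through its selection screen rather than a saved variant. A minimal sketch:

```abap
* Force the selection screen of UJW_WS_TEST so the data region
* must be entered consciously - an empty selection would delete
* ALL work status entries
SUBMIT ujw_ws_test VIA SELECTION-SCREEN AND RETURN.
```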
References (SAP Notes):
1. 1470209 – BW report RSCDS_NULLELIM on InfoCube without time dimension
2. 1934038 – Housekeeping of table UJ0_STAT_DTL
3. 1705431 – Planning and Consolidation 10.0 NW – House keeping
4. 195157 – Application log: Deletion of logs
5. 1908533 – BPC File Service Cleanup Tool
6. 2053697 – ABAP report to remove obsolete work status for data region
7. 2581931 – Add feature for cleaning script logic logs