
Introduction:


How many times has a user come to you and told you something is wrong with a report or a dashboard, and when you went to check on the source InfoProviders you found out that they had failed for some reason?

In a fast-paced environment that is highly dependent on BW, you cannot afford to be informed by users that something went wrong. You need to be in full control of your environment and know immediately when something fails, so that you can minimise the amount of work needed to get everything back to normal.

Monitoring process chains can be easy when there are only a few of them or they depend on each other: you can set up the chain to send you an email when it fails. But what if you made some modifications, forgot to reschedule it, and only noticed when it was too late? And what if your environment has too many 3.x data flows, where the scheduling happens within the Infopackage itself, scattered all over the InfoAreas in your system?

In my opinion, the best way to monitor the loading jobs in your system is by tracking the execution of the Infopackages and DTPs.

In this post I will walk you through developing a Web Intelligence report that tells you the last loading time and the last status of every Infopackage and DTP in your system.

The result will be a PDF sent to you daily via email, as in the screenshot below:


Step by step guide:


Used Tables:


Table      Type           Use
RSREQDONE  Transactional  Main table; contains all the Infopackage and DTP requests in the system with their execution time and status.
RSLDTDONE  Texts          Contains Infopackage texts.
RSBKDTPTH  Texts          Contains DTP texts.
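
If you want a quick look at the raw data before building anything on top of it, a few lines of ABAP are enough. This is only a sketch I am adding for illustration (ZREQ_PEEK is a made-up program name; the field names are the ones used in the transformation mappings later in this post):

REPORT zreq_peek.

* Sketch: list the load requests of the last two weeks from RSREQDONE.
DATA: lt_req  TYPE STANDARD TABLE OF rsreqdone,
      ls_req  TYPE rsreqdone,
      lv_from TYPE sy-datum.

lv_from = sy-datum - 14.

SELECT * FROM rsreqdone
  INTO TABLE lt_req
  WHERE datum GE lv_from.

SORT lt_req BY datum DESCENDING.

LOOP AT lt_req INTO ls_req.
* LOGDPID holds the Infopackage/DTP ID, TSTATUS the status icon code.
  WRITE: / ls_req-logdpid, ls_req-datum, ls_req-tstatus, ls_req-records.
ENDLOOP.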


Create generic data sources based on the 3 tables mentioned above:



  1. Go to tcode RSO2. You will most probably need an access key; ask your Basis team to provide one.

  2. You just need to enter the table name; the structure will be fetched automatically once you click Save.




Once you click Save, you will be taken to a screen where you can choose the selection fields. Pick the fields DATUM and LOGDPID; these are the date and the Infopackage technical ID fields.



Note that you can pick anything else you find helpful.

Repeat the steps above for the remaining 2 text tables.
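
Before building on top of a freshly created data source, you can sanity-check the extraction in tcode RSA3 (the Extractor Checker) and verify that records come back.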

Creation of target infoproviders:


I created InfoObjects and key figures for the fields I thought would be useful now or in the future.

Characteristic InfoObjects:


Technical Name  Description            Type      With Texts
ZDTP            Data Transfer Process  CHAR(30)  Y
ZINFOPACK       Infopackage            CHAR(30)  Y
ZTGTSTATE       Status                 CHAR(4)
ZSRC            Datasource             CHAR(45)
ZUSERNAME       User Name              CHAR(12)
ZTIMSTRT        Timestamp Start        TIMS(6)
ZTIMEND         Timestamp End          TIMS(6)


Keyfigures:


Technical Name  Description                  Type
ZLNSTRANS       Number of lines transferred  Number

Target Infocube


I chose to make a virtual InfoCube based on a DTP, so that it doesn’t depend on any loading process: each time the query is executed, it fetches the data from the data source.

The structure is as follows:


Transformations:


Masterdata Texts:


Infoobject: ZINFOPACK
Datasource: ZINFOPACK_TEXT (Table: RSLDTDONE)
Mapping:
LOGDPID -> ZINFOPACK
LANGU -> 0LANGU
TEXTIP -> 0TXTLG

Infoobject: ZDTP
Datasource: ZDTP_TEXT (Table: RSBKDTPTH)
Mapping:
LOGDPID -> ZDTP
LANGU -> 0LANGU
TEXTIP -> 0TXTLG

Transactional, datasource ZINFOPACK_STATUS (Table: RSREQDONE):


Infoobject: ZINFOPACK
Mapping: LOGDPID -> ZINFOPACK
Routine:
IF SOURCE_FIELDS-LOGDPID NP '*DTP*'.
  RESULT = SOURCE_FIELDS-LOGDPID.
ELSE.
  RESULT = ''.
ENDIF.
Comment: LOGDPID contains requests for both Infopackages and DTPs; I wanted to separate them.

Infoobject: ZDTP
Mapping: LOGDPID -> ZDTP
Routine:
IF SOURCE_FIELDS-LOGDPID CP '*DTP*'.
  RESULT = SOURCE_FIELDS-LOGDPID.
ELSE.
  RESULT = ''.
ENDIF.
Comment: DTPs are always prefixed with DTP_. CP ("covers pattern") matches a pattern, the equivalent of LIKE in SQL, and NP is its negation.

Infoobject: 0CALDAY
Mapping: TIMESTAMPBEGIN -> 0CALDAY

Infoobject: ZTIMSTRT
Mapping: TIMESTAMPBEGIN -> ZTIMSTRT
Routine:
DATA: TIM TYPE C LENGTH 15.
TIM = SOURCE_FIELDS-TIMESTAMPBEGIN.
RESULT = TIM+8(6).
Comment: Timestamp truncation doesn’t happen automatically. A timestamp has the form YYYYMMDDhhmmss, so TIM+8(6) keeps the six time digits hhmmss.

Infoobject: ZTIMEND
Mapping: TIMESTAMPEND -> ZTIMEND
Routine:
DATA: TIM TYPE C LENGTH 15.
TIM = SOURCE_FIELDS-TIMESTAMPEND.
RESULT = TIM+8(6).

Infoobject: ZTGTSTATE
Mapping: TSTATUS -> ZTGTSTATE

Infoobject: ZLNSTRANS
Mapping: RECORDS -> ZLNSTRANS

Infoobject: ZSRC
Mapping: ODSNAME -> ZSRC
Routine:
DATA: LEN TYPE N LENGTH 2.
LEN = STRLEN( SOURCE_FIELDS-ODSNAME ).
LEN = LEN - 3.
RESULT = SOURCE_FIELDS-ODSNAME(LEN).
Comment: Removes the trailing _MA from the end of the datasource’s name. Provides the datasource for the Infopackages only.

Infoobject: ZIPAKSTAT
Mapping: QMSTATUS -> ZIPAKSTAT
Comment: Can be skipped; use TSTATUS for both InfoObjects.

Infoobject: ZUSERNAME
Mapping: QMUNAME -> ZUSERNAME
Comment: Provides the user name for the DTP execution only.
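
If you have not written field routines before, here is roughly where a snippet like the ZDTP routine lives. The surrounding frame is generated by BW when you create the routine (method and parameter names vary by system and release, so treat this as a sketch); you only write the part between the markers:

METHOD compute_zdtp.
*   The signature (REQUEST, DATAPACKID, SOURCE_FIELDS, RESULT, ...)
*   is generated by BW for each field routine.

*$*$ begin of routine - insert your code only below this line *-*
    IF source_fields-logdpid CP '*DTP*'.
      result = source_fields-logdpid.
    ELSE.
      result = ''.
    ENDIF.
*$*$ end of routine - insert your code only above this line *-*
  ENDMETHOD.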


BEx Query


Let’s start off with the query restrictions, as shown below:



  • E4385 is my username, as I don’t want to see anything executed by me.

  • Target status: here is what each code means:


Code     Meaning
@08@@    Successful processing
@08\     Successful processing – Inactive request
@09@@    Waiting
@09\     Waiting – Inactive request
@0A@@    Error
@0A\     Error – Inactive request


We don’t need to see the inactive requests, hence I excluded @08\, @09\ and @0A\. (If you ever need these icon codes as plain text, there is a small sketch after this list.)

  • Finally, this is the most important filter, since the provider is virtual: I chose to retrieve all the days from today until 2 weeks back. I chose 2 weeks in case I was on vacation or in case I didn’t notice something; it’s good to have yourself buffered.
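
Since the status travels around as an icon code, a tiny ABAP sketch like the one below is all it takes to turn the codes from the table above into plain text, should you ever need that outside the report. This is purely illustrative; LV_TGTSTATE is a hypothetical variable holding a ZTGTSTATE value:

* LV_TGTSTATE: hypothetical variable holding the TSTATUS icon code.
DATA: lv_tgtstate    TYPE c LENGTH 5,
      lv_status_text TYPE string.

CASE lv_tgtstate.
  WHEN '@08@@'. lv_status_text = 'Success'.
  WHEN '@09@@'. lv_status_text = 'Waiting'.
  WHEN '@0A@@'. lv_status_text = 'Error'.
  WHEN OTHERS.  lv_status_text = 'Inactive or unknown'.
ENDCASE.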


This is the query structure:


Finally, the Web Intelligence Report:


The main aim of all of this was to create something very simple that I could go through every morning in a couple of seconds to spot if anything was wrong.

Trying to simplify the output as much as I could, I chose these fields: technical ID, description, last loading status and last loading date for each package, plus the user name field for the DTPs, to make sure they were not test activities done by BW developers. (This can be avoided by filtering on BWALEREMOTE and UNASSIGNED in the BEx query; filtering on UNASSIGNED will be required, as all the Infopackages come with the user UNASSIGNED.)

The report has 2 sheets, one for the Infopackages and another for the data transfer processes, as shown below:


There are 2 simple tricks:

First, converting the status codes into images. This was done through conditional formatting rules: we need to create 3 of them, one for the successful status, one for warnings and one for errors. A prerequisite is taking a screenshot of the 3 icons from the load monitor within BW (you can use any images, of course). This is how the formatting rule will look:

A step common to all 3 rules: click Format, align the text right and make its colour white.

To make sure all 3 rules are applied, go to the status cell, then navigate to Analysis -> Conditional -> Formatting Rules.

The second trick is showing the max date for each Infopackage or DTP regardless of its status. Let me clarify this point.

Suppose an Infopackage got stuck in the wait status, or failed and you then ran it manually and it succeeded on the same day. You will have 3 entries on the max date for this Infopackage, each of them a unique row, as each represents the max date for one status. If this is readable to you, you can leave it as it is; however, it is easier to skim and scan through all the records quickly if you eliminate the duplicates. These are the steps to filter them out:

  1. Sort the table by the Infopackage ID, ascending or descending, however you like.

  2. Make a variable measure with the formula below:

=If [Infopackage - Key] <> Previous([Infopackage - Key]) Then Max([Calendar day]) ForEach([Infopackage - Key];[Infopackage - Long Text];[Target Status]) Else 0

This will flag the duplicate records with a 0 instead of the max date (see the illustration after this list).

  3. Filter out the 0 values.
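
To illustrate with made-up rows (the Infopackage names and dates below are invented for this example), only the first row of each Infopackage keeps a date, while the repeated rows are flagged with 0 and can be filtered out:

Infopackage - Key  Target Status  Calendar day  Variable
ZPAK_SALES         @0A@@          14.01.2015    14.01.2015
ZPAK_SALES         @08@@          14.01.2015    0
ZPAK_STOCK         @08@@          13.01.2015    13.01.2015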


Note that you will need to create another measure for the DTP sheet and repeat the previous steps.

This works only with measures; I tried doing the same with a dimension, flagging those records with an X, but the filtering did not work.

Save the report on your BI platform and schedule it to be sent to you daily, at whatever time you want, in PDF format.

Done!! You are all set.

General Notes



  • Pay attention to the date: just because everything is green doesn't mean that nothing went wrong. If you notice a delay, go check that object out.

  • An alternative to filtering out the duplicates is sorting descending by status: you will have the red ones first, followed by yellow, then green. This is the safest way to spot the failures.


This report has proven to be very useful to me, so I thought I would share it with you. This community has guided me through a lot, and the best way to thank it is by contributing back. I hope I have made everything clear enough, and I hope it serves you as well as it has served me.