former_member192430
Applicability: SAP BW on HANA, BW/4HANA and Embedded BW solutions.

Business Case: Every BW system that runs an inventory solution tracking stock inflow and outflow also needs a daily inventory snapshot for each day. This is easy to manage when the data volume is small; however, in retail-like systems, where the volume is huge, it becomes imperative to store these snapshots efficiently (this mostly comes in handy when you are running out of compute capacity).

Example: Consider an article A that you sell in a store S. This article sells like hot cakes and needs to be replenished daily, so you receive article A in store S from a vendor or Distribution Center (DC) every day. This gives you four KPIs: the amount of article A delivered to the store (inflow), the amount of the article sold (outflow), the amount on hand at the start of the day (opening balance), and the amount left after the day closes, i.e. at End of Business (closing balance). Many more KPIs can be derived as well, but we will focus on these for now.
Today's opening balance is usually yesterday's closing balance for the article-site combination, so we only need to compute one of them, e.g. the closing balance, which tells us how much stock of an article is left after close of business on a particular day. This view of stock is called a snapshot. Now assume you have 100,000 articles and 500 sites: that is 50 million snapshot records for just one day, and if this data needs to be retained for 1 year you get 100,000 x 500 x 365 = 18.25 billion records.
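The balance relations and the record-volume math above can be sketched in a few lines (plain Python for illustration only; the movement figures are made up, not SAP data):

```python
# Illustration: daily snapshot balances for a single article-site
# combination, using the relations described above.

def closing_balance(opening, inflow, outflow):
    """Closing balance = opening balance + inflow - outflow."""
    return opening + inflow - outflow

# (inflow, outflow) per day for one article-site combination
movements = [(100, 80), (120, 90), (50, 110)]

opening = 0  # stock before day 1
snapshots = []
for inflow, outflow in movements:
    closing = closing_balance(opening, inflow, outflow)
    snapshots.append(closing)
    opening = closing  # today's closing is tomorrow's opening

print(snapshots)            # [20, 50, -10]

# Record volume from the example: articles x sites x days
print(100_000 * 500 * 365)  # 18250000000, i.e. 18.25 billion
```

This also makes the storage problem obvious: each day adds one full cross-section of article-site combinations, so the snapshot table grows linearly with retention.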

Current Scenario/Issue: Computing the inventory snapshot from an inventory target with non-cumulative key figures can, depending on the data volume, easily take hours for a full-fledged retail solution (50 million records per day in our example). Typically an APD is used to capture the BEx query output for the snapshot into a direct-update DSO; this process is both time- and resource-consuming, and even the most optimized APD run might take quite long to yield the output. We faced this particular issue after a BW technical upgrade to 7.4 SP18: the APDs stopped honoring parallelization and execution started consuming 2 hours.

Solution: We can use the algorithm below to push the entire BW/BEx-level computation down to HANA and capture today's snapshot for the whole inventory. In a few test runs we were able to compute 7+ million records in under 3 minutes, versus the 2+ hours the APD takes today on a BW on HANA system.

Background: Understanding this requires some basic knowledge of how SAP BW on HANA handles inventory (MM) cubes with non-cumulative key figures. Note that there is now just one fact table, and that same table is used by BEx for the snapshot computation on a particular day.

Algorithm:
1) Compute the reference day, i.e. the day on which the marker was last updated. [The RSDDSTATCOND table contains the latest information on cube compression, which normally also implies a marker update.]

2) Compute CALDAY, which will simply be yesterday's date (assuming you are capturing the closing balance).

3) The push-down logic is simple:
Snapshot on day(x) = take all article-site combinations that exist today and mash them up with the snapshot computation for each article-site combination, which yields data up to yesterday, adding the marker based on the inflow/outflow equation. The marker value can be obtained from the records flagged with marker value = 1 for the date extracted in step 1.

One can then sum it up, just as BEx does, to obtain the summation of (inflow - outflow + marker) from the marker date to yesterday's date.
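The steps above can be sketched as follows. This is plain Python standing in for the HANA SQLScript; the table layout, field names, and marker figures are illustrative assumptions, not the actual BW objects:

```python
# Hypothetical flat extract of the single fact table: one row per
# (article, site, day) with inflow and outflow quantities since the
# marker date.
fact_rows = [
    # (article, site, day, inflow, outflow)
    ("A1", "S1", 1, 100, 80),
    ("A1", "S1", 2, 120, 90),
    ("A2", "S1", 1, 30, 10),
]

# Marker values per (article, site), read from the reference day
# computed in step 1; 0 if no marker record exists.
markers = {("A1", "S1"): 40, ("A2", "S1"): 0}

def snapshot(fact_rows, markers, upto_day):
    """Closing balance per article-site as of upto_day:
    marker + sum(inflow - outflow) from the marker date onward."""
    balances = dict(markers)
    for article, site, day, inflow, outflow in fact_rows:
        if day <= upto_day:
            key = (article, site)
            balances[key] = balances.get(key, 0) + inflow - outflow
    return balances

# Snapshot as of day 2 (CALDAY = yesterday, from step 2)
print(snapshot(fact_rows, markers, upto_day=2))
# {('A1', 'S1'): 90, ('A2', 'S1'): 20}
```

In HANA this grouping and summation would be expressed as a single GROUP BY over the fact table, which is what makes the push-down so much faster than iterating in the application layer.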

Exceptions: In certain scenarios the row count may not match the BEx output; this is due to historical values that need to be adjusted for the existing article-site combinations.

End Result: With the scripted design logic stated above, we could complete the entire snapshot computation of 7+ million rows in under 4 minutes, versus the 2+ hours taken by the APD-based method.

We found that system parameters such as available memory, CPU, and concurrent load have a considerable impact on the overall efficacy of this solution, since the entire computation is pushed down.


Pros: We saw a great turnaround time for the HANA calculation compared to the OLAP engine of the BW application layer, as we were able to embed this logic into a transformation using an AMDP.

Cons: The runtime calculation consumes a lot of resources in terms of HANA memory and CPU cycles; this can be curtailed by using a UDF to control the memory spikes during this operation.

Regards,

Abhishek