
This blog summarizes an ASUG webinar on Data Services and Microsoft BPC integration, given by Son Ha (BBA Aviation) and Ernie Phelps (DecisionFirst). ASUG members can view the recording and attachments here: http://www.asug.com/events/detail/Integrating-SAP-Data-Services-With-BPC-Technical-Overview

  • Using DS 4.1, but should work with any supported version
  • BPC was 10.0.0 on MS SQL 2012

One immediate challenge was that BPC only accepted flat files for import. Additionally, they wanted to stay entirely within the SAP support agreement. This meant that any loading method other than the BPC SSIS packages had to be discarded.


Method

Using Data Services to move data from a variety of data sources to a staging area for processing is a common strategy. BBA data sources initially included Oracle, SQL Server, AS/400 DB2, Gupta SQLBase, Excel documents and other non-ODBC compliant sources for a total of about 30 sources.


The next two slides show a basic staging job to illustrate the standard elements across the Data Services jobs. Each source system is slightly different, and some required minor transforms during staging to distinguish company or other source-specific customizations.



The jobs are either full reloads (for custom or utility tables) or transactional loads. Transactional data is loaded in a loop and run on a monthly basis.


BPC loads are generally run through Data Manager rather than exposed as web services. SSIS can be used to call the SAP custom transforms, so they ended up using Data Services together with SSIS.


Data Services components


The daily processing requirements include all previous-month and current-month transactions. To accommodate this, and to allow the look-back period to be changed easily in the future, we implemented the processing in a loop.
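For illustration, the look-back logic can be sketched in T-SQL; the variable names and the one-month look-back below are placeholders, and the real jobs implement this as a Data Services loop driven by global variables.

-- Sketch: derive the TIMEID periods (YYYYMM00) covered by the look-back window
declare @LookBackMonths int = 1      -- previous month plus the current month
declare @Offset int = @LookBackMonths
declare @TIMEID int

while @Offset >= 0
begin
    set @TIMEID = ((year(dateadd(m, -@Offset, getdate())) * 100)
                  + month(dateadd(m, -@Offset, getdate()))) * 100
    print cast(@TIMEID as varchar(10))   -- each pass would drive one extract/convert cycle
    set @Offset = @Offset - 1
end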


Some subsystems had preprocessing requirements such as grouping or splitting records, sign changes, and prepending or appending static data. We handle these inside the loop, along with lookups to the utility-table conversion files and lookups to the BPC system.


The lookups to the conversion files modify values based on either an exact match or a partial match. If both an exact and a partial match were defined for a subsystem, two lookups were used, with preference given to the exact match.
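In SQL terms the preference logic looks roughly like the sketch below; the conversion table and column names are assumptions for illustration, and the actual jobs implement it as two Data Services lookups.

-- Sketch: prefer an exact conversion match, fall back to a partial (prefix) match
select s.HS_ACCOUNT,
       coalesce(ex.BPC_VALUE, pa.BPC_VALUE) as BPC_ACCOUNT    -- exact match wins
from stg.SourceTrans s
left join dbo.TBL_CONV_ACCOUNT ex
       on ex.SOURCE_VALUE = s.HS_ACCOUNT                      -- exact match
outer apply (select top 1 c.BPC_VALUE
             from dbo.TBL_CONV_ACCOUNT c
             where s.HS_ACCOUNT like c.SOURCE_VALUE + '%') pa -- partial match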


Some lookups depend on the results of previous lookups (for example, the Account lookup uses Department.AccountNumber). This results in a few more queries to chain the values together. To make this process perform well, we pre-load the lookup caches and reduce the cached set size using custom SQL on the lookups.
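The cache-reducing custom SQL is nothing exotic; it simply restricts the rows Data Services preloads. A hypothetical example (table, column, and subsystem names assumed):

-- Hypothetical custom SQL on a lookup: cache only the rows for the subsystem being processed
select SOURCE_VALUE, BPC_VALUE
from dbo.TBL_CONV_ACCOUNT
where SUBSYSTEM = 'CODA'    -- limits the preloaded cache to one source system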


The final lookup for each dimension checks the member tables inside BPC for the existence of the field value (Ex: dbo.MBRHS_ACCOUNT). The result of this lookup is used in processing the output files to create reject files, populate log files and eliminate records from the final load file for BPC.
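Conceptually the validation is a left join against the member table, with unmatched rows flagged as rejects. A SQL sketch of the idea (the staging table name and flag values are assumptions; the member table name follows the dbo.MBRHS_ACCOUNT example above):

-- Sketch: flag records whose converted account is not an existing BPC dimension member
select t.*,
       case when m.ID is null then 'R' else 'A' end as REJECT_FLAG
from stg.ConvertedTrans t
left join dbo.MBRHS_ACCOUNT m
       on m.ID = t.HS_ACCOUNT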



BPC uses a combination of text and XML-like files for displaying processing results inside the application. To reduce the complexity of generating these files, we simply generate lines of text for the header / non-row-data portions of the files. This is accomplished with a row generation / decode combination.



You’ll see how many records were accepted, rejected, etc. You’ll also see the log file names.


File output

There are four sections in this dataflow.



Rejects (area 1)

Reject processing is pretty straightforward: just limit to the records that failed the BPC lookups and output one row for each. The expression below may look a bit convoluted, but it is just a duplicate of what BPC generates during its own processes.


'\[' || ltrim(rtrim(decode(qry_Limt.BPC_ACCOUNT is null, qry_Limt.HS_ACCOUNT, '') ||
',' || decode(qry_Limt.BPC_DEPT is null, qry_Limt.BBA_DEPARTMENT, ''),','),',') ||
'\]' || ' ' || chr(9) || chr(9) || qry_Limt.HS_ACCOUNT


Data file/archive (area 2)

In addition to outputting the data file we also opted to output a copy of the file in an archive area. This aids in troubleshooting, provides backup data files for reload to a certain date and acts as a reference for user research.


The detail for qry_Colm (next slide) shows the output format for BPC. Be sure that the columns are in the correct order and have proper types and lengths. We also do a final group and sum here. The final output files are simple CSV format.
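A sketch of that final aggregation in SQL (the staging table name and reject flag are assumptions carried over from the validation sketch above; the dimension columns follow the archive script shown later in this post, and the column order must match the BPC Transformation file):

-- Sketch: summarize accepted records to one row per dimension combination for the CSV load file
select BBA_AUDITTRAIL, BBA_FLOW, BBA_RPTCURRENCY, BBA_SCENARIO,
       BBA_DEPARTMENT, BBA_ENTITY, BBA_INTERCO, ACCOUNT, TIMEID,
       sum(SIGNEDDATA) as SIGNEDDATA
from stg.ConvertedTrans
where REJECT_FLAG = 'A'    -- only records that passed the BPC member lookups
group by BBA_AUDITTRAIL, BBA_FLOW, BBA_RPTCURRENCY, BBA_SCENARIO,
         BBA_DEPARTMENT, BBA_ENTITY, BBA_INTERCO, ACCOUNT, TIMEID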



Log file (area 3)

This section of the data flow outputs the log file lines. Because each possible rejected value can have a distinct section in the file, we decided to use ordered sets with row numbers to make sure all the rows were in the correct order.


The top and bottom queries (qry_Dist_HS_Accn_Reject and qry_Dist_HS_Dept_Reject) are assigned even-numbered order identifiers (2 and 4 respectively). Within each section, the rows are sorted on the source value (account or department identifier). If you have additional possible reject identifiers, simply add another query for each and always use an even number for the order identifier.



The middle row generation handles the section headers and file footers. Four rows are generated in our case; if you have additional reject identifiers, increase this number by one for each and assign an odd output row number for the section header.



Of the generated rows, the first two provide section headers identifying the dimension the rejects come from, and the last two create the file footer.
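The interleaving is easiest to see in SQL terms: generated headers and footers take the odd sort keys, reject rows take the even ones, and a final sort puts every line in place. A sketch (table names and literal texts are placeholders; the actual implementation uses a row generation transform and queries inside Data Services):

-- Sketch: interleave generated section headers/footers (odd keys) with reject rows (even keys)
select 1 as SECTION_ORDER, 'Rejected HS_ACCOUNT members' as LOG_LINE
union all
select 2, HS_ACCOUNT from stg.ConvertedTrans where BPC_ACCOUNT is null
union all
select 3, 'Rejected BBA_DEPARTMENT members'
union all
select 4, BBA_DEPARTMENT from stg.ConvertedTrans where BPC_DEPT is null
union all
select 5, 'End of reject log'    -- footer line(s)
order by SECTION_ORDER, LOG_LINE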



Reload (area 4)

Initially only 3 sections were included in the file outputs. This fourth area was added after the BPC Administrators expressed the desire to have the rejects put into a standard load file format. This allows them to make corrections to the BPC member tables and reload the data without having to request any ETL be rerun.


A possible future enhancement would be to use this output file to rerun the ETL conversions (with an appropriate load identifier to differentiate it from incremental runs). This would greatly reduce the ETL run time and give users a quick turnaround.


Data Services post script

The post script provides three important functions: it updates the job logging tables with a success message, sends email to the identified parties for the job, and updates the BPC log tables. This last item allows the SSIS package to see that the files are ready for processing and links the logs by the BPC sequence number.


sql('ds_BPC',
   'UPDATE tblDTSLog '                 ||
   '  set Status = \'Completed\' '     ||
   'where Appset    = \'BBA\''         ||
   '  and App       = \'HS_FINANCE\''  ||
   '  and PackageID = \'ETL_Coda\''    ||
   '  and SEQ = ' || $int_HS_BPC_Seq);


Drill tables

Before we go into the SSIS portion of the presentation, I will cover one other function that Data Services can provide to BPC. In BPC, records are stored strictly at the dimensional reporting level. To accommodate research and validation functions, drill-through tables can be created which contain a lower level of detail (typically at the lowest grain available in the source system).


These tables are accessed through MS SQL Server stored procedures and require setup inside the application to provide the stored procedure inputs. The result set generated by the stored procedure is returned as an Excel spreadsheet for user manipulation. The columns in the result set will vary widely based on system availability and user requirements. The next slide provides a sample stored procedure.
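The slide itself is not reproduced here, but the shape of such a procedure is straightforward. The sketch below is illustrative only; the procedure, parameter, table, and column names are all assumptions, since BPC simply passes the selected dimension members in as parameters and displays whatever result set comes back.

-- Sketch: drill-through procedure returning source-level detail for a BPC data region
create procedure dbo.usp_DrillThrough_HS_Finance
    @Account    varchar(32),
    @Department varchar(32),
    @TimeId     int
as
begin
    set nocount on;
    select d.DocumentNumber, d.LineNumber, d.PostingDate,
           d.SourceAccount, d.SourceDepartment, d.Amount, d.[Description]
    from dbo.TBL_DRILL_HS_FINANCE d    -- detail table loaded by Data Services
    where d.HS_ACCOUNT     = @Account
      and d.BBA_DEPARTMENT = @Department
      and d.TIMEID         = @TimeId;
end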



A couple of final thoughts on the Data Services integration section: since SAP has moved toward Data Services as its go-to ETL / data movement tool, there will hopefully be more fully featured and native support for it in the future.


To briefly recap, we are using Data Services to:

  • Extract source system data
  • Perform business logic transformations as needed
  • Replicate conversion file functionality and maintain versioning
  • Augment logging into BPC Administration panel
  • Validate dimension members
  • Summarize and output files for processing in SSIS


SSIS Components


  • Clearing the BPC fact tables ensures that the data being imported is clean and that any orphan records are zeroed out.
  • Before deleting records from the BPC tables, we archive them into an archive table so they can be used for rollback if needed.
  • There are three tables that need to be cleared:
    • dbo.tblFAC2 + name of model
    • dbo.tblFact + name of model
    • dbo.tblFACTWB + name of model
  • Since some records were imported manually by users (for example, via input templates), you only want to delete records that were produced by the ETL extraction file.


Sample:

declare @PrevTIMEID int
declare @TIMEID int
declare @LoadDate datetime
declare @Model varchar(20)

set @Model = 'SFS_FIN'
set @LoadDate = dateadd(m, -1, getdate())
set @PrevTIMEID = ((year(@LoadDate) * 100) + month(@LoadDate)) * 100
set @LoadDate = getdate()
set @TIMEID = ((year(@LoadDate) * 100) + month(@LoadDate)) * 100

--Archive record into table
insert into dbo.TBL_ARCHIVE_BPC_FACT
  (BBA_AUDITTRAIL, BBA_FLOW, BBA_RPTCURRENCY, BBA_SCENARIO, TIMEID, SIGNEDDATA, [SOURCE],
   BBA_DEPARTMENT, BBA_ENTITY, BBA_INTERCO, ACCOUNT, FACT_TBL, BBA_MODEL, CREATE_DT)
select BBA_AUDITTRAIL, BBA_FLOW, BBA_RPTCURRENCY, BBA_SCENARIO, TIMEID, SIGNEDDATA, [SOURCE],
       BBA_DEPARTMENT, BBA_ENTITY, BBA_INTERCO, SFS_ACCOUNT as ACCOUNT, 'tblFAC2' as FACT_TBL,
       @Model as BBA_MODEL, getdate()
from [BBA].[dbo].[tblFAC2SFS_FINANCE]
where TIMEID between @PrevTIMEID and @TIMEID
  and [BBA_AUDITTRAIL] in (select AUDITTRAIL from dbo.TBL_BBA_MODEL_AUDITTRAIL where BBA_MODEL = @Model)

--delete
delete [BBA].[dbo].[tblFAC2SFS_FINANCE]
where TIMEID between @PrevTIMEID and @TIMEID
  and [BBA_AUDITTRAIL] in (select AUDITTRAIL from dbo.TBL_BBA_MODEL_AUDITTRAIL where BBA_MODEL = @Model)


File import



Setting properties for BPC Task Dumpload

  • Data Management – Import into Cube
  • BPC Environment, model and BPC user that has administrator access.
  • File: the extract file that was produced by your Data Services job.
  • Do not select Clear Existing Data. This would clear everything in the FAC table, which you do not want; that has already been accomplished in the previous task.

Failure

  • RollBack deletion of FAC tables
  • Set Log table update to “Error”
  • Force error return by doing a 1/0 so your scheduler can handle the error as needed by the business.

Constraint

  • No logging or error warning.
  • Fire-and-forget method.

Optimize the BPC cube




Setting properties

  • Select the type of task – Optimized model
  • Define the task – enter the Environment and BPC User that has administrator access.
  • Select the model you want to optimize
  • Select Defragment the index of fact tables – this will improve BPC performance.
  • Compress database will also improve BPC performance.
  • Optimize method

   Select Lite; both other options will take BPC offline during the optimization process.


Failure

  • RollBack deletion of FAC tables
  • Set Log table update to “Error”
  • Force error return by doing a 1/0 so your scheduler can handle the error as needed by the business.

Process BPC Cube



Setting properties

  • Select the type of task – Process model
  • Define the task – enter the Environment and BPC User that has administrator access.
  • Select the model you want to process
  • Process method

    Don’t select Full – this option will take BPC offline during the process

Failure

  • Set Log table update to “Error”
  • Force error return by doing a 1/0 so your scheduler can handle the error as needed by the business.

Log table updates



Set log table to Completed.

Sample:

UPDATE bba.dbo.tbldtslog
SET    status = 'Completed'
WHERE  app = 'SFS_FINANCE'
AND packageid = 'Import'
AND seq = (SELECT Isnull(Max(seq), -1)
FROM   bba.dbo.tbldtslog
WHERE  appset = 'BBA'
AND packageid = 'Import'
AND status = 'Running'
AND ownerid = 'BBAAVIATION\SAP_TestService')
AND ownerid = 'BBAAVIATION\SAP_TestService'

On failure, set the status to Error and force an error return by doing a 1/0 so your scheduler can handle the error as needed by the business:

UPDATE bba.dbo.tbldtslog
SET    status = 'Error'
WHERE  app = 'SFS_FINANCE'
AND packageid = 'Import'
AND seq = (SELECT Isnull(Max(seq), -1)
FROM   bba.dbo.tbldtslog
WHERE  appset = 'BBA'
AND packageid = 'Import'
AND status = 'Running'
AND ownerid = 'BBAAVIATION\SAP_TestService')
AND ownerid = 'BBAAVIATION\SAP_TestService'

--Force Error
SELECT 1 / 0

Once the Process Cube action has completed, you are done.  If there are additional business requirements you can handle them with either more SSIS Tasks or by utilizing whatever scheduling option you have chosen to call other processes outside of SSIS.

For follow-up questions, contact Ernie Phelps (Ernie.Phelps@decisionfirst.com) or Son Ha, BBA Aviation (Son.Ha@us.bbaaviation.com).