
AIM:- The aim of this document is to illustrate the data migration steps that use BODS as an ETL tool to extract data from legacy systems, profile it, and transform, validate and cleanse it before loading it into SAP systems.

WHY BODS:- In the period since SAP acquired Business Objects, there has been growing demand for BusinessObjects Data Services (BODS, or simply SAP DS) as a data migration tool.

The ease of developing and maintaining the code as easy-to-understand graphical flows via workflows/dataflows helps customers switch from SAP LSMW (Legacy System Migration Workbench) to BODS with minimal training and effort.

HOW TO START:- Since the target system is SAP, it is imperative to have standard job templates that cover the SAP modules such as FI, CO, MM, SD, PP, PM, etc.

There are Best Practices for Data Migration (BPFDM) under the All-in-One (AiO) umbrella, which encapsulate the standard data migration input files and jobs that can be installed and configured in a BODS repository.

WHAT ARE THESE BPFDM:- The following packages can be downloaded from the SAP Service Marketplace.

Migration_CRM

Migration_DQ

Migration_ERP

Migration_HCM

Migration_Retail

Each folder contains the standard jobs, lookup files, standard input files and datastore connections necessary to deploy in a BODS environment.

HOW TO DO IT:-

You first need a local repository, preferably a clean one to start with.

1)      Log in to the BODS Designer.

2)      Right-click on the blank space in the Local Object Library area and choose to import the file from the folder you downloaded from the SAP Service Marketplace.

You will get a message for the import. Click OK.

3)      After the import is completed, you will see a set of jobs.

These are the standard AIO jobs for migration to SAP ECC pertaining to finance, logistics execution and other ERP modules.

Each job is structurally identical; only the business content varies, as each one loads SAP data pertaining to a specific module.

The jobs were developed following industry best practices and extensive interaction with customers, including technical and functional/business users with deep experience in each business module.

Job details and customization points:-

1)      The job begins with a Script object that initializes the global and local variables. The global variables are used in the same way across all jobs, while the local variables can differ for each business module.

2)      The subsequent conditional flows generate data in staging tables for each segment used in the specific IDoc, and finally end with generating the IDoc.

You can choose to include or exclude certain workflows by setting the variable associated with each workflow at run time (see the sketch after this list).

Each workflow under a conditional object is the vital point where you need to concentrate on the mappings, lookups, validation and enrichment, and ensure that they are as per the requirements.

This is where customization plays a pivotal role, and it will need several rounds of review and testing before you actually load data into the SAP system.

3)      The last step is the Generate IDoc section, where your mappings from the legacy fields are mapped to the relevant segments of the IDoc specific to the SAP business module.

Some customers do not use this section, as they rely instead on SAP load programs like LSMW, BDC, custom ABAP programs, etc.
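Coming back to the run-time switching mentioned in point 2 above, here is a minimal sketch of how it works. The flag name $G_Run_E1KNA1M and the 'Y'/'N' convention are assumptions for illustration, not the delivered names: each Conditional object simply tests a global variable that you can set as a job execution property or in the initialization script.

    # In the initialization script: default the flag if it was not supplied at run time
    $G_Run_E1KNA1M = nvl($G_Run_E1KNA1M, 'Y');

    # 'If' expression of the Conditional object that wraps the E1KNA1M workflow
    $G_Run_E1KNA1M = 'Y'

Setting such a flag to 'N' at run time skips that segment's workflow without touching the job design.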

Job flow screenshot:-

1)      SCR Initialize:- This script initializes all the relevant global and local variables that are used to pass values to the subsequent output field mappings.

During customization for a particular project/client landscape, you can include more variables in the script object and initialize them there, in order to maintain flexibility and enforce reusability.
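As a minimal sketch of what such a script can contain (the variable names $G_SDATE, $G_DEFAULT_LANGUAGE and $G_FILE_PATH are made-up examples, not the exact names delivered with the BPFDM jobs):

    # SCR Initialize: default the global variables that downstream mappings rely on
    # (all names below are illustrative only)
    $G_SDATE = to_char(sysdate(), 'YYYY.MM.DD');      # load date stamp reused in derived columns
    $G_DEFAULT_LANGUAGE = 'EN';                       # fallback language key for text fields
    $G_FILE_PATH = 'D:/Migration/Input/';             # root folder of the legacy input files
    print('Job [job_name()] started, load date [$G_SDATE]');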

2)      In order to map legacy fields to IDoc segments, you first need to create staging tables with the same structure.

It is important first to understand the target IDoc structure carefully.

For example, the customer master IDoc (DEBMAS) has one segment called E1KNA1M (master customer basic data) and several sub-segments like E1KNA1H (header), E1KNVVM (sales data), E1KNB1M (company code data), E1KNBKM (bank details), etc.

There are several fields beneath each parent and child segment.

So it is important to take each required segment and then go to its conditional flow, for example CustomerMasterBasicData_E1KNA1M_Required.

Navigate through this flow first as shown below:-

This is a very important workspace where you need to do the customization as per the requirements.

The mapping specifications provided by the business have to be applied here carefully, with close functional/business engagement.

The object “Replace with sourceFile or Table” has to be replaced with the input/legacy files provided by the business.

You can use either an Excel file or a text file as the source, as per the project guidelines.

Accordingly, you need to create an Excel or text file format in BODS with the required fields and then use it as the input in the dataflow workspace below.

In the Query transform, you need to map all the required fields from your input file, create additional derived columns if required, and load them into the staging/template tables.
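For illustration only, mapping expressions of this kind, typed into the Mapping tab of the output columns, might look like the following. The input schema SRC_CUSTOMER, its column names and $G_DEFAULT_LANGUAGE are assumed names, not part of the delivered content:

    # Mapping for NAME1: trim and upper-case the legacy customer name
    upper(ltrim_blanks(rtrim_blanks(SRC_CUSTOMER.CUST_NAME)))

    # Mapping for SPRAS: fall back to the default language when the source value is empty
    nvl(SRC_CUSTOMER.LANGUAGE, $G_DEFAULT_LANGUAGE)

    # Mapping for SORTL (derived column): search term, or the first 10 characters of the name
    upper(nvl(SRC_CUSTOMER.SEARCH_TERM, substr(SRC_CUSTOMER.CUST_NAME, 1, 10)))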

After the data is loaded into the map staging table, it is time to validate the records and load them into the valid staging table. The data that fails the validation criteria is loaded into the invalid staging table, which shows the rejected records along with a reason field.

The failed records can then be analyzed with the business/functional counterparts and re-processed after making the necessary configuration changes in the SAP system or correcting the mappings at the BODS level.

Some fields in the IDoc segments are mandatory and must not arrive as NULL from the source. Hence you can look at ‘Validate Mandatory Columns’ and add more rules if required.
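As an example of such a rule (the staging schema STG_E1KNA1M and its columns are assumed names), a custom condition in the Validation transform for a mandatory column can be as simple as:

    # The customer number must be supplied
    STG_E1KNA1M.KUNNR is not null and STG_E1KNA1M.KUNNR <> ''

    # Similar rule for NAME1, which the IDoc segment also expects to be filled
    STG_E1KNA1M.NAME1 is not null and STG_E1KNA1M.NAME1 <> ''

Records failing an enabled rule are routed to the fail output and end up in the invalid staging table with the reason field populated.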

‘Validate Lookups’ is the most frequently visited place, where you analyze the lookup tables used for validation.

The lookup tables usually have to be analyzed and modified after several engagements with the business/functional counterparts.

They should always be kept correct and agreed upon, so that there is no room for blaming each other later.

After loading to the valid staging table, it is time to further enrich the data so that it meets SAP standards, by looking at lookup tables such as the MGMT_ALLOCATION table and other LKP tables.

The lookup_ext function serves this purpose and lets you specify conditions on multiple fields, as in the sketch below.
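Here is a sketch of such a call, assuming a hypothetical lookup table LKP_MGMT_ALLOCATION in a datastore DS_STG; the datastore, owner, table and column names are all placeholders to adjust to your repository:

    # Enrichment mapping: translate the legacy sales organisation to the SAP value,
    # matching on both the legacy code and the company code
    lookup_ext([DS_STG.DBO.LKP_MGMT_ALLOCATION, 'PRE_LOAD_CACHE', 'MAX'],
               [SAP_VALUE],                                   # column returned from the lookup table
               ['NO_MATCH'],                                  # default when no row qualifies
               [LEGACY_VALUE, '=', STG_VALID.SALES_ORG,
                COMPANY_CODE, '=', STG_VALID.COMP_CODE])      # multiple field conditions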

You can add more enhancements as per the business requirements here.

After all enrichments, the data is loaded into the enriched staging table.

After loading to the enriched staging table, it is time to revisit the IDoc structure for the segment that the enriched staging table feeds.

Similarly, you need to repeat this step for each conditional workflow.

The last step in each conditional segment workflow is Generate IDoc, where you map each enriched table to its segment in the IDoc.

The above screenshot is quite self-explanatory about what you need to do.

There are frequent actions like ‘Make Current’, after which you map the input field to the output field to ensure the mappings are correct.

The screenshots below show the nested structure that you build in BODS to mirror the IDoc.

A major part of the value of BODS lies in profiling the source data to understand the max, min and null values, and in performing the mappings through an easy-to-use GUI where you can switch back and forth on the same screen. Being able to switch validations on and off, starting with BODS 4.x, is a good feature.

The data validation reports in the Data Services Management Console help you analyze the data validation statistics.

This document just shows the basic preparation for the data migration/conversion step (the C in RICEF) of an SAP implementation, specific to BODS.

I will add more documents like this in the future.
