Technical Articles

ABAP DATA FLOW IN SAP BODS


This blog post explains the step-by-step procedure to implement an ABAP data flow in SAP BODS.

The most effective method for extracting large volumes of data from SAP is an ABAP data flow.

In this post, I will show how to develop one and explain the main reasons to use an ABAP data flow.

First, we need to create an SAP data store in SAP BODS so that we can connect to the source system and import data from it easily.

Create a data store between the SAP source and BODS

 

 

A pop-up for creating a new data store will appear, as shown in the screenshot below.

 

 

  • Enter the data store name DS_SAP_TEST.
  • Select SAP Applications as the data store type.
  • Enter the database server name, user name, and password.
  • Click the Apply button, then click OK.

The data store will be created and displayed as shown below.

  • Go to the Local Object Library and select the Datastores tab.
  • The data store DS_SAP_TEST will be displayed as shown below.

 

 

We need to add the ECC table to the SAP data store before we can continue.

  1. Click on the SAP data store in the Local Object Library.
  2. Click the Search button at the top of the window.
  3. Enter ECC in the Name field.
  4. Press Enter.
  5. Right-click the ECC table in the search results and click Import.

Create a job, work flow, and data flow, following SAP BODS naming-convention standards: job name JOB_ABAP_TEST, work flow name WF_ABAP_TEST, and data flow name DF_ABAP_TEST.

Then click on the ABAP data flow object and enter its properties:

  • Enter the data store name DS_SAP.
  • Enter ZTEST as the generated ABAP file name.
  • Enter ZTEST as the ABAP program name.
  • Enter ABAP_ZTEST as the job name.
  • Enter the ABAP row limit, join rank, number of parallel process threads, and RFC data transfer package size, as shown in the screenshot below.
  • Click the Apply button.

 

 
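For context, the ABAP program name entered above (ZTEST) is the program that Data Services generates and runs on the SAP side to perform the extraction. The sketch below is only an illustration of what such a generated program conceptually does; the table (MARA), field names, and file path are assumptions for the example, not the actual generated code:

```abap
* Hypothetical sketch of the kind of extraction program BODS
* generates (ZTEST). The real code is produced by Data Services.
REPORT ztest.

* Illustrative output structure: a few material master fields.
DATA: BEGIN OF ls_out,
        matnr TYPE mara-matnr,
        mtart TYPE mara-mtart,
        matkl TYPE mara-matkl,
      END OF ls_out.

* Read the source table and write each record to a staging file
* that BODS later transfers to the target (path is illustrative).
OPEN DATASET '/usr/sap/trans/ztest.dat' FOR OUTPUT IN TEXT MODE
  ENCODING DEFAULT.
SELECT matnr mtart matkl FROM mara INTO ls_out.
  TRANSFER ls_out TO '/usr/sap/trans/ztest.dat'.
ENDSELECT.
CLOSE DATASET '/usr/sap/trans/ztest.dat'.
```

Because this logic runs inside the SAP application server, the ABAP row limit and RFC package size set above directly control how much data the program reads and how it is batched back to BODS.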

After that, open the ABAP data flow and add the SAP tables we want. Next, add a Query transform, select all the columns from the Schema In, map them to the Schema Out, and then connect the Query transform to a Data Transport object.
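A major benefit of doing this inside an ABAP data flow is that joins between SAP tables are executed on the SAP side by the generated program rather than in the BODS engine. A hypothetical sketch (table and field names are illustrative assumptions):

```abap
* Hypothetical sketch: a join executed inside SAP by the
* generated ABAP program. Names are assumed for illustration.
REPORT ztest_join.

DATA: BEGIN OF ls_out,
        matnr TYPE mara-matnr,
        maktx TYPE makt-maktx,
      END OF ls_out.

* The MARA/MAKT join runs in the SAP database layer, instead of
* both tables being transferred to BODS and joined there.
SELECT m~matnr t~maktx
  INTO ls_out
  FROM mara AS m
  INNER JOIN makt AS t ON t~matnr = m~matnr
  WHERE t~spras = 'E'.
  WRITE: / ls_out-matnr, ls_out-maktx.
ENDSELECT.
```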

 

Transport Object:

 

A data transport is a step in an ABAP data flow that defines a target in which to store the data set extracted during the flow. A data transport defines a staging file for data extracted from SAP applications.

Next, back in the parent data flow, drag in the ABAP data flow and place a Query transform after it, renamed QRY_EXTR. Again, select all the columns from the Schema In of the Query transform and map them to the Schema Out.

 

After that, save the project and click the Validate button; a pop-up will appear confirming that validation succeeded.

To execute the job, select it, right-click, and choose the Execute option.

Conclusion:

In this way, you can easily implement an ABAP data flow in SAP BODS. This method makes extracting large volumes of data straightforward.

 

4 Comments
  • Note that the best possible extraction method is not necessarily related to data volume, but rather determined by functionality.

    1. When extracting from a single table, use a regular data flow.
    2. When extracting from a join of tables, use an ABAP data flow. Although where-conditions are pushed down to the underlying system from a regular data flow, the joins are not, often leading to poor performance.
  • @Nagasekhar K

    In my view, it was important to show how to set up the data transport during this whole process, since the data transport is not set up by default in an ABAP data flow; a new user following this article will get an error, as it showcases the step-by-step process to create an ABAP data flow.

    Blacking out the table names is not a great idea when writing articles. I am sure such names are not sensitive information that you cannot share with the SAP world. Just a suggestion: the cleaner the approach, the better.

    Hope this helps.

  • Agree with Mr. Ansari.

    I too expect a somewhat deeper treatment of this topic, such as additional settings at the data store level: the ABAP execution option, the RFC directory, etc.

    Also, if we pull fewer or more columns from the SAP tables, how will the ABAP data flow behave? And how can an ABAPer help us get the proper authorizations for this generated ABAP file/program?