Flat File Distribution: extracting data from a single source based on given criteria and loading (distributing) it into multiple flat files.

Scenario : Here the source is a SQL table that contains data for multiple states. Data for each state is saved to a different flat file.

Sample Source Data Preview :

In the above screen, we extract data based on the State column and distribute/generate it into multiple flat files, one per state.

For Example :

Create Project :

Log in to SAP Data Services Designer.
On the Navigation Bar, click Project –> New and select Project, or use the shortcut key Ctrl+N.

Right-click in the Project Area and create a Batch Job.

Create Global Variables.

Select your Job.

On the Navigation Bar, select Tools –> Variables.

Right-click Global Variables, select Insert, and create the global variables as displayed in the screen below.

Create Workflow.

Select the Job, then drag and drop the Workflow from the right-side palette/pane.

Double-click the Workflow and create a Dataflow that loads all the state names into one table, with a new output column that generates a row number by calling the gen_row_num( ) function in the Query Transform.

Double-click the Dataflow. Map the source (North_Wood_State.dbo) to a Query Transform, and the Query Transform to a Template Table (State_List), as shown in the screen below.

Double-click the Query Transform and map the State column to the output. Create a new output column named Row_Count with data type int.

Map the gen_row_num( ) function to the new output column, as shown in the screen below.

Select the GROUP BY tab in the Query Transform and drag the State column from the source to the Group By workspace. This retrieves only the unique state names from the source.

Select the Workflow from the Project Area pane and create a new Script after the Dataflow, as shown in the screen.

Open the Script by double-clicking it and write the script shown below.


$G_TOTAL_COUNT = cast( sql('SQL_DB', 'select count(1) from State_List'), 'int');

          Here SQL_DB is the Datastore name.
          The cast( ) function converts the varchar value returned by sql( ) to the int data type.
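
Before the while loop below can run, the loop counter $G_INITIAL_COUNT also needs a starting value. The original screenshots do not show this step, so the initialization here is an assumption; a natural place for it is this same script, starting at the first Row_Count value:

$G_INITIAL_COUNT = 1;    # assumed: start at the first row of State_List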

Validate the Script. Select the Workflow from the Project Area pane and create a new While Loop after the Script.

Double-click the While Loop.

Enter the loop condition as shown below. Then create a new Script in the While Loop workspace area, as shown in the figure below.
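
The loop condition itself is only visible in the screenshot. Assuming the loop should run once per state in State_List, the condition would compare the counter against the total computed earlier:

$G_INITIAL_COUNT <= $G_TOTAL_COUNT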


Double-click the Script and write the script shown below.

$G_STATE = sql('SQL_DB', 'select State from State_List where Row_Count = {$G_INITIAL_COUNT}');

$G_FILENAME = $G_STATE || '.txt';

print('Processing record for North_Wood_State of state ' || $G_STATE);

print('Generating file ' || $G_FILENAME);

$G_INITIAL_COUNT = $G_INITIAL_COUNT + 1;

print('Initial Count increased to {$G_INITIAL_COUNT}');

print('Total Count is {$G_TOTAL_COUNT}');

          In the above script, we assign the state name from the template table created in the earlier dataflow,

          and build the target file name by concatenating the state name with the '.txt' extension.

          Print statements are included to display the file name in the log while the job executes. The initial count is increased by 1,

          and the increased count is printed to the log so it can be verified during execution.

Select the While Loop from the Project Area pane and create a new Dataflow after the Script, as shown in the figure below.


Double click on the Dataflow. Map Source to Query Transform.


Open the Query and map all fields to Schema Out. In the WHERE clause, write the statement below, as displayed in the screen.

(North_Wood_State.State is not null and
North_Wood_State.State = $G_STATE)


After mapping all the columns to Schema Out, right-click the query schema and select Create File Format.


Provide the input fields in the File Format Editor as highlighted below. Browse to the root directory path for the target files, and assign the file name as $G_FILENAME.


Save the file format. Drag and drop the flat file into the dataflow area and map the Query Transform to the flat file.

Execute the job.

Note: when you execute the job again, convert the first dataflow's template table to an imported table, and make sure you enable

the Delete data from table before loading option in the Options tab.

The trace log is as follows:




