
The scenario: before the Dataflow executes the ETL operation, the flat files must be moved from the FTP folder to the system source folder, since the Dataflow's source points to that folder. After the data load completes, the same files should be moved to the Archive folder.

Script Explanation:

To achieve the above requirement, custom functions can be used.

The Function to be used is as below:

Return exec('cmd', 'move /Y ' || $PI_SourceFile || ' ' || $PI_TargetDirectory, 8);

(Note: move is a DOS (OS) command for moving a file from one folder to another.)

Name this custom function "f_MoveFile" in Data Services.
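To see the behaviour outside Data Services, here is an illustrative Python sketch of what the custom function does. The `move /Y` overwrite semantics are emulated with the standard library; the helper name `move_file` is ours and is not a Data Services API:

```python
import os
import shutil

def move_file(source_file: str, target_dir: str) -> None:
    """Illustrative stand-in for the custom function: move source_file
    into target_dir, overwriting any existing file, like `move /Y`."""
    dest = os.path.join(target_dir, os.path.basename(source_file))
    if os.path.exists(dest):
        os.remove(dest)  # emulate the /Y (overwrite without prompting) switch
    shutil.move(source_file, dest)
```

The explicit overwrite step matters because `shutil.move` raises an error when the destination file already exists inside a target directory, whereas `move /Y` silently replaces it.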

Below are the steps of execution:

1)   Declare the paths of the directories as substitution parameters in the system configuration, or as variables.

2)   Call this function from a script in your workflow.

E.g.:    Pre_Process_Script --> DataFlow --> Post_Process_Script


3)   Finally, make a call to the function in the script editor.

For example, to move a file from the FTP folder to the source folder, call the "f_MoveFile" function in the Pre_Process script editor as shown below:

if (file_exists('[$$ManlFtpPath]\\' || $L_SourceFileName) = 1)
begin
    f_MoveFile('[$$ManlFtpPath]\\' || $L_SourceFileName, '[$$ManlSourcePath]');
end



In the same way, to move the file from the source folder to the Archive folder after the load, call the same function in the Post_Process script editor:

f_MoveFile('[$$ManlSourcePath]\\' || $L_SourceFileName, '[$$ManlArchivePath]');
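Putting the whole workflow together, the pre-process / Dataflow / post-process sequence can be sketched in Python. This is a minimal sketch only: `run_job`, `move_file`, and the `load` callback are hypothetical stand-ins, not Data Services APIs:

```python
import os
import shutil

def move_file(source_file: str, target_dir: str) -> None:
    """Move source_file into target_dir, overwriting like `move /Y`."""
    dest = os.path.join(target_dir, os.path.basename(source_file))
    if os.path.exists(dest):
        os.remove(dest)
    shutil.move(source_file, dest)

def run_job(ftp_dir, source_dir, archive_dir, file_name, load):
    """Mirror the workflow: Pre_Process_Script --> DataFlow --> Post_Process_Script.
    `load` stands in for the Dataflow's ETL work on the source file."""
    src = os.path.join(ftp_dir, file_name)
    if os.path.exists(src):                        # mirrors file_exists(...) = 1
        move_file(src, source_dir)                 # pre-process: FTP -> source
    load(os.path.join(source_dir, file_name))      # Dataflow reads the source folder
    move_file(os.path.join(source_dir, file_name), archive_dir)  # post-process: source -> archive
```

After the run, the file exists only in the archive directory, which matches the intent of the Post_Process script above.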
