Here, I will explain the simplest way to execute a SQL stored procedure from BODS and then continue with data extraction.

Below are the steps to execute a stored procedure and then start the data extraction.

1. Create a batch job and add a Script and a Workflow. The Script will be used to execute the stored procedure.

/wp-content/uploads/2014/03/1_417984.png

2. The sql() function is used to execute the SQL stored procedure; call this function in the Script as shown below.

Syntax: sql (datastore, sql_command)

DataStore: A string containing the name of the datastore where the tables involved in the SQL operation reside. This is the name you specified when you created the datastore in Data Services. Enclose this string in single quotation marks.

SQL_Command: A string containing the text of the SQL command to execute. This string must be enclosed in single quotation marks ('). If the string contains quoted values, the internal quotation marks must be single quotation marks preceded by the escape character, backslash (\).

Example :

DataStore Name: SQL_FI_SRC

Stored Procedure Name: SP_Fill_FITable

/wp-content/uploads/2014/03/1_417984.png
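Using the datastore and procedure names above, the Script content shown in the screenshot would look something like this minimal sketch (the EXEC keyword assumes a SQL Server source database):

```
# Execute the stored procedure on the SQL_FI_SRC datastore.
# EXEC is SQL Server syntax; an Oracle source would instead use
# 'BEGIN SP_Fill_FITable; END;' as the command string.
sql('SQL_FI_SRC', 'EXEC SP_Fill_FITable');
```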

If the SQL stored procedure name is a dynamic input, then:

Create a global variable under the batch job and use it to pass the SQL stored procedure name while executing the job. Below is an example:

Global Variable:  $Exe_Statement

DataStore Name: SQL_FI_SRC

2.PNG
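A sketch of what such a Script could contain, assuming the global variable $Exe_Statement holds the full command (e.g. 'EXEC SP_Fill_FITable') supplied at job execution:

```
# $Exe_Statement is a global variable (varchar) set at execution time,
# e.g. to 'EXEC SP_Fill_FITable'.
# Pass it directly as the SQL command string.
sql('SQL_FI_SRC', $Exe_Statement);
```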

Create a global variable under the batch job, use it as the stored procedure parameter, and pass the value to the variable while executing the job. Below is an example:

Global Variable:  $CalMonth

DataStore Name: SQL_FI_SRC

Stored Procedure Name: SP_Fill_FITable

Required Parameters: @CalMonth, @Type

Assume @CalMonth is dynamic and @Type takes a default value; then use the below script:

3.PNG
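Since the screenshot is not reproduced here, the Script could look roughly like the sketch below: the $CalMonth global variable is concatenated into the command, internal single quotes are escaped with a backslash as described above, and the literal 'F' for @Type is a made-up placeholder for the default value:

```
# Build the EXEC statement with the $CalMonth global variable.
# Internal single quotes are escaped with a backslash, per the sql() rules above.
# The literal 'F' passed to @Type is a hypothetical hard-coded default.
sql('SQL_FI_SRC', 'EXEC SP_Fill_FITable @CalMonth = \'' || $CalMonth || '\', @Type = \'F\'');
```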

Note: Since the script is validated against the specified datastore object, we cannot provide an invalid datastore name or a variable as the datastore.

3. The above steps execute the stored procedure and load the data into the source table. Now the source table is ready for extraction.

4. Create a dataflow to perform the data extraction from the source table and add it under the workflow as shown below:

/wp-content/uploads/2014/03/1_417984.png

5. Execute the batch job. The Script will execute the stored procedure and load the data into the source table; after that, the dataflow will perform the data extraction from the source to the target table.

/wp-content/uploads/2014/03/1_417984.png

Execution status will be available in “Job execution status Log”.

/wp-content/uploads/2014/03/1_417984.png


10 Comments


        1. Michael Eaton

No, it is not a problem. A datastore is a logical object; the actual database name and other environmental settings can be changed using datastore configurations or during code promotion.

          Michael

  1. Srini Tata

    Hello,

    Thank you for the useful write-up.

I have a question: if I have a custom data type as an input parameter type for an Oracle PL/SQL stored procedure, is there any way I can invoke that procedure from SAP Data Services (BODS)?

    Please let me know

    Thanks in advance for your kind response.

