
Introduction:

This article provides various solutions for scheduling multiple BODS batch Jobs (Jobs) sequentially and conditionally. BODS does not contain an inbuilt mechanism to chain multiple Jobs within one parent Job; the default way of working is to chain multiple Workflows within a Job. However, a Workflow cannot be executed on its own and needs a Job to execute it, and in various scenarios there is a need to sequence Jobs and run them conditionally. The approaches provided below can be used where chaining multiple Workflows within a Job is not enough and the Jobs themselves have to be chained/sequenced.

The advantages of using the approaches below are:

1. There is no need for a third-party scheduling tool. Various features within BODS can be combined to create a Parent Job that can be scheduled to trigger Jobs one after the other; the Parent Job acts as a sequencer of Jobs.

2. We can avoid scheduling each and every Job and schedule only the Parent Jobs.

3. Using the WebServices approach, Global Variables can be passed to the Jobs via an XML file in a simplified manner.

4. Using the WebServices approach, the developer only needs access to a folder that the Job Server can access (to place the XML files) and does not require access to the Job Server itself.

5. It avoids loading a Job with too many Workflows.

6. Time-based scheduling (for example, scheduling Jobs at every 10-minute interval) can be avoided, and hence there will be no overlap if the preceding Job takes more than 10 minutes.

7. As the Child Jobs and the Parent Job each have their own Trace Logs, it is easier to troubleshoot in case of any issues.

8. At any point, the Child Jobs can also be run independently in the Production environment; this would not be possible if the entire Job logic were put into a Workflow.

Scheduling BODS Jobs Sequentially:

If the requirement is just to sequence the Jobs so that they are executed one after the other, irrespective of whether the preceding Job completes successfully or terminates with an error, then one of the approaches below can be used. Note that the examples provided below assume the Jobs do not have any Global Variables. The approach for chaining/sequencing Jobs with Global Variables is explained in the later part of the article.

Sequencing using Script:

Consider two simple Jobs, Job1 and Job2, that are to be executed in sequence, where Job2 has no business dependency on Job1. The requirement is to execute only one Job at a time, i.e. Job1 can be run first and then Job2, or the other way round, but no two Jobs should run at the same time. This restriction could exist for various reasons, such as efficient utilization of the Job Server, or because both Jobs use the same temp tables.

Steps to Sequence Jobs using Script:

1. Export the Jobs as .bat files (Windows) using the Export Execution Command option in the Management Console.

[Screenshot: Export Execution Command]

2. Check the availability of the Job1.bat and Job2.bat files on the Job Server.

3. Create a new Parent Job (call it Schedule_Jobs) with just one Script object.

4. In the Script, call Job1 and Job2 one after the other using the exec function, as given below:

Print('Trigger Job1');
Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job1.bat', '', 8));

Print('Trigger Job2');
Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job2.bat', '', 8));

When the Schedule_Jobs Parent Job is run, it triggers Job1 and then, after completion of Job1 (successful completion or termination), it triggers Job2. The Parent Job can now be scheduled in the Management Console to run at a scheduled time, and it will trigger both Job1 and Job2 in sequence as required. Note that if Job1 hangs for some reason, Schedule_Jobs will wait until Job1 comes out of the hung state and returns control to it. In this way any number of Jobs can be sequenced.
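The same pattern extends to any number of Jobs. Below is a minimal sketch, assuming a hypothetical local variable $ExportPath is declared in the Parent Job to hold the folder the .bat files were exported to; with exec flag 8, the script waits for each command to finish and receives its return code together with any output.

# $ExportPath is a hypothetical local variable holding the export folder of the .bat files
$ExportPath = 'C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\';

# Each exec() call blocks until the .bat finishes, so the Jobs run strictly in sequence
Print('Trigger Job1');
Print(exec($ExportPath || 'Job1.bat', '', 8));

Print('Trigger Job2');
Print(exec($ExportPath || 'Job2.bat', '', 8));

Print('Trigger Job3');
Print(exec($ExportPath || 'Job3.bat', '', 8));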

Sequencing using WebServices:

If the same two Jobs (Job1 and Job2) have to be executed in sequence using WebServices, the below approach can be used.

1. Publish both Job1 and Job2 as WebServices from the Management Console.

[Screenshot: Publish as WebService]

2. Pick up the WebService URL using the View WSDL option; the link will be as given below:

               http://<hostname>:28080/DataServices/servlet/webservices?ver=2.1&wsdlxml

[Screenshot: View WSDL]

3. In the Designer, create a new Datastore with the Datastore type set to WebService, and provide the WebService URL fetched from the View WSDL option.

[Screenshot: WebService Datastore]

4. Import the published Jobs as functions.

[Screenshot: Import functions]

5. Create a simple Parent Job (call it Simple_Schedule) to trigger Job1 and Job2.

[Screenshot: Simple_Schedule Job]

6. In the Call_Job1 Query object, call Job1 as shown in the diagrams below. As no inputs are required for Job1, the DI_ROW_ID from the Row_Generation transform, or Null, can be passed on to Job1.

[Screenshots: WebService function call configuration (4 images)]

7. Similarly, call Job2 in the Call_Job2 Query object.

When the Simple_Schedule Parent Job is run, it triggers Job1 and then, after completion of Job1 (successful completion or termination), it triggers Job2. The Parent Job can now be scheduled in the Management Console to run at a scheduled time, and it will trigger both Job1 and Job2 in sequence as required. Note that if Job1 hangs for some reason, the Parent Job will wait until Job1 comes out of the hung state and returns control to it. In this way any number of Jobs can be sequenced.

Scheduling BODS Jobs Conditionally:

In most cases, Jobs are dependent on other Jobs, and some Jobs should run only after all the Jobs they depend on have run successfully. In these scenarios, Jobs should be scheduled to run conditionally.

Conditional Execution using Script:

Let's consider that Job2 should be triggered after successful completion (not termination) of Job1, and that Job2 should not be triggered if Job1 fails.

The Job status can be obtained from the repository table/view ALVW_HISTORY. The status of the latest run instance of Job1 should be checked, and based on that, Job2 should be triggered. To do this:

1. Create the repository database/schema as a new Datastore (call it BODS_REPO).

2. Import the ALVW_HISTORY view from the Datastore.

3. Create a new Parent Job, Conditionally_Schedule_Using_Script, with just one Script object.

4. Create two variables, $JobStatus and $MaxTimestamp, in the Parent Job.

5. Between the exec functions, place the status check code given below:

Print('Trigger Job1');
Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job1.bat', '', 8));

# Remain idle for 2 seconds so that the Job status is stable (status moves from S to D for a successful Job and to E on error)
sleep(2000);

# Pick up the latest Job start time
$MaxTimestamp = sql('BODS_REPO', 'SELECT MAX(START_TIME) FROM DataServices.alvw_history WHERE SERVICE = \'Job1\';');
Print($MaxTimestamp);

# Check the latest status of the preceding Job
$JobStatus = sql('BODS_REPO', 'SELECT STATUS FROM DataServices.alvw_history WHERE SERVICE = \'Job1\' AND START_TIME = \'[$MaxTimestamp]\';');
Print($JobStatus);

if ($JobStatus = 'E')
begin
    Print('First Job Failed');
    raise_exception('First Job Failed');
end
else
begin
    Print('First Job Success, Second Job will be Triggered');
end

Print('Trigger Job2');
Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job2.bat', '', 8));

Using the above code in the Script, when the Parent Job is run it will trigger Job1, and only if Job1 has completed successfully will it trigger Job2. If Job1 fails, the Parent Job is terminated using the raise_exception function. This approach can be used to conditionally schedule any number of Jobs.
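When several Jobs are chained this way, the status check is worth factoring out. Below is a minimal sketch of a reusable custom function; the name FN_GetLastJobStatus, its input parameter $P_JobName and its local variable $L_MaxTimestamp are all hypothetical, and the function is assumed to be created under Custom Functions with those declarations.

# Body of the hypothetical custom function FN_GetLastJobStatus($P_JobName)
# Pick up the start time of the latest run of the given Job
$L_MaxTimestamp = sql('BODS_REPO', 'SELECT MAX(START_TIME) FROM DataServices.alvw_history WHERE SERVICE = \'[$P_JobName]\';');
# Return the status of that run ('S' = running, 'D' = done, 'E' = error)
Return sql('BODS_REPO', 'SELECT STATUS FROM DataServices.alvw_history WHERE SERVICE = \'[$P_JobName]\' AND START_TIME = \'[$L_MaxTimestamp]\';');

The Parent Job script then reduces to one check per Job:

# Usage in the Parent Job script, after the exec() call for Job1
if (FN_GetLastJobStatus('Job1') = 'E')
begin
    raise_exception('Job1 failed, stopping the chain');
end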

Conditional Execution using WebServices:

To conditionally execute a Job (published as a WebService) based on the status of a preceding Job (also published as a WebService), the same concept used in Conditional Execution using Script can be applied: call Job1, check the status of Job1, and trigger Job2 only if Job1 was successful.

1. Create a Parent Job with two DataFlows and a Script in between the DataFlows.

2. Use the first DataFlow to call the first Job (refer to the section above for details on calling a Job as a WebService within another Job).

3. Use the second DataFlow to call the second Job.

4. Use the Script to check the status of the first Job.

[Screenshot: Conditional WebService Parent Job]

The Script will have the below code to check the status:

# Wait for 2 seconds so that the Job status is stable
sleep(2000);

# Pick up the latest Job start time
$MaxTimestamp = sql('BODS_REPO', 'SELECT MAX(START_TIME) FROM DataServices.alvw_history WHERE SERVICE = \'Job1\';');
Print($MaxTimestamp);

# Check the latest status of the preceding Job
$JobStatus = sql('BODS_REPO', 'SELECT STATUS FROM DataServices.alvw_history WHERE SERVICE = \'Job1\' AND START_TIME = \'[$MaxTimestamp]\';');
Print($JobStatus);

if ($JobStatus = 'E')
begin
    Print('First Job Failed');
    raise_exception('First Job Failed');
end
else
begin
    Print('First Job Success, Second Job will be Triggered');
end

Using the above code in the Script, when the Parent Job is run it will trigger Job1, and only if Job1 has completed successfully will it trigger Job2. This approach can be used to conditionally schedule any number of Jobs that are published as WebServices.
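In either variant, while testing it can help to inspect the run history directly in the repository database. A hedged sketch of such a query (the DataServices schema prefix matches the one used in the scripts above and depends on your repository; only the columns already used in this article are selected; STATUS is 'S' while a Job runs, 'D' after success and 'E' on error):

-- Run history of Job1, latest run first
SELECT SERVICE, STATUS, START_TIME
FROM DataServices.alvw_history
WHERE SERVICE = 'Job1'
ORDER BY START_TIME DESC;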

Conditional Execution using WebServices: Jobs with Global Variables

When Jobs have Global Variables for which values need to be passed at trigger time, they need to be handled differently, because when a Job is called as a WebService it expects its Global Variables to be mapped. The idea is to pass either null values (for a scheduled run) or actual values (for a manual trigger) using an XML file as input.

Let's assume that the first Job has two Global Variables, $GV1Path and $GV2Filename, that the second Job does not have any Global Variables, and that the requirement is to trigger Job2 immediately after successful completion of Job1.

1. Similar to the Parent Job above, create a Parent Job with two DataFlows and a Script in between the DataFlows.

[Screenshot: Conditional WebService Parent Job]

2. Use the first DataFlow to call the first Job (refer to the sections above for details on calling a Job as a WebService within another Job). Instead of using a Row_Generation object, use an XML input file as the source.

[Screenshot: First Job DataFlow]

The XSD for the input XML file will be as given below; if there are more Global Variables in the Job, then elements GV3, GV4 and so on should be added to the schema.

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="FIRSTJOB">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="GLOBALVARIABLES">
          <xs:complexType>
            <xs:sequence>
              <xs:element type="xs:string" name="GV1"/>
              <xs:element type="xs:string" name="GV2"/>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

The input XML file used is as given below:

<FIRSTJOB>
  <GLOBALVARIABLES>
    <GV1>testpath1</GV1>
    <GV2>testfilename</GV2>
  </GLOBALVARIABLES>
</FIRSTJOB>
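For a scheduled run where no actual values are to be supplied (the null-values case mentioned at the start of this section), one way, sketched below as an assumption rather than a tested recipe, is to keep the same file with empty elements so that the Job falls back to whatever its own logic assigns:

<FIRSTJOB>
  <GLOBALVARIABLES>
    <GV1></GV1>
    <GV2></GV2>
  </GLOBALVARIABLES>
</FIRSTJOB>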

3. In the WebService function call in the Call_FirstJob Query object, map the Global Variables as shown below.

[Screenshot: Map Global Variables]

4. Use the second DataFlow to call the second Job. As this Job does not contain Global Variables, a Row_Generation object is enough (as in the previous section).

5. Use the Script object to check the status of the first Job.

Using the above approach, when the Parent Job is run it will trigger the first Job, passing in the Global Variables present in the input XML file, and only if the first Job has completed successfully will it trigger the second Job. This approach can be used to conditionally schedule any number of Jobs that are published as WebServices. For every Job that has Global Variables, an XSD and an XML file should be created. The Global Variables passed from the XML file to the WebService seem to work only when the parameters are passed in the right order; hence it is good practice to name the Global Variables with a naming convention such as $GV1<name>, $GV2<name> and so on.

About me:

Anoop kumar V K, Technology Lead at Infosys Limited.


Comments:

  1. Harvin DSOuza

    Hi Anoop,

    The document is very illustrative, but I cannot execute the 2 BODS jobs using the 1st script option you have suggested.

    I do not get any errors, but the MONITOR adjacent to TRACE and ERROR tabs does not reflect the record creation.

    I have executed the individual jobs and they return data.

    I have also tried to execute the batch jobs independently; the CMD window opens for a second, executes and closes, but there is no change in the target table.

    Could you please help me out?

    Regards,

    Harvin

  2. abhi a

    Hi Anoop,

    Very Helpful blog, very easy to understand.

    I am grateful to you for providing this blog.

    Thanks

    Abhi

  3. Rakesh Samuel

    Hey Anoop,

    This article is a really good one. It will actually help in so many situations that we face in projects. Thanks again for this article.

    Rakesh

  4. Andrés DElia

    We’ve been using a similar approach with a custom-made function which launches jobs via al_joblauncher, instead of creating .bat files, which can be cumbersome to administer. So far it has proved to be a very reliable solution. The trick was to disassemble the .bat call and parametrize all that we could.

    Our main reason for doing this (against certain recommendations) is that bigger Jobs take longer to compile and are more prone to corruption issues.

  5. abilash n

    Hi Anoop,

    Excellent blog with an easy-to-understand flow in pictorial format. I am working in ECC and BI. I just want to know: if BODS wants data from ECC, what is the procedure and what steps should be followed? What programs or function modules are useful? If you can reply to this post or write a new blog, that would be amazing.

  6. Prerna Rathore

    Hi Anoop,

    While scheduling Jobs using WebServices, can you please describe the step where we have to take input for the Query transform (Call_Job2)? What should be the input for the transform?

    And what will be the schema for the XML target?

    1. Venkata Ramana Paidi

      Hi Prerna,

      For the XML target you have to use a template XML. It acts like a template table and takes the structure of the output schema.

      Here, for the two data flows, the source is a Row_Generation transform only. In the first data flow (Call_FirstJob) we are calling the first job. Then we are using the script (FirstJobStatus) to decide whether the second job will run or not. In the script, if the first job's status is failed we raise an exception; otherwise we execute the second job.

      Here everything is controlled within the script only. We can say the second data flow is a replica of the first data flow except for the job parameters.

      1. SHAIK ABDUL RASHEED

        Hello Venkat/All,

        I have tried doing this using WS but am getting the attached error. Can you please let me know if I am making any mistake here? In the first Query transform I am calling Row_Generation but still getting this error. Also, when I call either the Row_Generation or the first Query transform in the second Query transform, it is unable to find the matching criteria. Please advise.

        Thanks,

        Abdulrasheed

        [Attachment: Scheduling_Job_Error.JPG]

  7. Karan Nakra

    Hi Anoop

    I have been applying the same method to a scenario where I have two jobs under a parent job and I want only one of them to be executed on the basis of a parameter value.
    I am using the webservices to execute the parent job.
    But I am facing some issues in the XML file, or maybe in its schema.
    So could you please elaborate on the XML part?

  8. Sameer Hussain

    Hi Anoop,

    Thanks for sharing this informative document.

    But I have a question here. In our requirement we are extracting data from BW (the source), and we have jobs which run on an hourly basis. Sometimes, due to huge volume, the first job takes more time than expected, and the second job, which runs on a schedule, proceeds further, failing the first job. So we came up with a solution of maintaining some script logic at the Designer level so that the first job proceeds as-is while the second job is failed; this is in progress.

    But apart from that, we are looking for some other solution which hints the source system directly before the trigger is sent to BODS, like eventing or a script at the backend.

    So can you please throw some light on this..

    Thanks,

    Sameer

  9. Suri Mareddy

    I like the approach, but it's hard to address recovery using this method.

    Say Job2 fails: I'd like to fix and restart Job2 and expect the dependency to flow through, so Job3 would follow the successful completion of Job2.

    To address this, I am trying to use something like trigger files using the built-in function wait_for_file().

    Will post more on what I find out.

  10. Achille Masserano

    Hi Anoop,

    Great document !!!

    I'm just not sure the dataflow that calls Job1 via WS will wait until its execution ends. I may be wrong, but I think it will start the job and then move forward to the next step without waiting for Job1 to end.

    In that case a wait loop should be included between the two dataflows. Inside the loop you would need to call another WS function to check Job1's status and loop until it ends.

    Does this make sense?

  11. Douglas Ricardo Correa

    Hi guys…
    First of all, thanks for your tutorial; I really appreciate it.

    I am facing a problem. When I run the main job I get the message: "The input line is too long." It's not an error message, but my job finishes without doing anything.

    thanks in advance.

