The BODS Scheduler does not have an out-of-the-box option for event-based scheduling. If one Job has to be triggered after another completes successfully, this is usually done with custom scripts or a third-party scheduler. In this post, we will see how to achieve the same with the BOE Scheduler.
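For context, the custom-script workaround typically chains the job launchers and checks the exit code of the first before starting the second. Here is a minimal Python sketch of that pattern; the launcher paths and Job names are hypothetical.

```python
import subprocess
import sys

# Hypothetical launcher scripts for two Data Services batch Jobs;
# the real paths depend on how the Jobs are exported in your environment.
FIRST_JOB = "/opt/ds/launchers/Job_Load_Staging.sh"
SECOND_JOB = "/opt/ds/launchers/Job_Load_Mart.sh"

# Run the first Job; a non-zero exit code means it failed.
first = subprocess.run([FIRST_JOB])
if first.returncode != 0:
    sys.exit("First Job failed; skipping the dependent Job.")

# Only reached after a successful first run.
subprocess.run([SECOND_JOB], check=True)
```

With the BOE Scheduler, this dependency is declared through Events instead of being hand-coded.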

Configuring the BOE Scheduler

The CMS has to be linked to Data Services so that the Jobs can be created on the BOE server. Log in to the Data Services Management Console and navigate to the CMS connection under the Management section.

Expand the user account credentials for executing the program and enter the server credentials of the Data Services service account. Click Apply to save. These credentials will now be used to run the program.

Scheduling the Job

The Job will be scheduled in the BOE Scheduler instead of the Data Services Scheduler. In the Data Services Management Console, navigate to the Repository that contains the Job to be scheduled.

Click Add Schedule in the Batch Job Configuration. On the Schedule page, select the BOE Scheduler instead of the Data Services Scheduler and fill in the schedule time and the remaining settings.

Schedule all the event-dependent Jobs in the same way.

Configuring the Event

The Jobs can be found in the Data Services folder in the IPS (Information Platform Services) of Data Services. Log in to the CMC and you will see the folder at the root level.

The Jobs are exported along with the configurations necessary to run them; those parameters are stored in the accompanying text file.
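To make this concrete, the exported program is essentially a launcher that reads its parameters from that text file, and the BOE Scheduler simply executes it. Below is a minimal sketch for inspecting and test-running it manually; the file names are hypothetical.

```python
import subprocess
from pathlib import Path

# Hypothetical names; the actual launcher and parameter file are generated
# by Data Services when the schedule is created on the BOE server.
LAUNCHER = Path("/opt/boe/programs/Job_Load_Staging.sh")
PARAM_FILE = Path("/opt/boe/programs/Job_Load_Staging.txt")

# Inspect the generated run-time parameters.
print(PARAM_FILE.read_text())

# Run the launcher manually to verify it executes under the service account.
subprocess.run([str(LAUNCHER)], check=True)
```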

Let’s create the Event. Navigate to Events in the CMC and create a Schedule system Event for the Job. This Event will be fired when the Job runs successfully.

In the Properties of the first Job (the one that triggers the Event), go to the Schedule section, navigate to Events, and add the created Event to the list of Events to be triggered on completion.

Modify the schedule and save it. Similarly, for the sequential Job, add the same Event as an Event to wait for and modify its schedule.

Your schedule now runs within BOE: whenever the first Job completes successfully, it triggers the Event, which causes the second Job to run. You can add more triggers to build out the flow, set up email notifications, and monitor the Job history.
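If you also want to check runs outside the console, the Data Services local repository keeps execution history in its AL_HISTORY table. Below is a minimal sketch, assuming a SQL Server repository reachable via pyodbc; the connection string and the column names (SERVICE, HAS_ERROR) are assumptions to verify against your repository version.

```python
import pyodbc

# Illustrative connection string; point it at your DS repository database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=repo-host;DATABASE=DS_REPO;UID=ds_user;PWD=secret"
)

# SERVICE holds the Job name; HAS_ERROR is non-zero for failed runs.
# Column names assumed from common repository layouts; verify against yours.
rows = conn.cursor().execute(
    """
    SELECT SERVICE, START_TIME, END_TIME, HAS_ERROR
    FROM AL_HISTORY
    ORDER BY START_TIME DESC
    """
).fetchmany(10)

for service, start, end, has_error in rows:
    status = "FAILED" if has_error else "OK"
    print(f"{service}: {start} -> {end} [{status}]")
```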

Note:

The service account running Data Services must have permission to run the scripts and the service on the server. If Data Services is deployed on Linux, the service account should also have rsh (remote shell) execution set up.
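As a quick sanity check for the Linux case, you can verify rsh connectivity from the BOE host with something like the sketch below; the host name and account are placeholders.

```python
import subprocess

# Placeholders; replace with your DS server and service account.
DS_HOST = "ds-server"
SERVICE_ACCOUNT = "dsadmin"

# 'rsh -l user host command' runs a command remotely; a zero exit code
# means the service account can execute commands on the DS server.
result = subprocess.run(
    ["rsh", "-l", SERVICE_ACCOUNT, DS_HOST, "echo rsh-ok"],
    capture_output=True, text=True
)
print(result.stdout or result.stderr)
```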
