SAP Business Objects Data Services Workbench 4.1


1.1     DS Introduction

SAP BusinessObjects Data Services delivers a single enterprise-class solution for data integration, data quality, data profiling, and text data processing that allows you to integrate, transform, improve, and deliver trusted data to critical business processes. It provides one development UI, metadata repository, data connectivity layer, run-time environment, and management console, enabling IT organizations to lower total cost of ownership and accelerate time to value. With SAP BusinessObjects Data Services, IT organizations can maximize operational efficiency with a single solution that improves data quality and provides access to heterogeneous sources and applications.

1.2     Data Services Workbench

The Data Services Workbench is a companion application that provides a graphical user interface (GUI) development environment, simplifying the application logic needed to migrate data and database schema information between different databases in a data warehousing environment.

1.3     Key features of the Workbench
  • Browsing table metadata and data.
  • Selecting individual source tables or multiple tables for migration.
  • Specifying the order in which the source tables should be migrated.
  • Adjusting the table schema in detail. For example, adding or removing columns, defining constraints, partitions, indexes, and so on.
  • Specifying filters and simple projection expressions.
  • Specifying source and target table options such as array fetch size and bulk-loading options.
  • Executing a replication job as an initial load or delta load.

1.4     Drawbacks in Previous DS

  • In previous versions, migrating data and schema information required creating many dataflows in the Designer, with each dataflow reading from a single source table and writing to a template or permanent target table.
  • In addition, incompatibilities between the source and target database types could require manual schema and data corrections.

1.5     Added Advantages in Workbench

  • The Data Services Workbench automates the migration process. Instead of creating many dataflows manually, we can now provide connection information for the source and target databases and select the tables that we want to migrate.
  • The Workbench automatically creates Data Services jobs, workflows, and datastores and imports them into a Data Services repository.
  • We can execute jobs and monitor their status from within the Workbench.
  • The Workbench supports advanced options such as bulk loading and delta loading.
  • Jobs created in the Workbench can be scheduled with the Data Services Management Console, and the generated objects can also be used as a starting point for further editing within the Data Services Designer.

       (For example, we might need to add more advanced transformations that are not available directly in the Workbench.)

  • Workbench supports migration from Data Services-supported databases and SAP applications to SAP HANA, Sybase IQ, Sybase ASE, Oracle, Microsoft SQL Server, DB2, and Teradata targets.

1.6     Steps to proceed with the Workbench

  1. Launch the Workbench.

  (The Workbench can be accessed from the Windows Start Menu: All Programs > SAP BusinessObjects Data Services 4.1 > Data Services Workbench.)

  2. Enter your user credentials for the CMS:
    • System/server name – The name of the CMS system to log into.
    • User name – The user name to use to log into the CMS.
    • Password – The password to use to log into the CMS.
    • Authentication mode – The authentication type used by the CMS.
  3. Click Connect.

[Screenshot: Workbench logon screen]

The Workbench can perform two types of migration tasks:

  1. Quick Replication Migration
  2. Detail Replication Migration

1.7     Quick Replication Migration

Selecting Start a Replication Wizard after login starts the Quick Replication Migration process.

[Screenshot: Start a Replication Wizard option]

Below are the steps to be performed to achieve the results:

  1. Provide the project name.
  2. Provide the source database details.
  3. Select the source database tables.
  4. Provide the target database details.
  5. Select the execution properties.
  6. Review the final status & report.

[Screenshots: project name, source database details, source table selection, target database details, execution properties, and final status & report screens]


1.8     Advantages & Usability of the Quick Replication Process:

  • Usable when a 1:1 source-to-target mapping is needed and no transformations are involved.
  • Any number of tables can be replicated in a single run.
  • Modifications such as filters or conditions on the tables can be implemented at a later point.

1.9       Detail Replication Migration

Here, at the project level, we can change the properties and alter or add the logic.

This is similar to the Query transform in the Designer: it allows us to apply conditions or filters to the data, and most functions related to conversion, date, database, aggregate, lookup, math, and so on can be applied directly from the Workbench.

Apart from this, the Workbench also has a handy feature for creating or deleting an index on a table from the GUI itself.
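The mapping logic described above can be pictured as a small filter-and-project step. The following Python sketch (with made-up column names and functions) illustrates the kind of transformation a Query-transform-style mapping performs; it is not how Data Services implements it internally:

```python
# Rough analogue of a Query-transform-style mapping: filter rows,
# then project and transform columns. All names are illustrative.
rows = [
    {"cust_id": 1, "name": "ann",  "amount": "250.5"},
    {"cust_id": 2, "name": "bob",  "amount": "90.0"},
    {"cust_id": 3, "name": "carl", "amount": "410.0"},
]

def query_transform(rows):
    out = []
    for r in rows:
        amount = float(r["amount"])          # conversion function
        if amount > 100:                     # filter condition
            out.append({
                "CUST_ID": r["cust_id"],
                "NAME": r["name"].upper(),   # string function
                "AMOUNT": round(amount, 1),  # math function
            })
    return out

print(query_transform(rows))  # rows 1 and 3 survive the filter
```

In the Workbench, the same effect is achieved by setting the filter and projection expressions on the table mapping rather than writing code.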


[Screenshot: Detail Replication editor]


1.10    Functions Library


[Screenshot: functions library]


1.11    Data Analysis Feature in the Workbench

Analysis of source or target table data can be performed directly in the Workbench.

The results can be displayed as charts or tables.


[Screenshot: data analysis view]


1.12    Delta Load Mechanism using the Workbench

Delta load jobs move only the rows that have been added or modified since the last time the job was executed. There are many reasons to implement a delta load job, but the most common is to reduce the time the loading process takes.

Instead of loading millions of rows each time the job is run, we can process only the few that have changed.

Another reason is to maintain historical data: we might need to keep the old data in our data warehouse while adding the current-state data, so that we can see how the data changes over time.
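Conceptually, a delta load simply filters the source on a change timestamp. A minimal Python sketch of the idea, with invented rows and column names:

```python
from datetime import datetime

# Hypothetical in-memory source table; in Data Services these would be
# database rows read through a datastore.
source_rows = [
    {"id": 1, "name": "alpha", "modified": datetime(2015, 1, 10)},
    {"id": 2, "name": "beta",  "modified": datetime(2015, 3, 5)},
    {"id": 3, "name": "gamma", "modified": datetime(2015, 4, 1)},
]

def delta_rows(rows, last_run):
    """Return only rows added or modified since the previous job run."""
    return [r for r in rows if r["modified"] > last_run]

last_run = datetime(2015, 2, 1)  # end time of the previous execution
changed = delta_rows(source_rows, last_run)
print([r["id"] for r in changed])  # → [2, 3]
```

Only rows 2 and 3 changed after the last run, so only those two are loaded; this is what saves the time on million-row tables.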

1.13    Scenarios for Delta Load Types & Configuration


Replication Behavior – Schema & data

Table Level -> Delta Load -> select No Delta Load

Job Server Properties -> select Initial Load

This will create a job consisting of 3 workflows:

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.
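The LOAD_STATUS bookkeeping in workflow 3 follows a delete-then-insert pattern. An illustrative sketch using SQLite (the real job writes to the configured target datastore, and the exact table and column names, such as JOB_NAME, are assumptions):

```python
import sqlite3
from datetime import datetime

# In-memory stand-in for the target database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS LOAD_STATUS "
    "(JOB_NAME TEXT, END_DATETIME TEXT)"
)

def record_run(conn, job_name, end_dt):
    """Delete the old status record (if any) and insert the new one."""
    conn.execute("DELETE FROM LOAD_STATUS WHERE JOB_NAME = ?", (job_name,))
    conn.execute(
        "INSERT INTO LOAD_STATUS (JOB_NAME, END_DATETIME) VALUES (?, ?)",
        (job_name, end_dt.isoformat(sep=" ")),
    )
    conn.commit()

record_run(conn, "REPL_JOB", datetime(2015, 4, 1, 12, 0))
record_run(conn, "REPL_JOB", datetime(2015, 4, 2, 12, 0))  # replaces old row
rows = conn.execute("SELECT END_DATETIME FROM LOAD_STATUS").fetchall()
print(rows)  # a single row holding the latest end time
```

Keeping exactly one row per job is what lets a later delta run read a single unambiguous LAST_RUN value.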

Replication Behavior – Schema & data

Table Level -> Delta Load -> select No Delta Load

Job Server Properties -> select Delta Load

This requires a delta load job to already exist in the repository.

Replication Behavior – Schema & data

Table Level -> Delta Load -> select Reload the full table

Job Server Properties -> select Initial Load

This will create a job consisting of 3 workflows:

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.


Replication Behavior – Schema & data

Table Level -> Delta Load -> select Reload the full table

Job Server Properties -> select Delta Load

This will create 2 jobs, each consisting of 3 workflows.

Job 1 – Initial Load

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.

Job 2 – Delta Load

  1. Select the LAST_RUN date from table LOAD_STATUS.
  2. Load data into the above-created table. (Default loading type = Truncate.)
  3. Update the old run record with the new END_DATETIME in table LOAD_STATUS.

Replication Behavior – Schema & data

Table Level -> Delta Load -> select Use Timestamp or Date Column

Job Server Properties -> select Initial Load

This will create 2 jobs, each consisting of 3 workflows, but it will execute only the initial load job.

Job 1 – Initial Load

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.

Job 2 – Delta Load

  1. Select the LAST_RUN date from table LOAD_STATUS.
  2. Load data into the above-created table (loading type = Truncate/Append), filtered by:

     ([Table Column Date] > $START_DATETIME) AND ([Table Column Date] <= $END_DATETIME)

  3. Update the old run record with the new END_DATETIME in table LOAD_STATUS.
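The date-range predicate used by the delta job can be thought of as a generated WHERE clause. A sketch of building it (the column name and datetime formatting are illustrative; Data Services constructs this filter internally from the $START_DATETIME and $END_DATETIME variables):

```python
from datetime import datetime

def delta_filter(date_column, start_dt, end_dt):
    """Build the half-open date-range predicate the delta job applies."""
    return "({col} > '{start}') AND ({col} <= '{end}')".format(
        col=date_column,
        start=start_dt.isoformat(sep=" "),
        end=end_dt.isoformat(sep=" "),
    )

pred = delta_filter("LAST_MODIFIED",
                    datetime(2015, 4, 1, 0, 0),
                    datetime(2015, 4, 2, 0, 0))
print(pred)
# → (LAST_MODIFIED > '2015-04-01 00:00:00') AND (LAST_MODIFIED <= '2015-04-02 00:00:00')
```

The strict lower bound and inclusive upper bound ensure consecutive runs neither skip nor double-load rows whose timestamp falls exactly on a run boundary.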

Replication Behavior – Schema & data

Table Level -> Delta Load -> select Use Timestamp or Date Column, without providing END_TIME

Job Server Properties -> select Delta Load, and loading type = Append

This will create 2 jobs, each consisting of 3 workflows, in the repository, but it will execute only the delta load job.

Job 1 – Initial Load

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.

Job 2 – Delta Load

  1. Select the LAST_RUN date from table LOAD_STATUS.
  2. Load data into the above-created table (loading type = Truncate/Append), filtered by:

     ([Table Column Date] > $START_DATETIME) AND ([Table Column Date] <= $END_DATETIME)

  3. Update the old run record with the new END_DATETIME in table LOAD_STATUS.

Replication Behavior – Schema & data

Table Level -> Delta Load -> select Use Timestamp or Date Column, providing END_TIME

Job Server Properties -> select Delta Load, and loading type = Truncate

This will create 2 jobs, each consisting of 3 workflows, in the repository, but it will execute only the delta load job.

Job 1 – Initial Load

  1. Drop the existing schema/table and create the table again.
  2. Load data into the above-created table. (Default loading type = Append.)
  3. Create a table LOAD_STATUS for storing the job status and insert a new record with the job's last-run end date; if the table already exists, delete the old record and insert the new record.

Job 2 – Delta Load

  1. Select the LAST_RUN date from table LOAD_STATUS.
  2. Load data into the above-created table (loading type = Truncate/Append), filtered by:

     ([Table Column Date] > $START_DATETIME) AND ([Table Column Date] <= $END_DATETIME)

  3. Update the old run record with the new END_DATETIME in table LOAD_STATUS.

1.14    Monitoring editor

The Workbench monitoring editor opens in the workspace when we click the icon in the toolbar or select it from the Tools menu.

The monitoring editor consists of several areas:

  • Job selection toolbar: Displays the Data Services jobs available in the repository to which the Workbench is connected and allows us to execute them.

  • Execution history pane: Displays the execution history for the selected job. Each line indicates the current status, execution date, and duration of a single job execution instance.

  • Execution history dashboard pane: Displays a graphical representation of the history for the selected job. We can change the type of chart by choosing from the drop-down box in the upper right corner of the pane.

  • Execution details pane: Displays the details for the individual job execution instance selected in the execution history pane.

We can view the trace, monitor, and error logs for the execution instance, as well as the execution statistics.

[Screenshot: monitoring editor]


