
SAP Cloud for Customer Data Migration

Manual Migration

Is executed by your project team members via the regular transactions in the SAP Cloud for Customer solution

Drivers for selecting a manual migration approach:

  • Low data volumes
  • Legacy data is too unstructured
  • Migration template for a specific object is not available

Tool Supported Migration

Is executed by your project team leveraging the SAP Cloud for Customer Migration Workbench and/or Data Workbench

Drivers for selecting a template-based approach:

  • High data volumes
  • Available time period to load the data in the Production system
    For example, if all data needs to be loaded over a weekend, manual migration is not possible in most cases
  • Requirement for performing multiple data loads in different systems
    You can fill the template once and load it multiple times, for example, for test loads in the Test system and final load in the Production system

Initial Load (Integration)

Is executed by your project team, leveraging the iFlows set up in your middleware-based integration

Drivers for selecting an Integration based approach:

  • High data volumes
  • ID mapping
  • Preferred method in case integration is in scope

Tool Supported Migration

Migration Workbench

The Migration Workbench is embedded in the Activity List of the Implementation Project in Business Configuration work center. It is used for migration and for mass change of data during production use. It is based on SOAP web services.

You can find more information on the Migration Workbench tool in the blog

Mass Data Maintenance

The Mass Data Maintenance work center view is used for mass changes of data, where the data is first extracted from the system, modified in an external tool such as MS Excel, and written back to C4C. It is based on the Migration Workbench infrastructure. Other update scenarios are not supported and need to be performed via the regular Migration Workbench.

Data Workbench

The Data Workbench work center is a one-stop shop for all data migration and data maintenance needs. It is based on OData web services. Data Workbench should be the preferred mode of data migration in the following scenarios:

  • Import of Custom Business Objects
  • Import of Attachments (whenever byte type attachment files are used)
  • Import of Business Objects not supported by Migration Workbench:
    • Sales: Marketing Attribute, Marketing Permission, Sales order, Sales Territory, Deal Registration
    • Service: Contracts, Measurement Points, Measurement Documents, Maintenance Plan, Skills Master Data, Skills in Employee/Customer
  • Continuous Updates (i.e. individual object updates after initial load)
  • Export of business objects for mass updates

The Data Workbench tool is available via the Data Workbench work center.

With the Data Workbench, you can:

  • Import new records or update existing records in the system for standard and custom objects
  • Export records from the system
  • Create templates to maintain reusable field and code list mappings*
  • Monitor the progress of the migration activity

* Currently supported only for individual BO root & sub-nodes

How to use the Data Workbench tool?

Creating Reusable Templates

A template is a mapping of the fields between two files (or systems). The data to be imported into C4C may come from a system that manages it in its own format. In such a scenario, it is easier to create a mapping between the two formats that can be reused for data import whenever required. You can use templates to preconfigure field mappings and codelist mappings between the columns in the .csv file and the Business Object node. The template can then be used for importing the data without the need to create the mapping again.

Once a template is created, it can be copied, renamed, updated and deleted.

The upload/download functionality can be used to reuse the template across multiple C4C tenants. The downloaded file has the extension .dwbtmpl.

You can access the metadata .csv file by selecting the template and clicking the Metadata action. You can then maintain the field mappings and codelist mappings in the .csv file instead of working online in the tool.
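The mappings in the metadata .csv can be maintained with any tool that handles CSV. As a minimal illustration (the column and field names below are hypothetical, not the actual metadata format), a field-mapping file can be applied offline to rename legacy columns to the BO node fields before upload:

```python
import csv
import io

# Hypothetical field-mapping file: source .csv column -> Business Object node field.
mapping_csv = """SourceColumn,TargetField
CustName,Name
CustCity,City
"""

# Source data exported from the legacy system.
data_csv = """CustName,CustCity
ACME Corp,Berlin
"""

# Build the column mapping from the mapping file.
mapping = {row["SourceColumn"]: row["TargetField"]
           for row in csv.DictReader(io.StringIO(mapping_csv))}

# Rename the columns so the data file matches the BO node structure.
reader = csv.DictReader(io.StringIO(data_csv))
rows = [{mapping.get(k, k): v for k, v in row.items()} for row in reader]

print(rows[0]["Name"])  # ACME Corp
```

The same renaming step works regardless of how many columns the legacy file carries; unmapped columns pass through unchanged.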

You can start the creation of a reusable template by selecting the relevant Business Object node and uploading a data file containing header information with or without data records.

 

You can then maintain new mappings or change existing mappings between the .csv format and the Business Object node structure.

You can also maintain the internal and external codelist mappings for the corresponding fields. This codelist mapping can also be saved as a template (using the Save as Template option), which will then appear in the Templates work center.

If you have an existing codelist mapping template saved from an earlier migration activity and it is relevant for this field, you can reuse it by selecting it from the dropdown.

You can also configure the default internal code value for external codes that have not been mapped, using the Select Internal Code for unmapped fields option in the UI.

If you do not have a reusable codelist template and you want to work on the codelist mapping offline because the mapping is expected to have many entries (country codes, for example), you can download it as a .csv file, maintain the mappings, and upload them back into the tool using the Download and Upload options provided in the UI.
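Offline codelist maintenance boils down to a two-column external-to-internal lookup plus a fallback for unmapped codes. A small sketch (the file layout and code values are illustrative, not the tool's actual format):

```python
import csv
import io

# Hypothetical codelist file: external code -> internal C4C code.
codelist_csv = """ExternalCode,InternalCode
GER,DE
USA,US
"""

code_map = {row["ExternalCode"]: row["InternalCode"]
            for row in csv.DictReader(io.StringIO(codelist_csv))}

# Default internal code for external values that were never mapped
# (mirrors the "Select Internal Code for unmapped fields" option).
DEFAULT_CODE = "XX"

def to_internal(external_code):
    return code_map.get(external_code, DEFAULT_CODE)

print(to_internal("GER"))  # DE
print(to_internal("FRA"))  # XX (unmapped, falls back to the default)
```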

Note: The creation of templates is supported only for individual BO root and sub-nodes; it is not available for complete BO import.

Import

With this functionality, you can import a complete business object, for example, Accounts with all key header information, Addresses, Account Team, Sales Data, etc.

Import Complete Business Object

This option should be used to import data for all the nodes of a Business Object. For example, if you choose Accounts, you can import the Account header information, Addresses, Contacts, Sales Data, etc. at once.

The typical use case for complete BO import is the initial data load. It is also used when you have introduced a new process in C4C and want to import the data for the associated Business Objects.

It is mandatory to specify the External Key for all the records that you want to import. As the External Keys are used to maintain the association between root and sub-nodes, this ensures easy import of sub-node data using the parent External Key as reference.
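Because the parent External Key is the only link between the root and sub-node files, it is worth checking the files for orphaned sub-node rows before upload. A minimal sketch, assuming hypothetical column names:

```python
import csv
import io

# Hypothetical root-node file: each account carries its own external key.
root_csv = """ExternalKey,Name
ACC-001,ACME Corp
ACC-002,Globex
"""

# Hypothetical sub-node file: each address references its parent's key.
address_csv = """ParentExternalKey,City
ACC-001,Berlin
ACC-003,Paris
"""

root_keys = {row["ExternalKey"] for row in csv.DictReader(io.StringIO(root_csv))}

# Sub-node rows whose parent key has no matching root record would fail
# during import, so flag them up front.
orphans = [row for row in csv.DictReader(io.StringIO(address_csv))
           if row["ParentExternalKey"] not in root_keys]

print([row["City"] for row in orphans])  # ['Paris']
```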

Click the Download button to download a zip file to the selected path. The zip file contains the folders for maintaining the records to be imported, the codelist mappings, and the field definitions for the objects.

If you choose the import of an individual object, there are no template and codelist folders; the corresponding .csv files are included in the zip file directly.

The templates folder contains individual .csv files with all the fields/columns required for import. The records that need to be imported are maintained in these .csv files.

The codelist folder contains individual .csv files for maintaining the codelist mappings for the various fields

The Field Definition folder contains individual .csv files for each node. These files guide you through the migration activity by providing the mapping between the names in the .csv file and those in the UI, and they also indicate the codelist mapping used for each field.

The import operations (Insert, Upsert, and Update) will also be available for the migration of complete Business Objects in future releases.

Import Individual Objects

This option should be used when you want to import or enhance the data for a business object node. The typical use case for this mode of import is during cutover migration or after go-live.

With this functionality, you can import Business Object node data, for example, Sales Data for Accounts.

The import operation can be classified into three types:

Insert

You can use this operation when you are importing data into the system for the first time and you want to create new records.

If you have predefined mappings between the fields in the file and the BO maintained in a template, you can select it during the import process.

External Keys should be maintained in the .csv file during the initial import of the object (especially for root node information).

Upsert

You can use this operation to create new data or update existing data, especially when you are not sure whether the data already exists in the system, for example, the addition of a new member to an Account sales team. It is mandatory to provide the External Keys to avoid duplicate records during subsequent imports.

When you want to insert or upsert the data, you can either use the .csv file obtained through the Download Metadata action or use templates preconfigured in the system.

Update

You can use this operation to update the data that exists in the system. For example, update of mobile numbers of existing contacts.

To update the data, export it using the DWB tool, make the changes in the .csv file, and upload it.

You can select either of the two supported update modes. Choose the Ignore Blank Values option if you want the tool to leave the database value of a field unchanged wherever the corresponding cell in the .csv file is blank. Choose the Update Blank Values option if you want the tool to overwrite the values in the database with the blank values.
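The difference between the two modes can be expressed as a simple merge rule. A sketch (the field names are hypothetical, not the tool's internals):

```python
# "Ignore Blank Values" keeps the stored value when the .csv cell is empty;
# "Update Blank Values" overwrites the stored value with the blank.

def apply_update(existing, incoming, ignore_blanks=True):
    updated = dict(existing)
    for field, value in incoming.items():
        if value == "" and ignore_blanks:
            continue  # blank cell: leave the database value untouched
        updated[field] = value
    return updated

record = {"Mobile": "+49 111", "Email": "old@example.com"}
row = {"Mobile": "+49 222", "Email": ""}

print(apply_update(record, row, ignore_blanks=True))
# {'Mobile': '+49 222', 'Email': 'old@example.com'}
print(apply_update(record, row, ignore_blanks=False))
# {'Mobile': '+49 222', 'Email': ''}
```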

Note: You can insert new entries using the Update operation by keeping the Object ID field blank, especially in cases where you do not have the external key of the root node.

Consider the scenario where you need to add an existing territory to the sales territories of an Account. You should export the Sales Territory node for the Account using the Export functionality, update the information for the territory in the file, remove the value in the Object ID field, and then upload the file using the Update operation. This ensures that the new sales territories are inserted into the business object node.

To delete a record, add an additional column, ToBeDeleted, at the end of the exported .csv file and mark the records to be deleted as True.
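Appending the ToBeDeleted column can be scripted instead of edited by hand. A minimal sketch with hypothetical columns:

```python
import csv
import io

# Exported sub-node file (column names hypothetical).
exported_csv = """ObjectID,TerritoryID
101,T-NORTH
102,T-SOUTH
"""

to_delete = {"102"}  # records to remove on the next update run

reader = csv.DictReader(io.StringIO(exported_csv))
out = io.StringIO()
# Append the ToBeDeleted column after the existing columns.
writer = csv.DictWriter(out, fieldnames=reader.fieldnames + ["ToBeDeleted"])
writer.writeheader()
for row in reader:
    row["ToBeDeleted"] = "True" if row["ObjectID"] in to_delete else ""
    writer.writerow(row)

print(out.getvalue())
```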

Import Attachments

You can refer to this blog link to understand how to import attachments into C4C using the data workbench tool.

Export

You can use the export functionality to export Business Object node data into .csv files. You can restrict the data to be exported by specifying filters, for example, exporting only those Account headers where the Account Classification is A Accounts.
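Since the Data Workbench is based on OData, such an export restriction corresponds to an OData $filter expression. The tenant host, entity set, and property names below are illustrative, not the real c4codata names; check the $metadata of your tenant's service for the actual ones:

```python
from urllib.parse import quote

# Restriction "only A-classified accounts" expressed as an OData $filter.
base = "https://my-tenant.example.com/sap/byd/odata/v1/c4codata"
entity = "AccountCollection"            # hypothetical entity set name
filter_expr = "AccountClassificationCode eq 'A'"  # hypothetical property

# URL-encode the filter expression so spaces and quotes survive transport.
url = f"{base}/{entity}?$filter={quote(filter_expr)}"
print(url)
```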

Monitor

The Monitor view is a one-stop shop for monitoring the results of all operations triggered from the Data Workbench work center.

All tasks are given a unique ID consisting of the object being worked on and the timestamp of the operation. The type of operation is categorized in the format <Operation> <Object type>, for example, Insert Individual Object, Import Complete Business Object, or Export. The Business Object node details are also shown, along with the .csv file containing the information on the data records imported into or exported from the system.

The status indicates whether the migration activity is Not Started, In Process, Data Processed, or Finished/Interrupted.

The elapsed time gives an indication of the time taken to reach the Finished state. Elapsed time is hidden by default and needs to be enabled via user personalization.

Other key information includes the total number of records processed and not processed, and whether the processed records succeeded or failed. In case of an error, a suitable message is displayed alongside the record. The unprocessed and errored-out records are available in a .csv file for further analysis and processing.


10 Comments


  1. Former Member

    SAP is big software for managing business operations and customer relation. For online assignment help I use sap software which is good for customer data migration and tool supported migration. I am IT graduate student and i worked as an internee and work on SAP. Got alot of information through this platform. Thanks.

  2. Former Member

     

    Hi Krishnan,

thanks for this excellent explanation of the data workbench. Update works perfectly fine; I’ve done this for account team members, but I was wondering how to delete existing account team members from the account team tab. Do you have a hint how to delete those assignments?

    thanks Gabriele

    1. Dhruvin Mehta

      Hi,

      In order to delete a record, add an additional column – ToBeDeleted at end of the exported .csv file and mark the records to be deleted as True.

I think when you download the file there should be one column; mark it True and then do the update, and it should work. If it doesn’t work, raise an SCN Discussion 🙂

      Regards,

      Dhruvin

  3. Dhruvin Mehta

    Hey,

Nice information. The Data Workbench is a really nice and fast tool (maybe due to OData directly hitting the HANA DB, not sure though 😉).

I sometimes face a strange issue in DWB: once I upload the file, the data sometimes takes an unbelievably long time to get updated while monitoring. One day 18k entries took less than 5-10 mins, but updating the same individual object with 1k records took more than 2 hours. Is there a way we can analyze what is happening and where it is stuck in monitoring?

     

    Regards,

    Dhruvin

  4. Deborah Albrecht

    Hi Kavya Krishnan, 

    thank you for the great blog!

    One question: How to extend the data workbench with custom fields from the Key-User Tool (Adapt – Edit Masterlayout)?

    I read the thread https://archive.sap.com/discussions/thread/3881535 and there it’s stated that the odata service has to be extended with the custom field through the field definitions (Odata-Service – c4codata). However, I extended the odata service with our custom field and can see this in the metadata (https://<c4c tenant base URL>/sap/byd/odata/v1/c4codata/$metadata), but I cannot see the custom field in the data workbench while filtering the data or in the exported file.

    What to do – can you recommend something?

    Regards,

    Deborah

    1. Kavya Krishnan
      Post author

      Dear Deborah,

Sorry for the delay. Please make sure that the OData service that you select is Data Workbench enabled. You can check this in Administrator -> OData Service Explorer.

      Regards,

      Kavya

       

       

      1. Deborah Albrecht

        Dear Kavya,

thanks for your reply. I checked the OData Service Explorer. There I see that the SAP OData services c4codata and c4codataapi are both not Data Workbench enabled, but I cannot change this (greyed out). So does this mean that if I want to use custom fields in the Data Workbench, I have to create a custom OData service?

        Regards,

        Deborah

         

        1. Kavya Krishnan
          Post author

          Dear Deborah,

Yes, for now, you need to create a custom OData service. However, we do have application teams creating DWB-enabled OData services that will appear independently in the explorer.

          Regards,

          Kavya

