
In the first version of the S/4HANA Migration Cockpit, the data source was always an Excel spreadsheet.

See the blog post https://blogs.sap.com/2017/02/28/getting-started-with-the-s4hana-migration-cockpit-onpremise/ for an introduction to the S/4HANA Migration Cockpit.

Now you can also use database tables as a source for your migration project. These are so-called ‘staging tables’: you extract the data from the source system into these staging tables and then import the data from there with the S/4HANA Migration Cockpit.

The advantages are:

  • faster overall process (export/import), fewer clicks
  • better performance
  • database tools can be used to extract and transform the data

Attention: this feature is available as of release S/4HANA 1709 FPS1.

 

How does it work?

First we need to create a database connection. This can be done via /NDBCO or /NDBACOCKPIT.

Enter the connection parameters for the staging database; the user is also the owner of the DB schema that is used.

The staging tables can reside in a remote database or in the target S/4HANA database (but in a separate schema).
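For external tools this separate schema means the staging tables are always addressed with a fully qualified name. A minimal sketch in Python (the schema and table names here are invented placeholders, since the generated names differ per system):

```python
# Sketch: staging tables live in a dedicated schema, so external tools
# address them with a quoted, fully qualified name. The names used here
# are hypothetical placeholders.

def qualified_name(schema: str, table: str) -> str:
    """Quote both parts, since generated staging table names contain '/'."""
    return f'"{schema}"."{table}"'

print(qualified_name("STAGING", "/1LT/DSS4B000003"))
# → "STAGING"."/1LT/DSS4B000003"
```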

 

Example:

 

Now we can create a new migration project in transaction /NLTMC:

Select ‘Staging Tables’ here and choose the database connection.

 

When opening a migration object, in this case ‘Customer’, the database tables are created automatically based on the import structure definition of the migration object.

You can also check these structures (or even enhance them) with the Migration Object Modeler; see https://blogs.sap.com/2017/06/20/discover-the-s4hana-migration-cockpit-migration-object-modeler-onpremise-nltmom/

Now we can fill the tables with any database tool.
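For example, an external loader could generate a parameterized INSERT for a staging table and push the extracted rows through any database driver (for SAP HANA e.g. the hdbcli Python client). This is only a sketch; the staging table and column names are placeholders:

```python
# Sketch: build a parameterized INSERT for a staging table. The table and
# column names below are hypothetical placeholders; in a real run they come
# from the generated staging tables.

def build_insert_sql(table: str, columns: list[str]) -> str:
    """Quote identifiers and emit one '?' placeholder per column."""
    cols = ", ".join(f'"{c}"' for c in columns)
    marks = ", ".join("?" for _ in columns)
    return f'INSERT INTO "{table}" ({cols}) VALUES ({marks})'

sql = build_insert_sql("/1LT/DSS4B000042", ["KUNNR", "NAME1", "ORT01"])
print(sql)
# → INSERT INTO "/1LT/DSS4B000042" ("KUNNR", "NAME1", "ORT01") VALUES (?, ?, ?)

# With a real driver this would then be something like:
#   cursor.executemany(sql, rows)  # rows = list of tuples from the extract
```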

Show the structure in SE11:

 

Show the structure in HANA Studio:

If you select a structure, you can switch its processing status and verify the records:

Here you can see the records:

 

Activities:

Start Transfer: all steps, including mapping, are performed the same way as for the Excel import.

Delete Records: you can delete the imported records from the Migration Cockpit.

Restart Transfer: resets the migration status so that you can delete or update the table records. Use this if you see the following message:

 

Start Synchronization: synchronizes the structure with the Migration Object Modeler (in case new fields have been added, etc.); no data will be lost.

 

Link to SAP Help:

https://help.sap.com/viewer/29193bf0ebdd4583930b2176cb993268/1709%20001/en-US/d5feccf64c9a41b2b95c908268dfe27e.html

 

Specific Notes/Restrictions:

2587257 – S/4HANA 1709 FPS01 – Migration – Correction for Migration Cockpit – Staging Scenario

2608495 – SAP S/4HANA Migration Cockpit: Errors using staging functionality w/ pre-delivered data migration objects in on-premise release 1709 FPS1

The following known pre-delivered migration objects run into errors and are not released for use with the staging table transfer option:

  • SIF_LSTAR_CREATE – Activity type
  • SIF_PRCTR_MASTER – Profit center
  • SIF_KOSTL_MASTER – Cost center
  • SIF_WORK_CNTR – Work center
  • SIF_MATERIAL – Material
  • SIF_MATERIAL_EXTEND – Material – extend existing record by new org. levels
  • SIF_INVENTORYBAL – Material inventory balance

It is currently planned to resolve the issues in these migration objects with SAP S/4HANA 1709 FPS2.

 

 

In case you want to export data from an existing R/3 system via ABAP, it could look like this:

I used RFC function modules to dynamically derive the structures from the target S/4 system (where the Migration Cockpit runs), and ADBC SQL to insert the records, since the structures don’t exist in the source system’s data dictionary. Of course you could transport these structures to the source system’s data dictionary, but at the moment this would be a manual process.

REPORT z_export_cust_staging.

DATA: lv_filename TYPE string.
DATA: gt_data LIKE TABLE OF tab512.

* layout of DMC_STAG_STRUCID (maps migration object structures to staging tables)
DATA: BEGIN OF wa_strucid,
        cobj_ident(20),
        struct_ident(20),
        staging_tab(30),
        in_use(1),
      END OF wa_strucid.
DATA: gt_strucid LIKE TABLE OF wa_strucid.

TYPE-POOLS: abap.
FIELD-SYMBOLS: <dyn_table> TYPE STANDARD TABLE,
               <dyn_wa>,
               <dyn_field>.
DATA: dy_table TYPE REF TO data,
      dy_line  TYPE REF TO data,
      xfc      TYPE lvc_s_fcat,
      ifc      TYPE lvc_t_fcat.
FIELD-SYMBOLS: <dyn_db_table> TYPE STANDARD TABLE,
               <dyn_db_wa>,
               <dyn_db_field>.
DATA: dy_db_table TYPE REF TO data,
      dy_db_line  TYPE REF TO data.

FIELD-SYMBOLS: <source> TYPE any, <target> TYPE any.

PARAMETERS: p_ident(20) DEFAULT 'Z_CUSTOMER_007'.
PARAMETERS: p_dbcon  LIKE dbcon-con_name.
PARAMETERS: p_rfcdes LIKE rfcdes-rfcdest DEFAULT 'S4V_MASTER'.

* key selection
DATA: ls_kna1 TYPE kna1.
SELECT-OPTIONS: s_kunnr FOR ls_kna1-kunnr.

START-OF-SELECTION.

  PERFORM create_structures USING 'S_CUST_GEN' 'KNA1'.

* append structures?

  SELECT * FROM kna1 INTO TABLE <dyn_db_table>
    WHERE kunnr IN s_kunnr
      AND loevm = ' '.

  PERFORM transfer_selected_data USING 'KNA1'.
  PERFORM transfer USING 'S_CUST_GEN'.

FORM transfer_selected_data USING dbtabname.

  LOOP AT <dyn_db_table> ASSIGNING <dyn_db_wa>.
    CLEAR <dyn_wa>.
    MOVE-CORRESPONDING <dyn_db_wa> TO <dyn_wa>.

    PERFORM mapping_rules_after USING dbtabname <dyn_db_wa> <dyn_wa>.

    APPEND <dyn_wa> TO <dyn_table>.
  ENDLOOP.

ENDFORM.

FORM create_structures USING tabname dbtabname.
* DDIF_FIELDINFO_GET via RFC destination
  DATA: lt_dfies LIKE TABLE OF dfies.
  DATA: ls_dfies LIKE dfies.
  DATA: lv_tabname TYPE ddobjname.

  DATA: options LIKE TABLE OF rfc_db_opt.
  DATA: fields  LIKE TABLE OF rfc_db_fld.

  REFRESH ifc.

* read the staging table directory from the target S/4 system once
  IF gt_strucid IS INITIAL.
    CALL FUNCTION 'RFC_READ_TABLE'
      DESTINATION p_rfcdes
      EXPORTING
        query_table          = 'DMC_STAG_STRUCID'
      TABLES
        options              = options
        fields               = fields
        data                 = gt_data
      EXCEPTIONS
        table_not_available  = 1
        table_without_data   = 2
        option_not_valid     = 3
        field_not_valid      = 4
        not_authorized       = 5
        data_buffer_exceeded = 6
        OTHERS               = 7.
    IF sy-subrc <> 0.
*     implement suitable error handling here
    ENDIF.
    REFRESH gt_strucid.
    LOOP AT gt_data INTO wa_strucid.
      APPEND wa_strucid TO gt_strucid.
    ENDLOOP.
  ENDIF.

  READ TABLE gt_strucid INTO wa_strucid
       WITH KEY cobj_ident = p_ident struct_ident = tabname.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  lv_tabname = wa_strucid-staging_tab.

* get the field list of the generated staging table from the target system
  CALL FUNCTION 'DDIF_FIELDINFO_GET'
    DESTINATION p_rfcdes
    EXPORTING
      tabname        = lv_tabname
    TABLES
      dfies_tab      = lt_dfies
    EXCEPTIONS
      not_found      = 1
      internal_error = 2
      OTHERS         = 3.
  IF sy-subrc <> 0.
*   implement suitable error handling here
  ENDIF.

* build a field catalog for the dynamic staging table
  LOOP AT lt_dfies INTO ls_dfies.
    CLEAR xfc.
    MOVE-CORRESPONDING ls_dfies TO xfc.
    APPEND xfc TO ifc.
  ENDLOOP.

  PERFORM create_dynamic_itab.

* dynamic table typed like the local source table (e.g. KNA1)
  CREATE DATA dy_db_table TYPE TABLE OF (dbtabname).
  ASSIGN dy_db_table->* TO <dyn_db_table>.
  CREATE DATA dy_db_line LIKE LINE OF <dyn_db_table>.
  ASSIGN dy_db_line->* TO <dyn_db_wa>.

ENDFORM.

*&---------------------------------------------------------------------*
*&      Form  create_dynamic_itab
*&---------------------------------------------------------------------*
FORM create_dynamic_itab.
* create dynamic internal table and assign to field symbol
  CALL METHOD cl_alv_table_create=>create_dynamic_table
    EXPORTING
      it_fieldcatalog  = ifc
      i_length_in_byte = 'X'
    IMPORTING
      ep_table         = dy_table.
  ASSIGN dy_table->* TO <dyn_table>.
* create dynamic work area and assign to field symbol
  CREATE DATA dy_line LIKE LINE OF <dyn_table>.
  ASSIGN dy_line->* TO <dyn_wa>.
ENDFORM.                    "create_dynamic_itab

FORM transfer USING table.

  DATA: lv_dbtabname(50).

  READ TABLE gt_strucid INTO wa_strucid
       WITH KEY cobj_ident = p_ident struct_ident = table.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  lv_dbtabname = wa_strucid-staging_tab.

  DATA: sqlerr_ref TYPE REF TO cx_sql_exception.
  DATA: msgtext TYPE string.

  TRY.
      DATA:
        l_stmt         TYPE string,
        l_prepstmt_ref TYPE REF TO cl_sql_statement,
        l_dref         TYPE REF TO data.

      DATA:
        p_con_ref TYPE REF TO cl_sql_connection.

*     open the secondary connection maintained in DBCO
      p_con_ref = cl_sql_connection=>get_connection( p_dbcon ).

*     create the statement string: one '?' placeholder per staging field
      CONCATENATE 'insert into "' lv_dbtabname '" values (?' INTO l_stmt.
      LOOP AT ifc INTO xfc.
        IF sy-tabix > 1.
          CONCATENATE l_stmt ',?' INTO l_stmt.
        ENDIF.
      ENDLOOP.
      CONCATENATE l_stmt ')' INTO l_stmt.

*     create a prepared statement object
      l_prepstmt_ref = p_con_ref->prepare_statement( statement = l_stmt ).

*     bind input variables and insert all rows at once
      GET REFERENCE OF <dyn_table> INTO l_dref.
      l_prepstmt_ref->set_param_table( l_dref ).
      l_prepstmt_ref->execute_update( ).

    CATCH cx_sql_exception INTO sqlerr_ref.
      msgtext = sqlerr_ref->sql_message.
      WRITE: / msgtext.
  ENDTRY.

ENDFORM.
FORM mapping_rules_after USING dbtabname db_wa wa.

  CASE dbtabname.
    WHEN 'KNA1'.
*     map KNA1 fields to the staging structure fields
      PERFORM map_field USING 'NAMORG1'    'NAME1' wa db_wa.
      PERFORM map_field USING 'NAMORG2'    'NAME2' wa db_wa.
      PERFORM map_field USING 'STREET'     'STRAS' wa db_wa.
      PERFORM map_field USING 'POST_CODE1' 'PSTLZ' wa db_wa.
      PERFORM map_field USING 'CITY1'      'ORT01' wa db_wa.
      PERFORM map_field USING 'COUNTRY'    'LAND1' wa db_wa.
      PERFORM map_field USING 'REGION'     'REGIO' wa db_wa.
      PERFORM map_field USING 'STR_SUPPL1' 'NAME3' wa db_wa.
      PERFORM map_field USING 'PO_BOX'     'PFACH' wa db_wa.
      PERFORM map_field USING 'LANGU_CORR' 'SPRAS' wa db_wa.
      PERFORM map_field USING 'TELNR_LONG' 'TELF1' wa db_wa.
      PERFORM map_field USING 'FAXNR_LONG' 'TELFX' wa db_wa.
      PERFORM map_field USING 'SUPPLIER'   'LIFNR' wa db_wa.

*     constant value for the business partner grouping
      ASSIGN COMPONENT 'BU_GROUP' OF STRUCTURE wa TO <source>.
      IF sy-subrc = 0.
        <source> = '0001'.
      ENDIF.

  ENDCASE.

ENDFORM.

FORM map_field USING target_field source_field wa db_wa.
  FIELD-SYMBOLS: <fieldvalue>, <source>.
  ASSIGN COMPONENT target_field OF STRUCTURE wa TO <fieldvalue>.
  IF sy-subrc = 0.
    ASSIGN COMPONENT source_field OF STRUCTURE db_wa TO <source>.
    IF sy-subrc = 0.
      <fieldvalue> = <source>.
    ENDIF.
  ENDIF.
ENDFORM.

 


17 Comments


  1. Michelle Crapo

    I love blogs like this with the screens and the code. Could you do me a favor? I’m lazy and like to copy the code easily. Could you cut and paste the code into a code sample?

    Thank you!

    Michelle

     

  2. Frank Finkbohner

    Hi Jörg,

    The SAP Note 2608495 was updated today with some additional information on NUMC fields in data migration objects. If you fill values of NUMC fields directly on database level via HANA Studio (CSV upload), you have to take care that leading zeroes are added to those values in the file. The ABAP DB layer will not add them when reading the data from the database. The same applies to initial values: Date, Time, NUMC and Decimal fields must be filled with the initial value if they are empty and the table definition in HANA shows a DEFAULT value. If you load the data via CSV upload, you have to take care of this yourself, as you are acting directly on the database level and not via the ABAP DB layer.
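    To illustrate this normalization when preparing such a CSV file outside the ABAP layer, a small sketch (the field lengths and values here are made up):

```python
# Sketch: normalize values before a CSV upload to a staging table, so they
# match what the ABAP DB layer would have written. Lengths are hypothetical.

def numc(value, length: int) -> str:
    """NUMC fields need leading zeros up to the full field length."""
    return str(value).zfill(length)

def initial_date(value: str) -> str:
    """Empty DATS fields must carry the ABAP initial value '00000000'."""
    return value if value else "00000000"

print(numc(4711, 10))    # → 0000004711
print(initial_date(""))  # → 00000000
```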

    BR

    Frank

      1. Pete Molnar

         

        Actually we are moving to HANA, but we have an older version of Oracle that we want to migrate from. Will we have a problem connecting to an older version?

  3. Vipin Nagpal

    Hi,

    Is it possible to create the staging table in the primary database instead of a secondary database? I am asking this question from a learning perspective. I know that in real projects only a secondary database should be used.

    Thanks

    1. Sybille Lackermeier

       

      Hi Vipin,

      Theoretically it would be possible to use the primary database, but for security reasons this is not advisable. The SAP S/4HANA Migration Cockpit will only show database connections that were created via DBCO.

      Regards, sybille

  4. Former Member

     

    Hi Jörg Knaus

    I found your blog really helpful, thanks for it. Did you try to create a customer and a supplier with the same code using the cockpit? It keeps showing me an error, but I don’t know what I did wrong.

  5. Thomas Pillich

    Hi Jörg,

    I was wondering about the naming conventions for the HANA-Staging tables created automatically by Migration Cockpit.

     

    Usually we have a 3-tier target environment (DEV-QAS-PRD), and I understand that we cannot export and import the MC project from DEV to QAS and from QAS to PRD if staging tables are in place.

     

    So I am wondering whether the automatically generated HANA staging tables have the same name in all 3 different environments (generated from MC in DEV + QAS + PRD).

     

    What is the exact naming convention, let’s say for bank master? Is the <SYD> always included?

     

    So if I fill the HANA staging table for bank master from the source system (only one source), do I require 3 different DS jobs, because I first fill the staging table for migration into S/4HANA DEV, another DS job to fill the staging table for migration into S/4HANA QAS, and a third job to fill the staging table for migration into S/4HANA PRD?

     

    The DS job would be easier and re-usable if the HANA staging tables for the same objects had the same name for the DEV/QAS/PRD load.

    What would be the correct approach?

     

    Thanks for early reply!

     

    Thomas

     

    1. Sybille Lackermeier

       

      Hello Thomas

      As the staging table gets generated in each system, the name of the table will differ to ensure  uniqueness.

      The staging tables are generated in namespace /1LT/STAG. As a consequence you have to adjust the data extraction program accordingly for every test run.

      We are aware that this may present risks to the project team in case someone forgets to adjust the extraction program. Therefore we are going to investigate if there is a possibility to manually specify the name of the staging tables in customer namespace.

      best regards,

      sybille

      1. Thomas Pillich

        Hi Sybille,

         

        Did not find in SE11 any tables in namespace   /1LT/STAG. Only with /1LT/DSS.

        Examples:

        /1LT/DSS4B000000            Z_KOSTL_MASTER_00B:S_CSKS
        /1LT/DSS4B000001            Z_PRCTR_MASTER_00B:S_CEPC
        /1LT/DSS4B000002            Z_PRCTR_MASTER_00B:S_CEPC_BUKR
        /1LT/DSS4B000003            Z_BANK_MASTER_00B:S_BNKA
        /1LT/DSS4B000004            Z_LSTAR_CREATE_00B:S_CSLA
        /1LT/DSS4B000005            Z_AP_OPEN_ITEM_00B:S_BSIK
        /1LT/DSS4B000006            Z_BANK_MASTER_00H:S_BNKA
        /1LT/DSS4B000007            Z_KOSTL_MASTER_00H:S_CSKS
        /1LT/DSS4B000008            Z_PRCTR_MASTER_00H:S_CEPC
        /1LT/DSS4B000009            Z_PRCTR_MASTER_00H:S_CEPC_BUKR

         

        So neither the staging table name nor the description contains the <SYD> like DEV, QAS or PRD?

        It is just a sequential number in the table name, and we have no idea which environment the staging table refers to?

         

        Is there a way to see which staging tables belong to the Migration Cockpit in the DEV, QAS and PRD system?

         

        Sorry for bothering….

         

        Thomas

        1. Thomas Pillich

          Hi Sybille,

           

          I found the DMC tables with the staging table names. However, there is no SYD field / MANDT field – or at least I did not find it.

           

          So my question is:

           

          If we have the same Migration Cockpit project name, structure and description in DEV, QAS and PRD (export + import of the project from DEV to QAS and PRD): how can we distinguish the 3 different staging tables that we have to populate with SAP Data Services, Linked Server or SDI?

          I guess that the 3 staging tables for the same project and object in the 3 environments just have a different number (/1LT/DSSXX000010 in DEV , /1LT/DSSXX000030 in QAS , /1LT/DSSXX000060 in PRD) .

          I can implement a custom function in SAP DS to point to the right staging table, but which are the selection fields for the select on the DMC tables?

           

          Any hint or suggestion is highly appreciated.

           

          Thanks,

           

          Thomas

  6. Suman Ogeti

    Hi Jörg,

    Morning, how are you? I need your expertise and suggestions. I activated the MC staging model for Fixed Assets (incl. balances) and the staging tables were generated, but the cumulative values staging table does not allow decimal places for any depreciation amount fields; it rounds all our depreciation values when I load them into staging. The fields used for amounts in staging are generated with the DEC data type but with 0 decimal places.

    I checked the source structure (LTMOM) and also the downloaded file content; both allow decimal places.

    What would be the correct approach? Appreciate an early reply 🙂

