
Since version 4.5B, the client copy process does not include change documents, because the OBJECTID field of table CDHDR sometimes contains the client number (see note 180949 – CC-INFO: Change documents for a client copy).

Any client copy that uses a profile including business data will delete all entries from the change document tables in the target client.

I have often been asked to copy the change documents anyway, and after some checks it appears that most of the time the OBJECTID field does not contain the client number (around 80% of entries in my systems; your mileage may vary).

Even though the tool I am using is supported by SAP, copying change documents is not supported… but it won’t hurt.

The Oracle export/import tools (exp/imp) are no longer recommended as of 11g (note 1712612 – Oracle 11g: Usage of exp with 11.2 / 11g). Their successor, Data Pump, brings some very interesting features:

  • The remap_data option can be used to apply a function to a column of the table being imported or exported. I created a dummy function (REMAP_MANDANT.chg_cli) that uses this to modify the client field (MANDANT) of the change document tables.
  • The network_link option performs an import from a remote database over a database link. This allows data to be imported directly from a remote database without having to perform an export and move the dump files to the target system.
  • The query option, which already existed in exp/imp, filters the data you want to export or import.
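For clarity, the remap package that the scripts further down generate looks like this in plain SQL (a minimal sketch; the target client 500 is just an example, the scripts substitute the real one):

```sql
-- Data Pump calls this function for every MANDANT value it imports
-- and stores the returned value instead of the original one.
CREATE OR REPLACE PACKAGE remap_mandant AS
  FUNCTION chg_cli(ori_cli VARCHAR2) RETURN VARCHAR2;
END remap_mandant;
/
CREATE OR REPLACE PACKAGE BODY remap_mandant AS
  FUNCTION chg_cli(ori_cli VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    RETURN '500';  -- target client, hard-coded here as an example
  END;
END remap_mandant;
/
```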

Combining these options allows, in a single command, to:

  • import records from one specific client of a source system,
  • copy into the target system,
  • change the content of the MANDANT field.

impdp system/pwd@SID tables=sapsr3.cdcls directory=datapump_directory remap_data=sapsr3.cdcls.mandant:system.remap_mandant.chg_cli logfile=imp_cdcls.log parallel=4 network_link=src_link query="' where mandant = ''230'' '"
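As an alternative to OS-level quote escaping, Data Pump also accepts a parameter file (PARFILE), in which the query needs no extra escaping. A hypothetical equivalent of the command above:

```text
# imp_cdcls.par -- hypothetical parameter file, same options as the command above
tables=sapsr3.cdcls
directory=datapump_directory
remap_data=sapsr3.cdcls.mandant:system.remap_mandant.chg_cli
logfile=imp_cdcls.log
parallel=4
network_link=src_link
query="where mandant = '230'"
```

It would then be invoked as: impdp system/pwd@SID parfile=imp_cdcls.par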

The procedure below should be run from the target system into which you wish to import the change documents. It could/should be used right after a client copy, while user and batch activity is still suspended in both the source and the target client.

1) Export the original tables as a backup

2) Delete all entries from change document tables for the target client

If your target system has only one client, you can use the TRUNCATE command instead, which is faster.
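In plain SQL the two alternatives look like this (SAPSR3 schema and target client 500 assumed as examples; CDHDR shown, the same applies to CDCLS and CDPOS_UID):

```sql
-- Multi-client system: remove only the target client's entries
DELETE FROM sapsr3.cdhdr WHERE mandant = '500';
COMMIT;

-- Single-client system: TRUNCATE is much faster (no undo generated),
-- but it cannot filter on the client:
-- TRUNCATE TABLE sapsr3.cdhdr;
```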

3) Perform direct import over DB link

4) Cleanup created objects

If this is performed just after a client copy, steps 1 and 2 are not necessary, as the client copy process will already have deleted all the entries.

When the import is done, you will need to refresh the table buffers on all instances of the target SAP system (/$TAB).

Performance is not that high, mainly because some of the options used (network_link and query) force Data Pump to use the external tables method instead of the faster direct path method.

On a medium-sized system it took 1.5 hours to copy a 4 GB CDCLS table.

I am sure the Data Pump features demonstrated here can be helpful for many other use cases.

Steps to perform on a Windows system

1)

set "src_SID=SRC"

set "des_SID=DES"

set "des_cli=500"

set "src_cli=230"

set "src_opwd=2hard2guess"

set "des_opwd=try2guess"

set "src_host=src-host"

set "schema=SAPSR3"

set "ora_port=1527"

cd /d D:\Script\cli_cop & mkdir %des_cli%\chg_doc & cd %des_cli%\chg_doc

echo CREATE OR REPLACE DIRECTORY datapump_directory AS 'D:\Script\cli_cop\%des_cli%\chg_doc'; > pre.sql

echo GRANT READ, WRITE ON DIRECTORY datapump_directory TO system; >> pre.sql

echo Exit >> pre.sql

sqlplus "/as sysdba" @pre.sql

::Export tables to flat files with a parallelism degree of 4

for %T in (CDPOS_UID CDHDR CDCLS) do (

expdp system/%des_opwd% tables=%schema%.%T directory=datapump_directory dumpfile=%T^%U.dmp logfile=exp_%T.log parallel=4

)

2)

for %T in (CDPOS_UID CDHDR CDCLS) do (

echo delete from %schema%.%T where MANDANT = '%des_cli%'; >> trunk.sql

echo --truncate table %schema%.%T; >> trunk.sql

)

echo exit >> trunk.sql

sqlplus "/as sysdba" @trunk.sql

3)

:: sql script to create the directory for datapump

echo CREATE OR REPLACE DIRECTORY datapump_directory AS 'D:\Script\cli_cop\%des_cli%\chg_doc'; > pre.sql

echo GRANT READ, WRITE ON DIRECTORY datapump_directory TO system; >> pre.sql

echo Connect system/%des_opwd% >> pre.sql

:: create the DB link

echo CREATE DATABASE LINK src_link CONNECT TO system IDENTIFIED BY %src_opwd% >> pre.sql

echo USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = %src_host%)(PORT = %ora_port%))) (CONNECT_DATA = (SID = %src_SID%)))'; >> pre.sql

:: create the remap package

echo CREATE OR REPLACE PACKAGE REMAP_MANDANT AS>> pre.sql

echo FUNCTION chg_cli(ori_cli VARCHAR2) return VARCHAR2; >> pre.sql

echo END REMAP_MANDANT; >> pre.sql

echo />> pre.sql

:: create the remap function

echo CREATE OR REPLACE PACKAGE BODY REMAP_MANDANT AS >> pre.sql

echo FUNCTION chg_cli(ori_cli VARCHAR2) return VARCHAR2 IS>> pre.sql

echo new_cli VARCHAR2(9); >> pre.sql

echo BEGIN>> pre.sql

echo new_cli := '%des_cli%'; >> pre.sql

echo RETURN new_cli; >> pre.sql

echo END; >> pre.sql

echo END REMAP_MANDANT; >> pre.sql

echo / >> pre.sql

echo Exit >> pre.sql

sqlplus "/as sysdba" @pre.sql

:: Loop on the list of tables to refresh

for %T in (CDPOS_UID CDHDR CDCLS) do (

impdp system/%des_opwd%@%des_SID% tables=%schema%.%T directory=datapump_directory remap_data=%schema%.%T.MANDANT:SYSTEM.REMAP_MANDANT.chg_cli logfile=imp_%T.log parallel=4 table_exists_action=append content=data_only network_link=src_link query="' where MANDANT = ''%src_cli%'' '"

)

4)

echo revoke READ, WRITE ON DIRECTORY datapump_directory from system; > post.sql

echo Drop DIRECTORY datapump_directory; >> post.sql

echo DROP PACKAGE BODY SYSTEM.REMAP_MANDANT; >> post.sql

echo DROP PACKAGE SYSTEM.REMAP_MANDANT; >> post.sql

echo Connect system/%des_opwd%>> post.sql

echo Drop DATABASE LINK src_link; >> post.sql

echo Exit >> post.sql

sqlplus "/as sysdba" @post.sql

Steps to perform on a Unix system (C shell)

1)

setenv src_SID PRD

setenv des_SID QAS

setenv des_cli 200

setenv src_cli 300

setenv src_opwd asthough

setenv des_opwd thoughone

setenv src_host host-prd

setenv schema SAPSR3

setenv ora_port 1521

mkdir -p /tmp/cli_cop/${des_cli} && chmod 775 /tmp/cli_cop/${des_cli} && cd /tmp/cli_cop/${des_cli}

echo "CREATE OR REPLACE DIRECTORY datapump_directory AS '/tmp/cli_cop/${des_cli}';" > pre.sql

echo "GRANT READ, WRITE ON DIRECTORY datapump_directory TO system;" >> pre.sql

echo Exit >> pre.sql

sqlplus "/as sysdba" @pre.sql

foreach table (CDPOS_UID CDHDR CDCLS)

  expdp system/${des_opwd} tables=${schema}.${table} directory=datapump_directory \
  dumpfile=${table}%U.dmp logfile=exp_${table}.log \
  query="' where MANDANT = ''${des_cli}'' '" PARALLEL=4

end

2)

foreach table (CDPOS_UID CDHDR CDCLS)

  echo "delete from ${schema}.${table} where MANDANT = '${des_cli}';" >> trunk.sql

  echo "--truncate table ${schema}.${table};" >> trunk.sql

end

echo exit >> trunk.sql

sqlplus "/as sysdba" @trunk.sql

3)

echo "CREATE OR REPLACE DIRECTORY datapump_directory AS '/tmp/cli_cop/${des_cli}';" > pre.sql

echo "GRANT READ, WRITE ON DIRECTORY datapump_directory TO system;" >> pre.sql

echo "Connect system/${des_opwd}" >> pre.sql

echo "CREATE DATABASE LINK src_link CONNECT TO system IDENTIFIED BY ${src_opwd}" >> pre.sql

echo "USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = ${src_host})(PORT = ${ora_port}))) (CONNECT_DATA = (SID = ${src_SID})))';" >> pre.sql

echo "CREATE OR REPLACE PACKAGE REMAP_MANDANT AS" >> pre.sql

echo "FUNCTION chg_cli(ori_cli VARCHAR2) return VARCHAR2;" >> pre.sql

echo "END REMAP_MANDANT;" >> pre.sql

echo "/" >> pre.sql

echo "CREATE OR REPLACE PACKAGE BODY REMAP_MANDANT AS" >> pre.sql

echo "FUNCTION chg_cli(ori_cli VARCHAR2) return VARCHAR2 IS" >> pre.sql

echo "new_cli VARCHAR2(9);" >> pre.sql

echo "BEGIN" >> pre.sql

echo "new_cli := '${des_cli}';" >> pre.sql

echo "RETURN new_cli;" >> pre.sql

echo "END;" >> pre.sql

echo "END REMAP_MANDANT;" >> pre.sql

echo "/" >> pre.sql

echo "Exit" >> pre.sql

sqlplus "/as sysdba" @pre.sql

foreach table (CDPOS_UID CDHDR CDCLS)

  impdp system/${des_opwd}@${des_SID} tables=${schema}.${table} \
  directory=datapump_directory \
  REMAP_DATA=${schema}.${table}.MANDANT:SYSTEM.REMAP_MANDANT.chg_cli \
  logfile=imp_${table}.log table_exists_action=append content=data_only \
  network_link=src_link query="' where MANDANT = ''${src_cli}'' '" PARALLEL=4

end

4)

echo "revoke READ, WRITE ON DIRECTORY datapump_directory from system;" > post.sql

echo "Drop DIRECTORY datapump_directory;" >> post.sql

echo "DROP PACKAGE BODY SYSTEM.REMAP_MANDANT;" >> post.sql

echo "DROP PACKAGE SYSTEM.REMAP_MANDANT;" >> post.sql

echo "Connect system/${des_opwd}" >> post.sql

echo "Drop DATABASE LINK src_link;" >> post.sql

echo "Exit" >> post.sql

sqlplus "/as sysdba" @post.sql


3 Comments


  1. Glennard Simpelo

    Hi Yves,

    This is very good stuff.

    Apart from the change document tables, have you tried it for other tables like COEP and EDIDC? I think the remap option of impdp can also do its magic for the MANDT field by following the same approach.

    Also, after doing an export/import of tables excluded from the client copy, should running BDLS for these tables be considered as well?

  2. Yves KERVADEC Post author

    Hello Glennard,
    Yes, that method can also be used to copy some heavy tables to speed up a client copy. I have used it for that purpose: I excluded the biggest tables from the client copy and processed them manually with Data Pump. Data Pump's parallel processing allows the tables to be copied very fast compared to the SAP client copy process.
    You still need to perform the logical system conversion (BDLS) afterwards. It would be possible to code the logical system conversion with another REMAP_DATA function, but as the logical system field has different names in different tables, you would end up creating a specific import command and remap script for each table.
    Best regards
