Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
markmumy
Advisor

More and more this year, I am fielding questions about whether SAP HANA Cloud, data lake or SAP IQ supports schema-level backups.  While SAP HANA and SAP HANA Cloud, SAP HANA database support this feature via the EXPORT statement, SAP HANA Cloud, data lake and SAP IQ have no equivalent syntax built in.

Now, for the good news: while there is no built-in feature to back up or export a schema, we can use a feature that has been in SAP HANA Cloud, data lake since March 2021 and is part of SAP IQ 16.1 SP05 (2H 2021): BACKUP TABLE.

Now, BACKUP TABLE won't solve the problem on its own, as it is designed to back up a single object, a table, to disk.  However, we can write a wrapper procedure that calls it for each object we wish to back up.
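For reference, a single-table backup looks like the statement below (a sketch; the owner, table name, path, and key are placeholders, and on SAP IQ the KEY clause is optional):

```sql
-- Back up one IQ table: to a file system path on SAP IQ, or to a
-- cloud path on SAP HANA Cloud, data lake. All names are placeholders.
backup table my_owner.my_table
    to '/tmp/backups/my_owner.my_table.backup'
    key 'encryption key here';
```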

 

Consider this stored procedure, which uses the BACKUP TABLE feature to implement a schema backup.  A few notes, though:

    • This procedure is just sample code showing what is possible.

    • This procedure will only copy the IQ-based tables.  It does not copy views, materialized views, or catalog tables.

    • This procedure does not extract the table schema; that must be done via other tools.



drop procedure if exists sp_iqschemabackup;

create procedure sp_iqschemabackup(
          in schema_owner varchar(128)
        , in output_dir varchar(1000)
        , in encryption_key varchar(128) default NULL
        , in cloud_credentials varchar(1000) default NULL
)
begin
        declare tbl_backup varchar(20000);
        declare enc_key varchar(150);

        set enc_key = '';
        if encryption_key is not null
        then
                set enc_key = 'key '''||encryption_key||'''';
        end if;

        for FOR_LOOP as FOR_CURSOR cursor for
                select suser_name( creator ) as towner, table_name as tname
                from systable
                where lower( suser_name( creator ) ) = lower( schema_owner )
                and table_type = 'BASE'
                and server_type = 'IQ'
        do
                set tbl_backup = 'backup table '||towner||'.'||tname
                        -- output location
                        ||' to ''' ||output_dir||'/'||towner||'.'||tname||'/'
                        -- output files
                        ||towner||'.'||tname||'.backup'' '
                        -- encryption key
                        ||enc_key||' '
                        -- cloud credentials; coalesce guards against a NULL
                        -- value nulling out the entire concatenated command
                        ||coalesce( cloud_credentials, '' )||';' ;
                message tbl_backup to client;
                execute immediate tbl_backup;
        end for;

        return;
end;


 

This procedure takes four parameters.  It will run in either SAP IQ 16.1 SP05 (and later) or SAP HANA Cloud, data lake.

    • schema_owner -- The owner of the objects you wish to back up.  The name is case insensitive.

 

    • output_dir -- The output directory for the objects.  Within that directory, the procedure creates a subdirectory per table, so that a directory listing is easier to read.

        • The subdirectory will be named table_owner.table_name.


 

    • encryption_key -- If you wish to encrypt the backup, specify the key here.

        • For SAP HANA Cloud, data lake you MUST specify an encryption key.

        • For SAP IQ, an encryption key is optional.


 

    • cloud_credentials -- The cloud credentials, if needed, to back up the tables to cloud storage.

        • For SAP HANA Cloud, data lake, this option is not needed if you back up to SAP HANA Cloud, data lake files.  For Azure, S3, AliCloud, and GCP, follow the SAP HANA Cloud, data lake manuals for the proper formatting of the connection string.

        • For SAP IQ, this option is not used, as BACKUP TABLE cannot write to cloud storage.




As mentioned, this procedure has been parameterized so that it runs in SAP HANA Cloud, data lake as well as on premise in SAP IQ.

 

This procedure will not get around any security restrictions.  For instance, with SAP HANA Cloud, data lake, objects created by SAP HANA and the SYSRDL#CG user are not accessible to other users unless permissions are granted.

Below are some examples of using this procedure.  All of them require this code to be run first; it creates a user and five tables with just one row of data in each.

grant connect to bkup_tbl_test;



drop table if exists bkup_tbl_test.t1;

drop table if exists bkup_tbl_test.t2;

drop table if exists bkup_tbl_test.t3;

drop table if exists bkup_tbl_test.t4;

drop table if exists bkup_tbl_test.t5;



create table bkup_tbl_test.t1 ( a1 int, a2 int );

insert into bkup_tbl_test.t1 values ( 1,1 );

select * into bkup_tbl_test.t2 from bkup_tbl_test.t1;

select * into bkup_tbl_test.t3 from bkup_tbl_test.t1;

select * into bkup_tbl_test.t4 from bkup_tbl_test.t1;

select * into bkup_tbl_test.t5 from bkup_tbl_test.t1;


 

Here is a sample where I run this procedure on SAP HANA Cloud, data lake for a schema that I created and back it up to the built-in cloud file store, SAP HANA Cloud, data lake files:

call sp_iqschemabackup ( 'bkup_tbl_test', 'hdlfs:///tmp/backups', 'encryption key here' );


 

This gives the following output.  When the procedure runs, it prints each command before executing it.

backup table bkup_tbl_test.t1 to 'hdlfs:///tmp/backups/bkup_tbl_test.t1/bkup_tbl_test.t1.backup' key 'encryption key here' ;

backup table bkup_tbl_test.t2 to 'hdlfs:///tmp/backups/bkup_tbl_test.t2/bkup_tbl_test.t2.backup' key 'encryption key here' ;

backup table bkup_tbl_test.t3 to 'hdlfs:///tmp/backups/bkup_tbl_test.t3/bkup_tbl_test.t3.backup' key 'encryption key here' ;

backup table bkup_tbl_test.t4 to 'hdlfs:///tmp/backups/bkup_tbl_test.t4/bkup_tbl_test.t4.backup' key 'encryption key here' ;

backup table bkup_tbl_test.t5 to 'hdlfs:///tmp/backups/bkup_tbl_test.t5/bkup_tbl_test.t5.backup' key 'encryption key here' ;


 

Here is a sample where I run this procedure on SAP HANA Cloud, data lake for a schema that I created and back it up to an Azure ADLS Gen2 blob store:

call sp_iqschemabackup ( 'bkup_tbl_test'

    , 'bb://dummydata/backup_test'

    , 'encryption key here'

    , 'CONNECTION_STRING ''DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=******;EndpointSuffix=core.windows.net'' '

);


 

This gives the following output.  When the procedure runs, it prints each command before executing it.

backup table bkup_tbl_test.t1 to 'bb://dummydata/backup_test/bkup_tbl_test.t1/bkup_tbl_test.t1.backup' key 'encryption key here' CONNECTION_STRING 'DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=*****;EndpointSuffix=core.windows.net' ;

backup table bkup_tbl_test.t2 to 'bb://dummydata/backup_test/bkup_tbl_test.t2/bkup_tbl_test.t2.backup' key 'encryption key here' CONNECTION_STRING 'DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=*****;EndpointSuffix=core.windows.net' ;

backup table bkup_tbl_test.t3 to 'bb://dummydata/backup_test/bkup_tbl_test.t3/bkup_tbl_test.t3.backup' key 'encryption key here' CONNECTION_STRING 'DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=*****;EndpointSuffix=core.windows.net' ;

backup table bkup_tbl_test.t4 to 'bb://dummydata/backup_test/bkup_tbl_test.t4/bkup_tbl_test.t4.backup' key 'encryption key here' CONNECTION_STRING 'DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=*****;EndpointSuffix=core.windows.net' ;

backup table bkup_tbl_test.t5 to 'bb://dummydata/backup_test/bkup_tbl_test.t5/bkup_tbl_test.t5.backup' key 'encryption key here' CONNECTION_STRING 'DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=*****;EndpointSuffix=core.windows.net' ;


 

When this same procedure is used on SAP IQ, the encryption key is optional and the cloud store connection string is not used:

call sp_iqschemabackup ( 'bkup_tbl_test', '/tmp/backups' );


 

This gives the following output.  When the procedure runs, it prints each command before executing it.

backup table bkup_tbl_test.t1 to '/tmp/backups/bkup_tbl_test.t1/bkup_tbl_test.t1.backup';

backup table bkup_tbl_test.t2 to '/tmp/backups/bkup_tbl_test.t2/bkup_tbl_test.t2.backup';

backup table bkup_tbl_test.t3 to '/tmp/backups/bkup_tbl_test.t3/bkup_tbl_test.t3.backup';

backup table bkup_tbl_test.t4 to '/tmp/backups/bkup_tbl_test.t4/bkup_tbl_test.t4.backup';

backup table bkup_tbl_test.t5 to '/tmp/backups/bkup_tbl_test.t5/bkup_tbl_test.t5.backup';


 

I hope you enjoy being able to implement your own schema backup in SAP HANA Cloud, data lake as well as in SAP IQ 16.1 SP05 and later.
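A backup is only useful if it can be restored.  The complementary RESTORE TABLE statement arrived in the same releases as BACKUP TABLE; a sketch of restoring one of the tables from the examples above might look like this (the path and key are placeholders, and, consistent with the DDL caveat earlier, the table definition must be recreated separately before restoring):

```sql
-- Restore a single table from a backup created above. The KEY value
-- must match the one used at backup time; on SAP IQ an unencrypted
-- backup is restored without the KEY clause.
restore table bkup_tbl_test.t1
    from 'hdlfs:///tmp/backups/bkup_tbl_test.t1/bkup_tbl_test.t1.backup'
    key 'encryption key here';
```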

 

Do you want to take this to the next level?  Want to convert it to an event so that it runs automatically?  Consider this code, which simply wraps the above procedure in an event.  It runs the schema backup procedure every 60 minutes, at the top of the hour.  As written above, the procedure will overwrite the previous backup each time it runs.

drop EVENT if exists schema_backup;



CREATE EVENT schema_backup

SCHEDULE START TIME '00:00 AM' EVERY 60 MINUTES

HANDLER

BEGIN

        call sp_iqschemabackup ( 'bkup_tbl_test'

            , 'hdlfs:///tmp/backups'

            , 'encryption key here' );

end;
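If you would rather keep each run's backup instead of overwriting it, one option is to build a timestamped output directory inside the event handler.  A sketch, where the base path and the dateformat pattern are my own choices:

```sql
CREATE EVENT schema_backup_versioned
SCHEDULE START TIME '00:00 AM' EVERY 60 MINUTES
HANDLER
BEGIN
        declare out_dir varchar(1000);

        -- Append a timestamp so each run writes to its own directory
        -- instead of overwriting the previous backup.
        set out_dir = 'hdlfs:///tmp/backups/'
                || dateformat( now(), 'YYYYMMDD_HHNN' );

        call sp_iqschemabackup ( 'bkup_tbl_test'
            , out_dir
            , 'encryption key here' );
end;
```

Note that old backup directories will accumulate under this scheme, so some cleanup of expired backups would be needed alongside it.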


 

 

 
