SAP Sybase IQ: Easily Extracting and Importing Data With Your IQ Data Warehouse

SAP/Sybase's documentation isn't very clear for new IQ DBAs and developers. One such item is simply extracting data from the warehouse into files that can be loaded into another IQ system, for when load table from location simply isn't available or isn't practical.



Assumptions:

  1. We are extracting table data owned by 'JF'. Replace with the schemas you desire.
  2. Exported data will go to the /dba/backup/sybbackup/bkp/export/data_export directory.
  3. We will use isql, with the connection parameters stored in the $ISQL environment variable (see the sketch after this list).
  4. The maximum size of each extract file is 1 TB (Temp_Extract_Size1 = 1073741824 KB). Increase as needed.
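
For reference, $ISQL might be defined along these lines. This is only a sketch: the user, password handling, and server name are placeholders, not values from this post.

# Hypothetical connection settings; adjust the login and server for your
# site, and avoid hard-coding real passwords in production.
export ISQL="isql -Udba -Psecret -SMYIQ_SERVER -w 2000"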

The simplest method is to create one shell script per table to export:

#!/usr/bin/env bash

if [[ ! -d data_export ]]; then
    mkdir data_export
fi

# Build the list of owner.table names for the schemas we want to export
TABLES=$( echo "set nocount on
go
select convert(varchar(80), user_name(creator) + '.' + table_name) from systable where user_name(creator) in ('JF') and table_type = 'BASE'
go
exit" | $ISQL -h -b | sed -e '/affected/d' -e '/--/d' -e '/^$/d' )

for table in ${TABLES}; do
    echo
    echo "Generating /dba/backup/sybbackup/bkp/export/data_export/${table}_export.sql"
    sql_file=/dba/backup/sybbackup/bkp/export/data_export/${table}_export.sql

    echo "set temporary option Temp_Extract_Name1 = '/dba/backup/sybbackup/bkp/export/data_export/${table}.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option Temp_Extract_Size1 = 1073741824
go

select * from ${table}
go

set temporary option Temp_Extract_Name1 = ''
go" > $sql_file
done
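
For a hypothetical table JF.customers, the generated JF.customers_export.sql would look like this:

set temporary option Temp_Extract_Name1 = '/dba/backup/sybbackup/bkp/export/data_export/JF.customers.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option Temp_Extract_Size1 = 1073741824
go

select * from JF.customers
go

set temporary option Temp_Extract_Name1 = ''
go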

Now that we have the script files, let's extract the data by running them through isql, compressing each exported file with gzip. You can cancel the export at any time with Ctrl-C and restart it; tables whose .gz file already exists are skipped:

for TABLE in *.sql; do
    datafile=$( echo $TABLE | sed -e 's/_export.sql$/.data/' )
    echo $datafile
    gzipfile=${datafile}.gz

    if [ -f $gzipfile ]; then
        echo "$gzipfile already exists"
    else
        # Capture isql output in a log file, then compress the extract
        $ISQL -i $TABLE > $TABLE.out 2>&1
        gzip -1 $datafile
    fi
done
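
Before copying the files anywhere, it is worth scanning the isql logs for failures. A quick check (assuming any isql error text landed in the .out files):

grep -il 'error' *.sql.out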

Now that we have the data exported, imagine that you copied the files to another system. How do you import that data, assuming the tables have already been created there? Easy: we create a set of import script files.

LOCAL_PATH=/dba/backup/sybbackup/bkp/import/data_import

for TABLE_gzip in *.gz; do
    datafile=$( echo $TABLE_gzip | sed -e 's/.gz$//' )
    TABLE_FILE=$( echo $TABLE_gzip | sed -e 's/.data.gz$//' )
    TABLE_OWNER=$( echo $TABLE_FILE | cut -d . -f1 )
    TABLE=$( echo $TABLE_FILE | cut -d . -f2 | sed -e 's/_export$//' )
    import_script=${TABLE_OWNER}.${TABLE}_import.sh

    if [ -f ${datafile}.done ]; then
        echo "${datafile} already processed"
    else
        # ===================================================================
        # Generate the load command to load the file
        # ===================================================================
        echo "#!/usr/bin/env bash" > $import_script
        echo ". /dba/code/syb/.setenv" >> $import_script
        echo "" >> $import_script
        echo "echo \"uncompressing $TABLE_gzip\"" >> $import_script
        echo "gzip -dc $TABLE_gzip > $datafile" >> $import_script
        echo "" >> $import_script
        echo "echo \"importing ${TABLE_OWNER}.${TABLE}\"" >> $import_script
        echo '$ISQL -b <<EOF' >> $import_script
        echo "select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'" >> $import_script
        echo "go" >> $import_script
        echo "" >> $import_script
        echo "truncate table ${TABLE_OWNER}.${TABLE}" >> $import_script
        echo "go" >> $import_script
        echo "commit" >> $import_script
        echo "go" >> $import_script
        echo "SET TEMPORARY OPTION IDENTITY_INSERT = '${TABLE_OWNER}.${TABLE}'" >> $import_script
        echo "go" >> $import_script
        echo "load table ${TABLE_OWNER}.${TABLE}" >> $import_script
        echo "(" >> $import_script
        # The column list comes from a helper script; strip isql noise from it
        ../gen_iq_col_list_w_null_byte.sh ip00 $TABLE_OWNER $TABLE | sed -e '/row affected/d;s/ *$//;/^$/d' >> $import_script
        echo ")" >> $import_script
        echo "from '${LOCAL_PATH}/${datafile}'" >> $import_script
        echo "escapes off format binary" >> $import_script
        echo "go" >> $import_script
        echo "commit" >> $import_script
        echo "go" >> $import_script
        echo "" >> $import_script
        echo "select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'" >> $import_script
        echo "go" >> $import_script
        echo "SET TEMPORARY OPTION IDENTITY_INSERT = ''" >> $import_script
        echo "go" >> $import_script
        echo "EOF" >> $import_script
        echo "" >> $import_script
        echo "rm -f $datafile" >> $import_script
        chmod u+x $import_script
    fi
done
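
Putting it together for the hypothetical JF.customers table, the generated JF.customers_import.sh would look roughly like this. The two-column load list is made up for illustration; the real list comes from gen_iq_col_list_w_null_byte.sh:

#!/usr/bin/env bash
. /dba/code/syb/.setenv

echo "uncompressing JF.customers.data.gz"
gzip -dc JF.customers.data.gz > JF.customers.data

echo "importing JF.customers"
$ISQL -b <<EOF
select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customers'
go

truncate table JF.customers
go
commit
go
SET TEMPORARY OPTION IDENTITY_INSERT = 'JF.customers'
go
load table JF.customers
(
customer_id binary with null byte,
customer_name binary with null byte
)
from '/dba/backup/sybbackup/bkp/import/data_import/JF.customers.data'
escapes off format binary
go
commit
go

select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customers'
go
SET TEMPORARY OPTION IDENTITY_INSERT = ''
go
EOF

rm -f JF.customers.data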

If the target system has a different endianness than the source (for example, extracting on little-endian Linux x86-64 and loading on big-endian AIX), replace

echo "escapes off format binary" >> $import_script

with

echo "escapes off format binary byte order high" >> $import_script

We simply need to run each import script file:

for import_file in *_import.sh ; do ./$import_file 2>&1 | tee ${import_file}.out ; done
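
The import generator above skips any data file that has a .done marker next to it, but nothing shown here creates those markers. A variant of the run loop can touch the marker after each run (a sketch; it assumes each import script exits nonzero on failure, which you may need to enforce, e.g. with set -e in the generated scripts):

for import_file in *_import.sh; do
    ./$import_file 2>&1 | tee ${import_file}.out
    # PIPESTATUS[0] holds the exit status of the import script, not of tee
    if [ ${PIPESTATUS[0]} -eq 0 ]; then
        # e.g. JF.customers_import.sh -> JF.customers.data.done
        touch $( echo $import_file | sed -e 's/_import.sh$/.data.done/' )
    fi
done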




Comments


  1. Fred Verheul

    Hi Jason,

    You may not know it (not many people do, it seems), but this kind of 'short announcements with link' content really shouldn't be published using the blog format. The proper way of publishing content like this is by opening a discussion thread and unmarking it as a question. A more complete explanation can be found here.

    Thank you in advance for adapting to that format and using SCN in the way it is intended.

    Best regards, Fred

          1. Rey Wang

            This unload/extract works fine with IQ 15.4.

            In IQ 16, varchar columns (or the N-bit store) do not have trailing space padding.

            Try it with this table, test_v1:

            Column      Type        Nullable Primary Key
            --------------------------------------------
            schema      varchar(30) 0        0
            object_name varchar(50) 0        0
            object_type varchar(10) 0        0
            env         varchar(10) 0        0

            The data can be loaded back, but it contains lots of gibberish.

  2. Christine Zhong

    Jason,

    Are you interested in writing a wiki page based on this post? I would like to create a central location for an IQ administration script collection in the SAP IQ wiki space.

    Christine

