SAP Sybase IQ: Easily Extracting and Importing Data With Your IQ Data Warehouse

SAP/Sybase's IQ documentation isn't very clear for new IQ DBAs and developers. One such item is simply extracting data from the warehouse into files that can be loaded into another IQ system when LOAD TABLE ... FROM LOCATION isn't available or isn't practical.



Assumptions:

  1. We extract table data owned by 'JF'. Replace with the schema owner(s) you need.
  2. Exported data will go to the /dba/backup/sybbackup/bkp/export/data_export directory.
  3. We will use isql, with the connection parameters stored in the $ISQL environment variable (see the sketch after this list).
  4. The maximum size of each extract file (TEMP_EXTRACT_SIZE1) is 1 TB. Increase as needed.
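
Since every step below pipes commands through $ISQL, it helps to see what that variable might hold. A minimal sketch, assuming the Open Client isql binary and placeholder server, login, and password values:

# Hypothetical connection settings; substitute your own server, login, and password
export ISQL="isql -SMYIQ_SERVER -Udba -PmySecret -w 2000"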

The simplest method is to generate one SQL extract script per table:

#!/usr/bin/env bash

# Create the output directory if it doesn't already exist
if [[ ! -d data_export ]]; then
    mkdir data_export
fi

# Build the list of owner.table names to export
TABLES=$( echo "set nocount on
go
select convert(varchar(80), user_name(creator) + '.' + table_name) from systable where user_name(creator) in ('JF') and table_type = 'BASE'
go
exit" | $ISQL -h -b | sed -e '/affected/d' -e '/---/d' -e '/^$/d' )

# Generate one extract script per table
for table in ${TABLES}; do
    echo
    echo "Generating /dba/backup/sybbackup/bkp/export/data_export/${table}_export.sql"
    sql_file=/dba/backup/sybbackup/bkp/export/data_export/${table}_export.sql

    echo "set temporary option Temp_Extract_Name1 = '/dba/backup/sybbackup/bkp/export/data_export/${table}.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option TEMP_EXTRACT_SIZE1 = 1073741824
go

select * from ${table}
go

set temporary option Temp_Extract_Name1 = ''
go" > $sql_file
done
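
For a hypothetical table JF.customers, the generated JF.customers_export.sql would look like this (the Temp_Extract options redirect the SELECT output into the named .data file in binary format instead of returning rows to the client):

set temporary option Temp_Extract_Name1 = '/dba/backup/sybbackup/bkp/export/data_export/JF.customers.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option TEMP_EXTRACT_SIZE1 = 1073741824
go

select * from JF.customers
go

set temporary option Temp_Extract_Name1 = ''
go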

Now that we have the SQL script files, let's extract the data by running each script through isql and compressing each exported file with gzip. You can cancel the export at any time with Ctrl-C and restart it afterwards; tables that already have a compressed data file are skipped:

for TABLE in *.sql; do
    datafile=$( echo $TABLE | sed -e 's/_export.sql$/.data/' )
    echo $datafile
    gzipfile=${datafile}.gz

    if [ -f $gzipfile ]; then
        echo "$gzipfile already exists"
    else
        # Run the extract script, log the isql output, then compress the data file
        $ISQL -i $TABLE > $TABLE.out 2>&1
        gzip -1 $datafile
    fi
done

Now that the data is exported, imagine that you have copied the files to another system. How do you import that data, assuming the tables have already been created there? Easy: we generate a set of import scripts, one per compressed data file.

LOCAL_PATH=/dba/backup/sybbackup/bkp/import/data_import

for TABLE_gzip in *.gz; do
    datafile=$( echo $TABLE_gzip | sed -e 's/.gz$//' )
    TABLE_FILE=$( echo $TABLE_gzip | sed -e 's/.data.gz$//' )
    TABLE_OWNER=$( echo $TABLE_FILE | cut -d . -f1 )
    TABLE=$( echo $TABLE_FILE | cut -d . -f2 | sed -e 's/_export$//' )

    if [ -f ${datafile}.done ]; then
        echo "${datafile} already processed"
    else
        # ===================================================================
        # Generate the load command to load the file
        # ===================================================================
        echo "#!/usr/bin/env bash" > ${TABLE_OWNER}.${TABLE}_import.sh
        echo ". /dba/code/syb/.setenv" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "echo \"uncompressing $TABLE_gzip\"" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "gzip -dc $TABLE_gzip > $datafile" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "echo \"importing ${TABLE_OWNER}.${TABLE}\"" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo '$ISQL -b <<EOF' >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo " " >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "truncate table ${TABLE_OWNER}.${TABLE}" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "commit" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "SET TEMPORARY OPTION IDENTITY_INSERT = '${TABLE_OWNER}.${TABLE}'" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "load table ${TABLE_OWNER}.${TABLE}" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "( " >> ${TABLE_OWNER}.${TABLE}_import.sh
        # Column list for the binary load, produced by a helper script (not shown)
        ../gen_iq_col_list_w_null_byte.sh ip00 $TABLE_OWNER $TABLE | sed -e '/row affected/d;s/ *$//;/^$/d' >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo ")" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "from '${LOCAL_PATH}/${datafile}'" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "escapes off format binary" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "commit" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo " " >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "SET TEMPORARY OPTION IDENTITY_INSERT = ''" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "go" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo " " >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "EOF" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo "rm -f $datafile" >> ${TABLE_OWNER}.${TABLE}_import.sh
        echo " " >> ${TABLE_OWNER}.${TABLE}_import.sh
        chmod u+x ${TABLE_OWNER}.${TABLE}_import.sh
    fi
done
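
To make the output concrete, here is roughly what a generated JF.customers_import.sh would contain. JF.customers is a hypothetical table, and the two-column load list is only a stand-in for whatever gen_iq_col_list_w_null_byte.sh emits for the real table (one "column binary with null byte" entry per column, matching the binary extract):

#!/usr/bin/env bash
. /dba/code/syb/.setenv

echo "uncompressing JF.customers.data.gz"
gzip -dc JF.customers.data.gz > JF.customers.data

echo "importing JF.customers"
$ISQL -b <<EOF
select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customers'
go

truncate table JF.customers
go
commit
go
SET TEMPORARY OPTION IDENTITY_INSERT = 'JF.customers'
go
load table JF.customers
(
customer_id binary with null byte,
customer_name binary with null byte
)
from '/dba/backup/sybbackup/bkp/import/data_import/JF.customers.data'
escapes off format binary
go
commit
go

select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customers'
go
SET TEMPORARY OPTION IDENTITY_INSERT = ''
go

EOF

rm -f JF.customers.data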

If the target system has a different endianness (e.g., extracting on Linux x86-64 and loading on AIX), replace

echo "escapes off format binary">> ${TABLE_OWNER}.${TABLE}_import.sh

with

echo "escapes off format binary byte order high">> ${TABLE_OWNER}.${TABLE}_import.sh

We simply need to run each import script file:

for import_file in *_import.sh ; do ./$import_file 2>&1 | tee ${import_file}.out ; done
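
The generator above skips any data file that already has a .done marker, but nothing in these scripts creates one. One way to add that (an assumption, not part of the original scripts; the generated import scripts don't propagate SQL errors through their exit status, so check the .out logs before trusting the markers):

set -o pipefail   # let the if below see the import script's exit status rather than tee's

for import_file in *_import.sh ; do
    if ./$import_file 2>&1 | tee ${import_file}.out ; then
        # Hypothetical marker checked by the generator's "already processed" test
        touch "${import_file%_import.sh}.data.done"
    fi
done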

