SAP/Sybase’s documentation isn’t very clear for new IQ DBAs and developers. One such gap is simply extracting data from the warehouse into files that can be loaded into another IQ system, for the cases where load table from location isn’t available or isn’t practical.
Assumptions: the $ISQL environment variable holds the isql command line (including its connection parameters), and the scripts are run from the export/import directories shown in the paths below.
The simplest method is a generator script that writes one extract .sql file per table:
#!/usr/bin/env bash
EXPORT_DIR=/dba/backup/sybbackup/bkp/export/data_export

if [[ ! -d $EXPORT_DIR ]]; then
    mkdir -p $EXPORT_DIR
fi

# Build the list of owner.table names to export
TABLES=$( echo "set nocount on
go
select convert(varchar(80), user_name(creator) + '.' + table_name) from systable where user_name(creator) in ('JF') and table_type = 'BASE'
go
exit" | $ISQL -h -b | sed -e '/affected/d' -e '/---/d' -e '/^$/d' )

for table in ${TABLES}; do
    sql_file=${EXPORT_DIR}/${table}_export.sql
    echo
    echo "Generating ${sql_file}"
    echo "set temporary option Temp_Extract_Name1 = '${EXPORT_DIR}/${table}.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option Temp_Extract_Size1 = 1073741824
go
select * from ${table}
go
set temporary option Temp_Extract_Name1 = ''
go" > $sql_file
done
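For a hypothetical table JF.customer, the generated file /dba/backup/sybbackup/bkp/export/data_export/JF.customer_export.sql would look like this:

```sql
set temporary option Temp_Extract_Name1 = '/dba/backup/sybbackup/bkp/export/data_export/JF.customer.data'
set temporary option Temp_Extract_Name2 = ''
set temporary option Temp_Extract_Binary = 'on'
set temporary option Temp_Extract_Swap = 'off'
set temporary option Temp_Extract_Size1 = 1073741824
go
select * from JF.customer
go
set temporary option Temp_Extract_Name1 = ''
go
```

Setting Temp_Extract_Name1 redirects the result set of the following select into the named binary file; resetting it to '' turns the extract facility back off for the connection.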
Now that we have the script files, let’s extract the data by running them through isql, compressing each exported file with gzip. You can cancel the export at any time with Ctrl-C and restart it; tables that already have a compressed data file are skipped:
for TABLE in *.sql; do
    datafile=$( echo $TABLE | sed -e 's/_export\.sql$/.data/' )
    echo $datafile
    gzipfile=${datafile}.gz
    if [ -f $gzipfile ]; then
        echo "$gzipfile already exists"
    else
        $ISQL -i $TABLE > $TABLE.out 2>&1
        gzip -1 $datafile
    fi
done
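Before moving on, it can be handy to see what a restarted run would still have to do. This small helper (my addition, not part of the original scripts) lists every table whose generated .sql file has no matching compressed data file yet:

```shell
# List tables still waiting to be exported: any *_export.sql file
# without a matching .data.gz is unfinished work for the next run.
for f in *_export.sql; do
    [ -f "$f" ] || continue                 # nothing generated yet
    gz=${f%_export.sql}.data.gz
    [ -f "$gz" ] || echo "still to export: ${f%_export.sql}"
done
```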
Now that the data is exported, imagine that you have copied the files to another system. How do you import that data, assuming the tables have already been created? Easy: we generate a set of import scripts, one per table.
LOCAL_PATH=/dba/backup/sybbackup/bkp/import/data_import

for TABLE_gzip in *.gz; do
    datafile=$( echo $TABLE_gzip | sed -e 's/\.gz$//' )
    TABLE_FILE=$( echo $TABLE_gzip | sed -e 's/\.data\.gz$//' )
    TABLE_OWNER=$( echo $TABLE_FILE | cut -d . -f1 )
    TABLE=$( echo $TABLE_FILE | cut -d . -f2 | sed -e 's/_export$//' )
    script=${TABLE_OWNER}.${TABLE}_import.sh
    if [ -f ${datafile}.done ]; then
        echo "${datafile} already processed"
    else
        # ===================================================================
        # Generate the load command to load the file
        # ===================================================================
        {
            echo "#!/usr/bin/env bash"
            echo ". /dba/code/syb/.setenv"
            echo ""
            echo "echo \"uncompressing $TABLE_gzip\""
            echo "gzip -dc $TABLE_gzip > $datafile"
            echo ""
            echo "echo \"importing ${TABLE_OWNER}.${TABLE}\""
            echo '$ISQL -b <<EOF'
            echo "select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'"
            echo "go"
            echo "truncate table ${TABLE_OWNER}.${TABLE}"
            echo "go"
            echo "commit"
            echo "go"
            echo "SET TEMPORARY OPTION IDENTITY_INSERT = '${TABLE_OWNER}.${TABLE}'"
            echo "go"
            echo "load table ${TABLE_OWNER}.${TABLE}"
            echo "("
            ../gen_iq_col_list_w_null_byte.sh ip00 $TABLE_OWNER $TABLE | sed -e '/row affected/d;s/ *$//;/^$/d'
            echo ")"
            echo "from '${LOCAL_PATH}/${datafile}'"
            echo "escapes off format binary"
            echo "go"
            echo "commit"
            echo "go"
            echo "select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=${TABLE_OWNER}.${TABLE}'"
            echo "go"
            echo "SET TEMPORARY OPTION IDENTITY_INSERT = ''"
            echo "go"
            echo "EOF"
            echo ""
            echo "rm -f $datafile"
        } > $script
        chmod u+x $script
    fi
done
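For a hypothetical table JF.customer, the SQL that a generated import script feeds to isql looks roughly like this (the column list inside the load comes from the gen_iq_col_list_w_null_byte.sh helper, whose exact output depends on the table definition):

```sql
select 'Start datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customer'
go
truncate table JF.customer
go
commit
go
SET TEMPORARY OPTION IDENTITY_INSERT = 'JF.customer'
go
load table JF.customer
(
-- column list emitted by gen_iq_col_list_w_null_byte.sh
)
from '/dba/backup/sybbackup/bkp/import/data_import/JF.customer.data'
escapes off format binary
go
commit
go
select 'End datetime'=convert(char(25), getdate(), 119), 'TABLENAME=JF.customer'
go
SET TEMPORARY OPTION IDENTITY_INSERT = ''
go
```

The IDENTITY_INSERT option is needed so that identity column values from the source system are loaded verbatim rather than regenerated.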
If the target system has a different endianness (e.g. extracting on little-endian Linux x86-64 and loading on big-endian AIX), replace
echo "escapes off format binary"
with
echo "escapes off format binary byte order high"
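If the import scripts were already generated for a little-endian target, they can also be patched in place rather than regenerated. This one-liner is my addition (it assumes GNU sed for the -i flag):

```shell
# Patch every generated import script so the load uses big-endian
# byte order, as needed when the loading host is big-endian (e.g. AIX).
for f in *_import.sh; do
    [ -f "$f" ] || continue
    sed -i 's/^escapes off format binary$/escapes off format binary byte order high/' "$f"
done
```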
We simply need to run each import script:
for import_file in *_import.sh; do ./$import_file 2>&1 | tee ${import_file}.out; done
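Note that the generator skips data files that already have a .done marker, but nothing above actually creates one. A sketch of a runner (my assumption, not the original author’s) that records completion per table so interrupted imports can be restarted safely:

```shell
# Run every generated import script; on success, drop a ".done" marker
# next to the data file name so the generator skips it on a rerun.
for import_file in *_import.sh; do
    [ -f "$import_file" ] || continue            # no scripts generated yet
    datafile=${import_file%_import.sh}.data      # e.g. JF.customer.data
    if ./"$import_file" > "${import_file}.out" 2>&1; then
        touch "${datafile}.done"                 # safe to skip on restart
    else
        echo "import failed for ${import_file}, see ${import_file}.out"
    fi
done
```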