INTRODUCTION:

Often, while migrating data between systems using Interactive SQL (Sybase), you may find that the data type definitions of some columns are altered or mismatched when you import the table into Data Services for use in a dataflow. In this blog post, we will walk through what needs to be done before using such tables in a dataflow.

 

MAIN ISSUE:

On many occasions, we may need to import/export tables between systems in ISQL. Note that in such data migrations, the metadata and data type definitions may turn out to be different after we import those tables into Data Services. In a frequent scenario, a column of data type VARCHAR with a length greater than 120 is taken as data type LONG, not VARCHAR, in Data Services. In this blog post, we will look at this issue, starting from the Import/Export Wizard in ISQL, and at the process to be followed in Data Services before actually using those tables in dataflows.

 

SOLUTION:

First, we need to export a table from ISQL using the Export Wizard. Run your query in ISQL to extract the content to be exported to another database. After the query completes successfully, click the Data tab in the menu and choose the "Export" option.
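For example, an extraction query of the kind you might run before choosing the Export option could look like the sketch below. The table and column names are purely illustrative, not from any real system:

```sql
-- Hypothetical extraction query run in ISQL before choosing Data > Export.
-- "comments" is assumed to be VARCHAR(255), i.e. longer than 120 characters,
-- which is the case that triggers the LONG mapping in Data Services.
SELECT order_id,
       customer_name,
       comments
  FROM dba.sales_orders
 WHERE order_date >= '2020-01-01';
```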

The Export Wizard screen will pop up with various export options, as shown in the screenshot below.


 

In the Export Wizard screen, select the "Export to a Database" option and click "Next". On the next screen, enter the credentials of the database to which the table needs to be exported and click "Next".

On the next screen, if you want to overwrite an existing table, select the "Overwrite an existing table" radio button, choose the database name from the dropdown, select the table, and click "Export". If you want to create a new table in the database, select the "Create a new table" radio button, enter the database name, table name, and schema owner, and click "Export" as shown below.


 

Once all the tables have been exported successfully, we need to import them into our datastore in Data Services. If you created a new table in the Export Wizard, log in to the Data Services system (the same as our target system in the Export Wizard), right-click your datastore, and click "Import By Name" as shown below:


 

Next, enter the schema name and the exact table name and click "Import". Note that if you opted to overwrite an existing table, the table that already existed in the datastore will have been overwritten with the exported data. As a best practice, always truncate the existing table before exporting such data in ISQL.
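The truncate step mentioned above is a single statement in ISQL; the table name below is only a placeholder for your own target table:

```sql
-- Run in ISQL against the target database before re-exporting,
-- so the overwrite does not leave stale rows behind.
TRUNCATE TABLE dba.sales_orders;
```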

Now, create a dataflow and drag and drop the table as either source or target. On viewing the data, you will find that a VARCHAR column with a length greater than 120 has been taken as LONG and not as VARCHAR, and the data in the corresponding fields is lost, as shown in the screenshot below.


 

Note that trying to use the CAST function in the mapping section of the field, or trying to manually change the data type in a Query transform, will end in errors. To overcome this issue, double-click the exported table's name in the list in the datastore. In the pop-up with the metadata details, you can manually edit the fields typed as LONG and change them to VARCHAR(X), with "X" being the length of your data type, as shown below.
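To pick a safe value for X before editing the metadata, you can check the longest value actually stored in the affected column back in ISQL. The names below are illustrative, matching no particular system:

```sql
-- Find the longest value currently stored in the affected column,
-- so that VARCHAR(X) in the Data Services metadata is large enough.
SELECT MAX(LENGTH(comments)) AS max_comment_length
  FROM dba.sales_orders;
```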


This changes the metadata of your table before you use it in any of the dataflows, as shown in the screenshot below.


 

CONCLUSION:

Thus, with this method you can manually edit the metadata in the table definition and thereby retrieve the complete dataset as desired. I hope this blog post has given you an idea of how to set up the metadata for a table imported or exported via ISQL. Feel free to share your views, and if you have any other workarounds for this issue, share your thoughts.