Scenario:
Consider a scenario where the same Teradata table must be used as both source and target in a single dataflow.
This can cause a table lock in the Teradata database, and the job will suspend without showing any progress.
Here the table bods_region is used as both source and target, which causes the job to suspend.
Resolution:
To avoid this issue, we can split the main dataflow into sub dataflows. This is achieved by adding a Data_Transfer transform to the dataflow.
Here the Data_Transfer transform (staging through the DT_Test table) divides the execution into multiple sub dataflows (which can be viewed in the 'Optimized SQL' display).
This resolves the Teradata table lock issue: after the first sub dataflow completes, the lock on the bods_region table is released, so the second sub dataflow can load data into the target successfully.
This resolution can be applied in any scenario where a lock occurs because of a simultaneous read/write on the same table.
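The pattern the Data_Transfer transform applies can be sketched outside of BODS as a two-step staged load: first copy the source rows into a transfer table and commit (releasing the read lock), then load the target from the transfer table in a separate step. The sketch below uses Python with SQLite purely for illustration; the table names bods_region and DT_Test follow this article, the column names and the UPPER() transformation are made up for the example, and SQLite's locking behavior differs from Teradata's.

```python
import sqlite3

# Illustration of the staged-load pattern behind the Data_Transfer transform.
# SQLite stands in for Teradata here; only the two-step structure matters.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE bods_region (region_id INTEGER, region_name TEXT)")
cur.executemany("INSERT INTO bods_region VALUES (?, ?)",
                [(1, "north"), (2, "south")])
conn.commit()

# Sub dataflow 1: read the source into the transfer table, then commit.
# After this commit, the lock held for reading bods_region is released.
cur.execute(
    "CREATE TABLE DT_Test AS "
    "SELECT region_id, UPPER(region_name) AS region_name FROM bods_region"
)
conn.commit()

# Sub dataflow 2: load the target (the same table) from the transfer table
# in a separate transaction, so no simultaneous read/write occurs.
cur.execute("DELETE FROM bods_region")
cur.execute("INSERT INTO bods_region SELECT * FROM DT_Test")
conn.commit()

rows = cur.execute(
    "SELECT * FROM bods_region ORDER BY region_id"
).fetchall()
print(rows)  # -> [(1, 'NORTH'), (2, 'SOUTH')]
```

Because each step runs in its own transaction, the source table is never read and written at the same time, which is exactly why the second sub dataflow in the article no longer blocks.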