Below are two techniques for loading data from flat files into Oracle tables.
- 1) SQL*Loader:
- Place the flat file (.txt or .csv) in the desired location.
- Create a control file (e.g. load_data.ctl; the name is just an example):

LOAD DATA
INFILE 'mytextfile.txt' -- file containing the table data; specify the path correctly, it could be a .csv as well
APPEND INTO TABLE oracle_table -- or TRUNCATE, based on the requirement
FIELDS TERMINATED BY ',' -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
- Now run Oracle's sqlldr utility from the operating system command prompt (not the SQL prompt):
sqlldr username/password control=filename.ctl
- Verify the load by selecting the data from the target table:
select * from oracle_table;
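As a quick sanity check of the input format, a small script can generate a comma-delimited file of the kind such a control file describes; the file name and field values below are made up for illustration:

```python
import csv

# Sample rows matching a three-field, comma-delimited layout
rows = [
    (1, "Alice", "Sales"),
    (2, "Bob, Jr.", "Engineering"),  # embedded comma -> csv quotes the field,
                                     # matching OPTIONALLY ENCLOSED BY '"'
]
with open("mytextfile.txt", "w", newline="") as f:
    csv.writer(f).writerows(rows)

content = open("mytextfile.txt").read()
print(content)
```

Note how the csv module quotes only the field containing the delimiter, which is exactly the behavior the optional-enclosure clause tolerates.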
- 2) External Tables:
- Place the flat file (.txt or .csv) in the desired location.
- Create a directory object pointing to the path where the source file is kept:
create or replace directory ext_dir as '/home/rene/ext_dir';
- After granting the appropriate privileges on that directory to the user, create the external table. The column names, types, and file name below are illustrative:

create table ext_table_csv (
  field1 number,
  field2 varchar2(50),
  field3 varchar2(50)
)
organization external (
  type oracle_loader
  default directory ext_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('mytextfile.csv')
)
reject limit unlimited;
- Verify the data by selecting it from the external table:
select * from ext_table_csv;
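To make the rows updatable and indexable, copy them into a regular heap table. A sketch of the required grant and the copy follows; the grant must be run as a privileged user, and the user and table names are assumptions for illustration:

```sql
-- Run as a DBA: let the loading user read the directory (write is needed
-- so the access driver can create its log and bad files there)
grant read, write on directory ext_dir to scott;

-- Copy the external data into an ordinary table
create table emp_stage as
  select * from ext_table_csv;
```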
The external tables feature is a complement to the existing SQL*Loader functionality. It allows you to:
- Access data in external sources as if it were in a table in the database.
- Merge a flat file with an existing table in one statement.
- Sort a flat file on its way into a table, e.g. one you want compressed nicely.
- Do a parallel direct path load without manually splitting up the input file or writing multiple control scripts.
- External tables are read-only.
- No data manipulation language (DML) operations or index creation are allowed on an external table.
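The read-only restriction surfaces as a runtime error if DML is attempted, using the table from the earlier example:

```sql
-- DML against an external table fails:
delete from ext_table_csv;
-- ORA-30657: operation not supported on external organized table
```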
Using external tables (loaded with an ordinary INSERT ... SELECT, rather than sqlldr) you can also:
- Load the data from a stored procedure or trigger (an INSERT works there; sqlldr cannot be invoked inside the database)
- Do multi-table inserts
- Flow the data through a pipelined PL/SQL function for cleansing/transformation
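For instance, a multi-table insert can route rows from an external table into two targets in one statement; the target table and column names here are hypothetical:

```sql
insert all
  into emp_current (emp_id, emp_name)            values (field1, field2)
  into emp_audit   (emp_id, emp_name, loaded_on) values (field1, field2, sysdate)
select field1, field2
from ext_table_csv;
```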
Comparison for data loading
To make the loading operation faster, the external table's degree of parallelism can be set to any suitable number, e.g. 4 (alter table ext_table_csv parallel 4;).
With that setting, the database divides the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize the same load using SQL*Loader, you would have to manually divide your input file into multiple smaller files and start a separate sqlldr session for each.
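The manual file-splitting that SQL*Loader would require can be sketched as a small script; the file name and chunk count are examples:

```python
# Sketch: split a delimited input file line-wise into N smaller files so
# that N separate sqlldr sessions could load them in parallel.
def split_file(path, n_chunks):
    with open(path) as f:
        lines = f.readlines()
    chunk = (len(lines) + n_chunks - 1) // n_chunks  # ceiling division
    parts = []
    for i in range(n_chunks):
        part_name = f"{path}.part{i}"
        with open(part_name, "w") as out:
            out.writelines(lines[i * chunk:(i + 1) * chunk])
        parts.append(part_name)
    return parts

# Build a 10-line sample file, then split it four ways
with open("mytextfile.txt", "w") as f:
    f.write("".join(f"{i},name{i}\n" for i in range(10)))
parts = split_file("mytextfile.txt", 4)
print(parts)  # four part files; each could be fed to its own sqlldr run
```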
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from an external table into a regular Oracle table with a plain INSERT ... SELECT (or over a database link, if the target schema is remote) and index it there.