
I have recently taken data dumps from an Oracle database. Many of them are large (~5 GB). I am trying to insert the dumped data into another Oracle database by running the following in SQL Developer:

@C:\path\to\table_dump1.sql;
@C:\path\to\table_dump2.sql;
@C:\path\to\table_dump3.sql;
...

but it is taking a very long time; even a single SQL file takes more than a day to complete.
Is there a faster way to get this done?

  • What kind of dump do you have? Are these files a list of INSERT statements, or something else? Commented Oct 6, 2016 at 7:26

1 Answer


SQL*Loader is my favorite way to bulk load large data volumes into Oracle. Use the direct-path load option for maximum speed, but understand the impacts of direct-path loads (for example, all data is inserted past the high water mark, which is fine if you truncate your table first). It even has a tolerance for bad rows, so if your data has "some" mistakes the load can still succeed.

SQL*Loader can suspend indexes and build them all at the end, which makes bulk inserting very fast.

Example of a SQL*Loader call:

$SQLDIR/sqlldr /@MyDatabase direct=false silent=feedback \
    control=mydata.ctl log=/apps/logs/mydata.log bad=/apps/logs/mydata.bad \
    rows=200000

And the mydata.ctl would look something like this:

LOAD DATA
INFILE '/apps/load_files/mytable.dat'
INTO TABLE my_schema.my_table
FIELDS TERMINATED BY "|"
 (ORDER_ID,
  ORDER_DATE,
  PART_NUMBER,
  QUANTITY)
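
For reference, the matching /apps/load_files/mytable.dat would just be pipe-delimited rows with those four fields in order. A couple of hypothetical lines (assuming the date strings match the target database's default date format, e.g. DD-MON-YY) might look like:

10001|01-OCT-16|ABC-123|5
10002|02-OCT-16|XYZ-987|12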

Alternatively... if you are just copying the entire contents of one table to another across databases, you can do the following, provided your DBA sets up a database link (a 30-second process) and your database has enough redo space to accommodate the load.

truncate table my_schema.my_table;

insert into my_schema.my_table
select * from my_schema.my_table@my_remote_db;

Using the /*+ APPEND */ hint on the INSERT still lets it use a direct-path insert.
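
For example, applied to the statement above (same table and database link as before), the hint would be placed like this:

insert /*+ append */ into my_schema.my_table
select * from my_schema.my_table@my_remote_db;
commit;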


4 Comments

My dump files are just millions of lines of INSERT SQL statements. Can I still use SQL*Loader to load the data?
No, not exactly... instead of insert statements, you want to change the files to be either fixed-width or delimited files. What is the origin of the data? My guess would be it's easier to extract it as a data file than a series of SQL statements... If not, you could script the SQL into a data file (using Perl, for example)
All I have are files that contain millions of insert statements. I guess I should convert them into CSV format or something, as you suggested.
Where did these SQL inserts come from? It seems to me if you hijack that source, you can change the output to whatever you want
