
I've written some simple code that reads a table from an Oracle DB.

When I run it on a very big table, it consumes a huge amount of memory.

I thought that setting the fetch size would optimize memory usage (that's what happens when using it on SQL Server), but it didn't. I tried various values, from 10 to 100000.

I can't see how to accomplish a simple task: exporting a very big Oracle table to a CSV file.

I use ojdbc6.jar as the driver.

I also use

connection.setAutoCommit(false);
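
Stripped down, my code looks roughly like this (connection details, table name, and columns are placeholders):

    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExportToCsv {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 PrintWriter out = new PrintWriter("big_table.csv")) {
                conn.setAutoCommit(false);
                // scrollable, read-only result set
                Statement stmt = conn.createStatement(
                        ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
                stmt.setFetchSize(10000); // tried values from 10 to 100000
                ResultSet rs = stmt.executeQuery("SELECT * FROM BIG_TABLE");
                while (rs.next()) {
                    // each row goes straight to the file; nothing is kept in memory
                    out.println(rs.getString(1) + "," + rs.getString(2));
                }
            }
        }
    }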

Any ideas?

  • Show us your Java code. I bet you are storing each row somewhere. It is certainly possible to run an export from a really big table without running out of memory (I'm doing that all the time). Commented Jun 11, 2013 at 15:26

1 Answer


It seems that creating the statement with ResultSet.TYPE_FORWARD_ONLY solved the problem.
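
In other words, replacing the scrollable statement with a forward-only, read-only one. A minimal sketch (the fetch size value is illustrative):

    // A forward-only, read-only statement lets the Oracle driver stream rows
    // in fetch-size batches instead of caching the whole result set client-side.
    Statement stmt = conn.createStatement(
            ResultSet.TYPE_FORWARD_ONLY,
            ResultSet.CONCUR_READ_ONLY);
    stmt.setFetchSize(1000); // rows fetched per network round trip
    ResultSet rs = stmt.executeQuery("SELECT * FROM BIG_TABLE");

As far as I can tell, with a scrollable type such as TYPE_SCROLL_INSENSITIVE the Oracle driver keeps every fetched row cached on the client so the cursor can move backwards, which is why setFetchSize alone had no effect on memory usage.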
