I have a huge table that I'm trying to stream out to a file. However, no matter what I try, the JDBC driver seems to pull the entire result set into memory at once, and the JVM runs out of memory. I've read many posts here and elsewhere and I think I'm doing this "right", so why do I keep running out of memory?
Here's my code:
Connection conn = DriverManager.getConnection(dbUrl, dbUser, dbPassword);
conn.setAutoCommit(false);
conn.setReadOnly(true);

Statement ps = conn.createStatement(
        ResultSet.TYPE_FORWARD_ONLY,
        ResultSet.CONCUR_READ_ONLY,
        ResultSet.HOLD_CURSORS_OVER_COMMIT);
ps.setFetchSize(10);

String sql = "SELECT * FROM BIGTABLE "
        + "WHERE '20150401' BETWEEN startdate AND enddate";
ResultSet rs = ps.executeQuery(sql);
writeResultSet(os, rs);
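In case it helps, writeResultSet is nothing fancy; it just walks the ResultSet once and writes each row out as delimited text, roughly like this (simplified, no quoting or escaping):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;

public class ResultSetWriter {

    // Simplified sketch of my writeResultSet: stream one row at a time,
    // never holding more than the current row in memory ourselves.
    public static void writeResultSet(OutputStream os, ResultSet rs)
            throws SQLException, IOException {
        ResultSetMetaData md = rs.getMetaData();
        int cols = md.getColumnCount();
        while (rs.next()) {                          // advance one row at a time
            StringBuilder row = new StringBuilder();
            for (int i = 1; i <= cols; i++) {
                if (i > 1) row.append(',');
                row.append(rs.getString(i));         // naive: no CSV quoting
            }
            row.append('\n');
            os.write(row.toString().getBytes(StandardCharsets.UTF_8));
        }
    }
}
```

So nothing on my side accumulates rows; whatever is being buffered is happening inside the driver before this method ever sees the ResultSet.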
The code never gets past executeQuery before the OutOfMemoryError is thrown.
In case it matters, this is happening in a separate thread while streaming to an open ZipOutputStream. One entry has already been streamed out successfully; I never get as far as streaming this table.
I'm working with Postgres 9.3.5, and for testing I'm limiting the JVM heap to 128 MB. I could increase the heap size, but I still don't think I should be running into this issue when streaming the results.
[FYI: I've stripped out the try/catch blocks and the like for clarity.]