
I'm currently experimenting with storing large files in a MySQL 5.5 database using Java. My main class is called FileDatabaseTest. It has the following method:

import java.sql.*;
import java.io.*;

...

public class FileDatabaseTest {

...

private void uploadToDatabase(File file, String description) {
    PreparedStatement stmt = null;
    FileInputStream in = null;
    try {
        in = new FileInputStream(file); // may throw FileNotFoundException or SecurityException
        stmt = connection.prepareStatement(
            "INSERT INTO FILES (FILENAME, FILESIZE, FILEDESCRIPTION, FILEDATA) " +
                "VALUES (?, ?, ?, ?)");
        stmt.setString(1, file.getName());
        stmt.setLong(2, file.length());
        stmt.setString(3, description);
        stmt.setBinaryStream(4, in);
        stmt.executeUpdate();
        updateFileList();
    } catch (SQLException e) {
        e.printStackTrace();
    } catch (FileNotFoundException e) { // thrown by the FileInputStream constructor
        e.printStackTrace();
    } catch (SecurityException e) { // thrown by the FileInputStream constructor
        e.printStackTrace();
    } finally {
        // close both resources even when the insert fails
        try { if (stmt != null) stmt.close(); } catch (SQLException ignored) {}
        try { if (in != null) in.close(); } catch (IOException ignored) {}
    }
}

...

}

The database has only one table, "FILES", with the following columns:

ID - AUTO_INCREMENT, PRIMARY KEY
FILENAME - VARCHAR(100)
FILESIZE - BIGINT
FILEDESCRIPTION - VARCHAR(500)
FILEDATA - LONGBLOB
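The column list above corresponds roughly to this DDL (a sketch; column types follow the description, but the exact engine and constraints are assumptions):

```sql
CREATE TABLE FILES (
    ID              BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    FILENAME        VARCHAR(100),
    FILESIZE        BIGINT,
    FILEDESCRIPTION VARCHAR(500),
    FILEDATA        LONGBLOB
);
```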

The program works fine when uploading small documents, but when I upload files of around 20 MB, the upload process is very slow. So I tried wrapping the FileInputStream in a BufferedInputStream, as in the following code:

stmt.setBinaryStream(4, new BufferedInputStream(new FileInputStream(file)));

The upload process became very fast; it's like just copying the file to another directory. But when I tried to upload files larger than 400 MB, I got the following error:

Exception in thread "Thread-5" java.lang.OutOfMemoryError: Java heap space
    at com.mysql.jdbc.Buffer.ensureCapacity(Buffer.java:156)
    at com.mysql.jdbc.Buffer.writeBytesNoNull(Buffer.java:514)
    at com.mysql.jdbc.PreparedStatement.escapeblockFast(PreparedStatement.java:1169)
    at com.mysql.jdbc.PreparedStatement.streamToBytes(PreparedStatement.java:5064)
    at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:2560)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2401)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2345)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2330)
    at FileDatabaseTest$2.run(FileDatabaseTest.java:312)
    at java.lang.Thread.run(Thread.java:662)

So I tried using an embedded Apache Derby database instead of MySQL, and I didn't get the error. I was able to upload 500 MB to 1.5 GB files to the Derby database using the BufferedInputStream. I also observed that when using the BufferedInputStream with the MySQL server to upload large files, the JVM eats a lot of memory, whereas with Derby the JVM's memory usage stayed at around 85 MB to 100 MB.

I am relatively new to MySQL and I am just using its default configuration. The only thing I changed is the "max_allowed_packet" size, so I can upload files up to 2 GB to the database. So I wonder where the error comes from. Is it a bug in MySQL or in MySQL Connector/J? Or is there something wrong with my code?

What I am trying to achieve here is to be able to upload large files (up to 2 GB) to the MySQL server using Java, without increasing the Java heap space.

  • The MySQL JDBC driver has an option called "maxAllowedPacket", which by default matches the value of the server's max_allowed_packet. So it sounds like the driver is internally allocating enough buffer space for the entire packet. Commented Jan 23, 2012 at 10:59
  • I already tried changing the "maxAllowedPacket" property, but when I change it to a smaller value, I get an error when sending files larger than the value of "maxAllowedPacket". When I change it to a higher value, I get the java.lang.OutOfMemoryError when uploading large files. What I observed is that with MySQL, Java reads the contents of the whole file into memory before executing the SQL statement, and I think that is the cause of the OutOfMemoryError when uploading large files. Is there a way to prevent Java from loading the whole file into memory before sending it to MySQL? Commented Jan 24, 2012 at 3:17
  • Did you solve the problem? Commented Apr 21, 2016 at 7:18
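The "maxAllowedPacket" driver property discussed in the comments can be passed as a JDBC URL parameter. A minimal sketch, assuming a local server and a database named filedb (host, port, database name, and the 16 MB value are all placeholders):

```java
// Sketch: passing Connector/J's "maxAllowedPacket" property via the JDBC URL.
public class MySqlUrl {

    // Builds a JDBC URL that carries the maxAllowedPacket driver property.
    public static String withMaxAllowedPacket(String hostPort, String db, long bytes) {
        return "jdbc:mysql://" + hostPort + "/" + db + "?maxAllowedPacket=" + bytes;
    }

    public static void main(String[] args) {
        // e.g. allow packets up to 16 MB
        System.out.println(withMaxAllowedPacket("localhost:3306", "filedb", 16L * 1024 * 1024));
    }
}
```

The resulting URL would then be passed to DriverManager.getConnection as usual.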

4 Answers


There is another way to resolve this if you don't want to increase your JVM heap size:

First, your MySQL version should be newer than 5.0.

Second, the statement's result set type should be TYPE_FORWARD_ONLY and its concurrency should be CONCUR_READ_ONLY (the default).

Third, include ONE of these lines:

1. statement.setFetchSize(Integer.MIN_VALUE);
2. ((com.mysql.jdbc.Statement) stmt).enableStreamingResults();

Now you will fetch result rows one by one.
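The three steps above can be sketched with plain JDBC (the open connection and the FILES table are assumed from the question; the query is illustrative):

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Sketch: configuring a Connector/J statement so result rows are streamed
// one at a time instead of being buffered in memory.
public class StreamingConfig {

    // Setting the fetch size to Integer.MIN_VALUE is Connector/J's signal
    // to stream rows rather than read the full result set into memory.
    public static void enableStreaming(Statement stmt) throws SQLException {
        stmt.setFetchSize(Integer.MIN_VALUE);
    }

    // Usage sketch (assumes an already-open Connection):
    public static void readFileData(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        enableStreaming(stmt);
        ResultSet rs = stmt.executeQuery("SELECT FILENAME, FILEDATA FROM FILES");
        while (rs.next()) {
            // process each row as it arrives
        }
        rs.close();
        stmt.close();
    }
}
```

Note that this addresses memory use when reading rows back; the upload path still goes through the driver's send buffer.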




Just for the heck of it, try upping your JVM heap size (e.g. with the -Xmx VM argument).

To increase the Java heap size permanently, see: http://javahowto.blogspot.com/2006/06/6-common-errors-in-setting-java-heap.html

1 Comment

I already tried increasing the JVM heap size up to 1 GB, and I still get the same error when I send files over 400 MB.

To increase the JVM heap size when running your Java code in Eclipse:

right-click your Java file
    -> Run As -> Run Configurations -> Arguments -> VM arguments (e.g. -Xmx1024m)



This seems to be more of a MySQL JDBC driver problem. Of course, you might consider GZip compression plus piped I/O.

I also found a terrible workaround: doing the insert in parts:

UPDATE FILES SET FILEDATA = CONCAT(FILEDATA, ?)

We may conclude that, for large files, it is better to store them on disk.

Nevertheless:

final int SIZE = 1024*128;
InputStream in = new BufferedInputStream(new FileInputStream(file), SIZE);
stmt.setBinaryStream(4, in);
stmt.executeUpdate();
updateFileList();
stmt.close();
in.close(); //?

The default buffer size is 8 KB, I think; a larger buffer might show different memory behaviour, maybe shedding some light on the problem.

Closing the stream yourself should not hurt to try.
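The CONCAT-based chunked insert can be sketched as follows. The 1 MB chunk size, the helper names, and the assumption that the row was created beforehand with FILEDATA = '' (CONCAT with NULL yields NULL) are all illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

// Sketch of the UPDATE ... CONCAT chunked upload; the FILES schema is from
// the question, everything else here is a hypothetical helper.
public class ChunkedUpload {

    static final int CHUNK_SIZE = 1024 * 1024; // 1 MB per round trip

    // Pure helper: split a byte array into chunkSize-sized pieces.
    public static List<byte[]> chunk(byte[] data, int chunkSize) {
        List<byte[]> parts = new ArrayList<>();
        for (int off = 0; off < data.length; off += chunkSize) {
            int len = Math.min(chunkSize, data.length - off);
            byte[] part = new byte[len];
            System.arraycopy(data, off, part, 0, len);
            parts.add(part);
        }
        return parts;
    }

    // Append each chunk with UPDATE ... CONCAT so that only one chunk is in
    // memory at a time (assumes the row with this id already exists and its
    // FILEDATA was initialized to the empty string).
    public static void appendChunks(Connection conn, long id, InputStream in)
            throws SQLException, IOException {
        PreparedStatement stmt = conn.prepareStatement(
                "UPDATE FILES SET FILEDATA = CONCAT(FILEDATA, ?) WHERE ID = ?");
        byte[] buf = new byte[CHUNK_SIZE];
        int read;
        while ((read = in.read(buf)) > 0) {
            byte[] part = new byte[read];
            System.arraycopy(buf, 0, part, 0, read);
            stmt.setBytes(1, part);
            stmt.setLong(2, id);
            stmt.executeUpdate();
        }
        stmt.close();
    }
}
```

Each executeUpdate sends only one chunk, so the driver's send buffer stays bounded regardless of the file size, at the cost of one round trip per chunk.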

