I have been designing a Java server, connected via PHP, that accepts a series of protein chains and performs computations on each of them. The computations are handled by external Perl scripts that return data to Java, which is then inserted into a MySQL database.

Java successfully executes the Perl scripts and gets the data back; the problem lies in inserting that data into MySQL. Java throws an OutOfMemoryError (it runs out of heap space). After Googling around, most solutions amount to "increase the heap size". The problem with this is that for 10 protein chains, increasing the heap size to 500 MB seems to solve the problem, but for 20 chains the heap size has to be increased again, and for 1000 chains (what the system should be able to handle) increasing the heap size is not an option.

My question is this:

  • What is a MySQL INSERT doing that causes the heap to fill, and why is the JVM not cleaning up the excess data? The MySQL query is not a prepared statement, just a simple INSERT via executeUpdate. We are using the driver mysql-connector-java-5.1.13-bin. (A sketch of this insert pattern follows the comments below.)
  • Your code seems to leak memory. Please post some code. Commented Jul 16, 2011 at 0:10
  • What do you see when you use a memory profiler like VisualVM? Commented Jul 16, 2011 at 0:13
  • It's probably your code, not the insert. Commented Jul 16, 2011 at 0:13
  • Did you close all your ResultSets and PreparedStatements in a finally block? These objects are never GC'ed and stay in used memory until the JVM instance exits. Commented Jul 16, 2011 at 0:56
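
For reference, a minimal sketch of the pattern the question and the last comment describe: a plain Statement insert via executeUpdate, with the statement and connection closed in a finally block so they cannot pile up on the heap. The table, the column names and the connection details are placeholders, not taken from the original code.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class InsertExample {
    // Placeholder connection details - adjust for your environment.
    private static final String URL  = "jdbc:mysql://localhost:3306/proteins";
    private static final String USER = "user";
    private static final String PASS = "password";

    public static void insertResult(String chainId, String resultData) throws SQLException {
        Connection conn = null;
        Statement stmt = null;
        try {
            conn = DriverManager.getConnection(URL, USER, PASS);
            stmt = conn.createStatement();
            // A plain INSERT via executeUpdate, as described in the question.
            stmt.executeUpdate("INSERT INTO results (chain_id, data) VALUES ('"
                    + chainId + "', '" + resultData + "')");
        } finally {
            // Close in reverse order of creation; otherwise statements and
            // connections (and their buffers) linger on the heap.
            if (stmt != null) try { stmt.close(); } catch (SQLException ignore) {}
            if (conn != null) try { conn.close(); } catch (SQLException ignore) {}
        }
    }
}

A PreparedStatement with bound parameters would be safer against quoting problems, but the closing discipline is the same either way.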

2 Answers

Enable the -XX:+HeapDumpOnOutOfMemoryError option and when the server runs out of memory have a look at the .hprof file with a tool like Eclipse MAT. MAT will tell you who is using the memory, and it will try to figure out leaks as well.
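
If you would rather not restart with new flags, the same behaviour can also be enabled (and a dump captured on demand) from inside the JVM via the HotSpot-specific diagnostic MXBean. This is a rough sketch, assuming Java 7+ on a HotSpot JVM; the dump path is a placeholder and the target file must not already exist.

import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class HeapDumpHelper {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean diagnostics =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);

        // Equivalent to starting the JVM with -XX:+HeapDumpOnOutOfMemoryError.
        diagnostics.setVMOption("HeapDumpOnOutOfMemoryError", "true");

        // Or write an .hprof file immediately; "true" dumps only live objects.
        diagnostics.dumpHeap("/tmp/server.hprof", true);
    }
}

Either way, the resulting .hprof file is what you open in Eclipse MAT.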

2 Comments

That disables it. Perhaps you mean -XX:+HeapDumpOnOutOfMemoryError ;)
The options are often listed with their default +/- values. ;)

It wouldn't be MySQL that is causing the memory blowout in the JVM - only in-JVM operations can do that.

Your code is probably structured "block-wise", looking something like this pseudocode:

List<ProteinResult> results = new ArrayList<ProteinResult>();
for (int i = 0; i < 1000; i++)
    results.add(callPerl(proteins[i]));     // every result is held in memory...

for (ProteinResult pr : results)
    db.execute("insert into ...", pr);      // ...until the last insert finishes

If so, turn it into a "stream style" that does one thing at a time and doesn't hold onto objects longer than it has to; this approach scales:

for (int i = 0; i < 1000; i++)
    db.execute("insert into ...", callPerl(proteins[i]));   // compute, insert, discard
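
Applied directly to JDBC, the stream style could look like the sketch below. The table, the column names and callPerl are placeholders rather than the asker's actual code; a single PreparedStatement is reused across the loop and closed in a finally block, so nothing outlives the iteration that produced it.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class StreamingInsert {

    public static void insertAll(Connection conn, String[] proteins) throws SQLException {
        PreparedStatement ps = null;
        try {
            // One statement, reused for every chain.
            ps = conn.prepareStatement(
                    "INSERT INTO results (chain_id, data) VALUES (?, ?)");
            for (String protein : proteins) {
                String result = callPerl(protein);   // compute one result...
                ps.setString(1, protein);
                ps.setString(2, result);
                ps.executeUpdate();                  // ...insert it, then let it go
            }
        } finally {
            if (ps != null) try { ps.close(); } catch (SQLException ignore) {}
        }
    }

    // Stand-in for the external Perl call described in the question.
    private static String callPerl(String protein) {
        return "result for " + protein;
    }
}

Batching with addBatch()/executeBatch() is another option for cutting down round trips, as long as batches are flushed every few hundred rows instead of being accumulated for all 1000 chains.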

If reviewing the code at face value doesn't yield the answer, you can find out what objects are chewing up memory by using a JVM profiler, which hooks into the running JVM and can tell you things like how many objects are being created per second and what classes they are. Your memory leak should leave plenty of clues for a profiler to find.

Here's one you can check out: JProfiler
