
I have gone through many links and similar questions related to java.lang.OutOfMemoryError: Java heap space, but none of the solutions resolved my problem. Here is my situation: I have a web application where a user uploads an Excel file containing around 2500 records, and my application reads the contents of this file and inserts them into the database. But after inserting about 700 records I get the exception throwable object caught= java.lang.OutOfMemoryError: Java heap space

The same code works if the file contains 500 or fewer records. The following are my JAVA_OPTS and CATALINA_OPTS variables in catalina.bat:

    JAVA_OPTS=-Xmx1536m -XX:MaxPermSize=128m
    CATALINA_OPTS=-Xms512m -Xmx512m

Can anyone please tell me what can be done to resolve this issue?

  • Check out http://stackoverflow.com/questions/3443937/java-heap-memory-error Commented May 23, 2013 at 9:09
  • What are you using to read the spreadsheet? Commented May 23, 2013 at 9:10
  • Definitely you have to rethink the operations being performed. Don't try to read all ~2500 records and insert them in one step. Make the process iterative: read a portion of the data, insert it into your DB, and repeat until you reach the end (see the sketch after these comments). Commented May 23, 2013 at 9:13
  • @Bob Flannigon : JExcelApi is used for reading Excel file Commented May 23, 2013 at 9:15
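
Below is a minimal sketch of the chunked approach suggested in the comments, assuming JExcelApi (jxl) is used for reading (as mentioned above) and plain JDBC with batched inserts for writing; the JDBC URL, table, and column names are hypothetical, and row 0 is assumed to be a header row.

    import java.io.File;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import jxl.Sheet;
    import jxl.Workbook;

    public class ChunkedExcelImport {

        private static final int BATCH_SIZE = 100; // send rows to the DB every 100 records

        public static void main(String[] args) throws Exception {
            Workbook workbook = Workbook.getWorkbook(new File("records.xls"));
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost/mydb", "user", "pass")) {   // hypothetical connection
                conn.setAutoCommit(false);
                Sheet sheet = workbook.getSheet(0);
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO records (name, value) VALUES (?, ?)")) {  // hypothetical table
                    for (int row = 1; row < sheet.getRows(); row++) {          // row 0 assumed header
                        ps.setString(1, sheet.getCell(0, row).getContents());
                        ps.setString(2, sheet.getCell(1, row).getContents());
                        ps.addBatch();
                        if (row % BATCH_SIZE == 0) {
                            ps.executeBatch();   // push the accumulated rows
                            conn.commit();       // and let them be garbage collected
                        }
                    }
                    ps.executeBatch();           // remaining rows
                    conn.commit();
                }
            } finally {
                workbook.close();                // frees the workbook's in-memory data
            }
        }
    }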

4 Answers


Probably you are loading the entire file into memory. Can you try converting your Excel file to plain-text CSV and then reading it line by line using the Java Scanner class?

import java.io.File;
import java.util.Scanner;

// Pass a File: new Scanner("yourCsvFile.csv") would scan the literal string,
// not the contents of the file.
Scanner scanner = new Scanner(new File("yourCsvFile.csv"));

while (scanner.hasNextLine()) {
    String line = scanner.nextLine();

    // process your line
}
scanner.close();



You are probably using Hibernate to insert the records. Hibernate maintains a session cache of all saved objects and postpones the actual inserts until the transaction commits. You must flush and clear the session every 100 records or so. It is also advisable to disable the second-level cache while doing a massive insert.

There are several other issues related to batch inserts with Hibernate, and the official documentation on batch processing is mandatory reading for anyone implementing it.
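
A minimal sketch of that flush-and-clear pattern, assuming a Hibernate SessionFactory is available and a hypothetical mapped Record entity; the batch size of 100 is illustrative:

    import java.util.List;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;

    public class BatchInserter {

        // Inserts records in batches, flushing and clearing the session so the
        // saved objects do not accumulate in the first-level cache.
        public static void insert(SessionFactory sessionFactory, List<Record> records) {
            Session session = sessionFactory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                int count = 0;
                for (Record record : records) {      // Record is a hypothetical mapped entity
                    session.save(record);
                    if (++count % 100 == 0) {        // ideally matches hibernate.jdbc.batch_size
                        session.flush();             // execute the pending inserts
                        session.clear();             // detach them, freeing heap
                    }
                }
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }
    }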



Did you close your file stream? If you don't close the file stream (and the workbook object built from it), those objects remain reachable and the heap keeps growing until an OutOfMemoryError is thrown. So check that you close the file stream when you are done with it!
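
For example, a sketch assuming JExcelApi is used (as mentioned in the question comments); try-with-resources closes the input stream, and the jxl Workbook is closed explicitly:

    import java.io.FileInputStream;
    import java.io.InputStream;

    import jxl.Workbook;

    public class CloseStreamExample {

        public static void process(String path) throws Exception {
            // try-with-resources guarantees the stream is closed even on error
            try (InputStream in = new FileInputStream(path)) {
                Workbook workbook = Workbook.getWorkbook(in);
                try {
                    // ... read the sheet and insert the rows ...
                } finally {
                    workbook.close();   // releases the workbook's in-memory data
                }
            }
        }
    }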



can anyone please tell me what can be done to resolve this issue?

It sounds to me like you have a memory leak (i.e. a bug) in your application. Without seeing your code, we can't tell you where it is. However, the answers to existing questions about finding memory leaks in Java programs explain how to track one down.


I should point out that you have specified two different values for the heap size in the JAVA_OPTS and CATALINA_OPTS variables. Only one of these is going to take effect, and which one depends on the details of the script used to start Tomcat.

However, a properly written application shouldn't run out of memory while uploading data from a spreadsheet. A heap size of 512 MB should be plenty.
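
For reference, a sketch of one way to make the setting consistent, assuming Tomcat on Windows: catalina.bat will call an optional setenv.bat in the bin directory if it exists, so the heap options can be set in one place (the values below are illustrative):

    REM %CATALINA_BASE%\bin\setenv.bat  (create it if it does not exist)
    set CATALINA_OPTS=-Xms512m -Xmx512m -XX:MaxPermSize=128m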

1 Comment

Thanks for your answer. Can you tell me what changes I should make in order to set the heap size to 512 MB? My Tomcat manager always shows Max memory: 247.50 MB for the JVM.
