0

I'm trying to export some files from a system and save them to my drive. The problem is that some files are pretty big and I get a Java out-of-memory error.

FileOutputStream fileoutstream = new FileOutputStream(filenameExtension);
fileoutstream.write(dataManagement.getContent(0).getData());
fileoutstream.flush();
fileoutstream.close();

Any recommendation I can try? I added the flush but it made no difference. This code calls the export method, generates the file and saves it. I'm using a cursor to run over the data I'm exporting, not an array. I tried adding more memory, but the files are too big.

1
  • Read and write the file in chunks. Try periodically flushing the output stream after some XXX amount of data is written. Commented May 1, 2013 at 16:23

3 Answers

2

You are loading the whole file into memory before writing it. Instead, you should:

  1. load only a chunk of data
  2. write it
  3. repeat the steps above until you have processed all data.
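The steps above can be sketched as a plain java.io copy loop. The class name, the 4 KB buffer size, and the use of in-memory streams in main are illustrative choices, not anything from the original question:

```java
import java.io.*;

public class ChunkedCopy {
    // Copy from in to out in fixed-size chunks, so only one small
    // buffer is ever held in memory regardless of the file's size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096]; // illustrative chunk size
        long total = 0;
        int n;
        while ((n = in.read(buffer)) > 0) { // 1. load one chunk
            out.write(buffer, 0, n);        // 2. write it
            total += n;                     // 3. repeat until EOF
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10_000]; // stand-in for a big payload
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // prints 10000
    }
}
```

With real files you would pass a FileInputStream and FileOutputStream (ideally inside try-with-resources so both are closed); memory use stays constant at one buffer's worth.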


1

If the files are really big, you may need to read/write them in chunks. If the files are small enough to fit in memory, you can instead increase the size of the virtual machine's heap.

For example:

java -Xmx512M ...


byte[] buffer = new byte[5000];
int n;
try (FileInputStream fi = new FileInputStream(infile);
     FileOutputStream fo = new FileOutputStream(outfile)) {
    while ((n = fi.read(buffer)) > 0)
        fo.write(buffer, 0, n); // write only the n bytes actually read
}

Hope this helps to get the idea.

1 Comment

I tried to add 1500m of memory but it failed; the file is about 300m. Any example of how to read and write it in chunks?
0

You can use the Spring Batch framework to read and write the file in chunks. http://static.springsource.org/spring-batch/

