
My requirement is to read a huge JSON response from a REST API call and process it. I googled a lot to find a feasible solution, and everyone says to use JsonReader by passing it an InputStream. I am doing the same, but I still encounter an OutOfMemoryError.

I am using the Gson library to get the data, but I am not sure how to achieve this; could you please guide me? With a small JSON response this works fine and fast, whereas with a huge response stream I get an OutOfMemoryError.

public void processHugeInputStream(InputStream is) throws IOException {
    BufferedReader br = new BufferedReader(new InputStreamReader(is));
    String line;
    List<Sample> list = new ArrayList<Sample>();
    while ((line = br.readLine()) != null) {
        JsonReader reader = new JsonReader(new StringReader(line));
        reader.setLenient(true);
        Sample object = gson.fromJson(reader, Sample.class);
        list.add(object);
    }
}
  • Here is my code snippet (now added to the question above). With a small JSON response this works fine and fast, whereas with a huge response stream I run into OOM. Commented Jun 28, 2020 at 7:54
  • Please show the code you are using to do this already; there may be a simple issue in how you are using JsonReader that’s resulting in the behaviour you see - but we can’t help if we don’t see what you’ve done so far. Commented Jun 28, 2020 at 7:54
  • Hi, thanks for the quick reply. I added the code to my post. I am trying all the possible scenarios, but I still get an out-of-memory exception. Commented Jun 28, 2020 at 7:58

2 Answers


It looks like you are reading the stream line by line, which is ok.

However, you are allocating a List and adding each parsed result to it, so a huge JSON input stream leaves you with a huge list of parsed objects. That accumulation is most likely what is causing the memory error.

If you can refactor the code to process each JSON object as it arrives, rather than adding them all to a list and processing them later, then you can run without increasing the memory size of the JVM.

One way you can make this generic is to have a Consumer<Sample> function that you pass into this method, and then you can accept each Sample as it comes in.
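A minimal sketch of that Consumer-based approach, assuming the response body is one top-level JSON array of objects (the Sample class and its "id" field here are placeholders for your real POJO):

```java
import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

class Sample {
    int id; // placeholder field; mirror your actual JSON structure
}

public class StreamingJsonProcessor {

    private final Gson gson = new Gson();

    // Reads the array element by element; only the Sample currently
    // being parsed is held in memory, so no list ever builds up.
    public void processHugeInputStream(InputStream is, Consumer<Sample> consumer)
            throws IOException {
        try (JsonReader reader = new JsonReader(
                new InputStreamReader(is, StandardCharsets.UTF_8))) {
            reader.beginArray();
            while (reader.hasNext()) {
                Sample sample = gson.fromJson(reader, Sample.class);
                consumer.accept(sample); // e.g. add to a JDBC batch here
            }
            reader.endArray();
        }
    }
}
```

Each parsed Sample could then, for example, be added to a JdbcTemplate batch and flushed every few thousand records, so neither the parsed objects nor the raw JSON text is ever fully resident in memory.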


9 Comments

Thanks a lot. Could you guide me with a small code snippet, please? And please clarify: is reading a huge response a regular use case? I expect to get 20 lakh records from this JSON. If you have any code snippet handy, that would really help me understand how to stream it properly.
The issue is really: what are you doing with the list of Sample when you have finished parsing the Json file?
My requirement is: I have JSON data of 20 lakh records and just need to parse it into POJOs; then I can insert them using the JdbcTemplate batch mechanism. I need your help to read the whole JSON into POJOs (I already created the POJO from the same JSON structure using a Spring plugin), in our case Sample.java. Kindly let me know if you need more info.
OK, so instead of doing {parse all} {process all}, see if you can change it to {parse one} {process one} inside your while loop above.
That while loop, I guess, will read line by line (I mean record by record from the JSON). Are we again ending up dumping all the JSON data into a String here? A bit confused, but I really appreciate your quick responses even though it's Sunday :)

Please increase your heap size in Java. Use this option with your java command: java -Xmx6g ... to increase the heap to 6 GB.

There are also more options to configure your memory usage:

  • Thread stack size: -Xss128m sets the thread stack size to 128 megabytes.
  • Young generation size: -Xmn256m sets the young generation size to 256 megabytes.

See the official JVM options documentation for the full list of these flags.
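For example, the flags above can be combined in one invocation (the values and the jar name here are purely illustrative; tune them to your machine and workload):

```shell
# Illustrative only: 6 GB heap, 128 MB thread stacks, 256 MB young generation.
# "your-app.jar" is a placeholder for your actual application jar.
java -Xmx6g -Xss128m -Xmn256m -jar your-app.jar
```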

1 Comment

Hi, thanks for the reply. I believe increasing the heap size is not an optimal solution. Even though I am using a BufferedReader, why is my whole JSON response being loaded into memory? Could you please point me in the right direction?
