We have a large CSV file with 2.5 million rows, each containing 10 fields. For every row we build a HashMap (column name -> value) and add that map to an ArrayList.
Because of the amount of data, this fails with an OutOfMemoryError: Java heap space.
However, my application needs the data as a list of HashMaps, and I don't want to increase the heap size.
reader = new CSVReader(new FileReader(dataFile), ',');

Map<String, String> feedMap = null;
String[] firstLine;
String[] nextLine;
String mappingKey = null;
String mappingValue = null;

// The first line holds the column headers
firstLine = reader.readNext();

// Read the remaining lines one at a time
while ((nextLine = reader.readNext()) != null) {
    int i = 0;
    feedMap = new HashMap<String, String>();
    for (String token : nextLine) {
        // xmlNodeMap translates a CSV header into the key to use in the row map
        mappingKey = xmlNodeMap.get(firstLine[i]);
        if (mappingKey != null) {
            mappingValue = token.trim().length() > 0 ? token : Constants.NO_VALUE;
            feedMap.put(mappingKey, mappingValue);
        }
        i++;
    }
    listOfMaps.add(feedMap);
}
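For a rough sense of scale, the sketch below estimates what 2.5 million rows of 10-entry HashMaps occupy. The 100-bytes-per-entry figure is only an assumption (entry object, table slot, and two small Strings; the real overhead depends on the JVM), but even a conservative guess lands well above a typical default heap size:

public class MemoryEstimate {
    public static void main(String[] args) {
        // Rough estimate only: per-entry cost is an assumed average covering the
        // HashMap.Entry object, the hash table slot, and two small String objects.
        long rows = 2500000L;
        long fieldsPerRow = 10L;
        long assumedBytesPerEntry = 100L;
        long estimatedBytes = rows * fieldsPerRow * assumedBytesPerEntry;
        // Prints roughly 2384 MB (about 2.3 GB) under these assumptions
        System.out.println("~" + (estimatedBytes / (1024 * 1024)) + " MB");
    }
}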