There is a global HashMap in my application that uses a custom object as its key. A third party calls this as an API: they create a new custom object and, if it is not already in the HashMap, they put it into the map. How do I manage the OutOfMemoryError in this case, as the HashMap keeps growing and increasing the JVM memory after deployment is tedious?
- The thing is, should you really be using a HashMap to store gigabytes of data? – NiVeR, Jun 27, 2018 at 7:12
- Following up on NiVeR, this is what you'd use a DB for. – DGK, Jun 27, 2018 at 7:13
- Adding to that: once you are using a DB, you may want to use a map as a cache. There are many caches available to do this; set up a good eviction policy. – gagan singh, Jun 27, 2018 at 7:15
- My question was more specific to the cache implementation in this case. Thanks a lot for pointing out the usage of a DB :) I should have asked more clearly about the cache implementation. – Abhishek, Jun 27, 2018 at 13:27
1 Answer
If this map can increase indefinitely, you need a different solution than just a HashMap in memory. Even if you keep increasing the JVM's limits, the actual hardware you're using is finite.
You could instead keep these keys in a database (or a file, for that matter) and query against it. To improve performance, you can keep the N most recently used keys in memory as a cache.
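As a rough sketch of that cache idea (the class name and capacity below are illustrative, not something from the question), Java's own LinkedHashMap can act as a bounded LRU cache: on a miss you load the key from the database, and the map evicts the least recently used entry once it exceeds the limit.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Minimal LRU cache sketch built on LinkedHashMap's access order.
    // Evicted keys would be re-read from the backing database on the next lookup.
    public class LruKeyCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;

        public LruKeyCache(int maxEntries) {
            // accessOrder = true: get() moves an entry to the most-recently-used position
            super(16, 0.75f, true);
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // Returning true tells LinkedHashMap to drop the least recently used entry
            return size() > maxEntries;
        }
    }

Note that LinkedHashMap is not thread-safe, so if the third party can call your API concurrently you would wrap it with Collections.synchronizedMap, or use a purpose-built cache library such as Guava's Cache or Caffeine, which also give you size- and time-based eviction out of the box.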