
Hi, I am trying the simple example from the official Stanford CoreNLP website: https://stanfordnlp.github.io/CoreNLP/api.html

TokensRegexNERAnnotator ner.fine.regexner: Read 585586 unique entries from 2 files
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.util.LinkedHashMap.newNode(Unknown Source)

I have tried all the solutions available online, but I am unable to resolve the issue. I tried increasing the memory size in the eclipse.ini file, and I also tried putting -Xms1024m in the run configuration arguments. I am working on my thesis tool, so please help me out; I am stuck. I am using Eclipse Oxygen and stanford-corenlp-3.9.0.
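For reference, what I am running is essentially the example from that page. A minimal sketch is below; the exact annotator list is my guess, pieced together from the log output (which mentions ner, parse and dcoref):

    import java.util.Properties;
    import edu.stanford.nlp.pipeline.Annotation;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;

    public class CoreNlpExample {
        public static void main(String[] args) {
            // Annotator list assumed from the log output above (ner, parse, dcoref all appear there)
            Properties props = new Properties();
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma,ner,parse,dcoref");

            // Building the pipeline loads the large models; this is where the OutOfMemoryError occurs
            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

            Annotation document = new Annotation("Stanford University is located in California.");
            pipeline.annotate(document);
        }
    }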

Thanks!

  • Try using even more memory first. 1 GB is not THAT much for NLP; NLP is rather memory-hungry. Commented Mar 5, 2018 at 16:17
  • The memory size in the eclipse.ini is the size Eclipse uses for its own code. When you run a program the memory size is specified in the 'Run Configuration' for the program. Commented Mar 5, 2018 at 16:22
  • OK, I am trying that now, please wait. Commented Mar 5, 2018 at 16:25
  • Oh no, after changing to -Xms2048m in the run configuration arguments it processes a few more lines, but then I still get the same error. Commented Mar 5, 2018 at 16:30
  • TokensRegexNERAnnotator ner.fine.regexner: Read 585586 unique entries from 2 files Adding annotator parse Loading parser from serialized file edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz ... done [0.7 sec]. Adding annotator dcoref Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded Commented Mar 5, 2018 at 16:30

2 Answers


Finally done by changing to -Xms3056m in the run configuration arguments. In other words, I needed more RAM, because NLP takes a lot of memory to execute and compute.
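For anyone who hits the same thing: the place to put this is the VM arguments field of the Run Configuration (Run Configurations > Arguments tab). As far as I understand it, -Xmx is the flag that raises the maximum heap (which is what an OutOfMemoryError is about), while -Xms only sets the initial size, so a setting along these lines should work:

    -Xms1024m -Xmx3g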


Eclipse is an IDE. It has nothing to do with your problem.

This is a JVM runtime issue.

You need to answer a few questions:

  1. Which version of the JDK are you running? JDK 8 eliminated PermGen and added Metaspace to the memory model.
  2. Are you running a 64-bit JVM?
  3. Have you profiled your app with VisualVM to see what the generations in memory are doing?

You can increase the maximum heap size beyond 1 GB with the -Xmx VM argument.
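A quick way to check the first two points, plus the heap limit the JVM is actually giving you, from inside a small program (just a sketch; the bitness property is HotSpot-specific):

    public class JvmInfo {
        public static void main(String[] args) {
            // Java version, e.g. 1.8.0_161 for JDK 8
            System.out.println("java.version = " + System.getProperty("java.version"));
            // "64" on a 64-bit HotSpot JVM (sun.arch.data.model is not guaranteed on other JVMs)
            System.out.println("data model   = " + System.getProperty("sun.arch.data.model"));
            // Effective maximum heap (the -Xmx value in force), in megabytes
            long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
            System.out.println("max heap     = " + maxHeapMb + " MB");
        }
    }

For watching memory behaviour over time, attach VisualVM to the running process instead.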

4 Comments

I think I have JDK 8 and a 64-bit JVM, but can you please tell me how to check the things you are asking about?
Do a Google search and read up on JDK 8 memory configuration: blog.sokolenko.me/2014/11/javavm-options-production.html
Finally done by changing to -Xms3056m in the run configuration arguments. It hangs my laptop, but it completes the task, hurray! Do you know any other solution though? My RAM is only 4 GB, and if I give 3 GB to the program my other work hangs.
You need more RAM. Stop everything else that is running on your laptop, including Eclipse. It's a big memory hog. Run outside of Eclipse.
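If handing the JVM 3 GB is not practical on a 4 GB machine, the other lever is to make the pipeline itself smaller. This is only a sketch, under two assumptions: that parse and dcoref are not actually needed for the thesis tool, and that this CoreNLP version honours the ner.applyFineGrained property (which skips the huge fine-grained regexner rules mentioned in the error):

    import java.util.Properties;
    import edu.stanford.nlp.pipeline.Annotation;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;

    public class LeanPipeline {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Drop parse and dcoref; their models account for a large share of the heap
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma,ner");
            // Assumption: this disables the fine-grained regexner rules
            // ("ner.fine.regexner" in the original error), cutting memory use further
            props.setProperty("ner.applyFineGrained", "false");

            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
            Annotation document = new Annotation("Stanford University is located in California.");
            pipeline.annotate(document);
        }
    }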
