While setting up Stanford CoreNLP, I have been running into the following exception for a long time:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.lang.StringBuilder.toString(StringBuilder.java:407)
    at java.io.ObjectInputStream$BlockDataInputStream.readUTFBody(ObjectInputStream.java:3388)
    at java.io.ObjectInputStream$BlockDataInputStream.readUTF(ObjectInputStream.java:3183)
    at java.io.ObjectInputStream.readString(ObjectInputStream.java:1863)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1526)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at java.util.HashMap.readObject(HashMap.java:1402)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2136)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at edu.stanford.nlp.io.IOUtils.readObjectFromURLOrClasspathOrFileSystem(IOUtils.java:310)
    at edu.stanford.nlp.coref.statistical.FeatureExtractor.loadVocabulary(FeatureExtractor.java:90)
    at edu.stanford.nlp.coref.statistical.FeatureExtractor.<init>(FeatureExtractor.java:75)
    at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:63)
    at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:44)
    at edu.stanford.nlp.coref.CorefAlgorithm.fromProps(CorefAlgorithm.java:30)
    at edu.stanford.nlp.coref.CorefSystem.<init>(CorefSystem.java:40)
    at edu.stanford.nlp.pipeline.CorefAnnotator.<init>(CorefAnnotator.java:69)
    at edu.stanford.nlp.pipeline.AnnotatorImplementations.coref(AnnotatorImplementations.java:218)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.lambda$getNamedAnnotators$17(StanfordCoreNLP.java:641)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP$$Lambda$27/1579572132.apply(Unknown Source)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP.lambda$null$33(StanfordCoreNLP.java:711)
    at edu.stanford.nlp.pipeline.StanfordCoreNLP$$Lambda$40/2104457164.get(Unknown Source)

System Details:

macOS
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)

The command that triggers the exception:

java edu.stanford.nlp.pipeline.StanfordCoreNLP -file input.txt

I think I have also set up the CLASSPATH correctly. Here is the output of echo $CLASSPATH:

:/Users/krishna/Downloads/NLP/stanford-corenlp-4.5.1/*
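
Presumably that was set by appending the CoreNLP directory to an initially empty variable (which would explain the leading colon), e.g. with something like this in the shell profile:

export CLASSPATH="$CLASSPATH:/Users/krishna/Downloads/NLP/stanford-corenlp-4.5.1/*"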

1 Answer

Fixed it by passing the JVM flag -Xms3056m, which sets the initial heap size to roughly 3 GB so CoreNLP has enough memory to load its models. (Strictly speaking, -Xms controls the initial heap and -Xmx the maximum heap, so adding something like -Xmx4g as well can help if the error persists.)

java -Xms3056m -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLP -file input.txt
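
For completeness, here is a minimal Java sketch of driving the same pipeline programmatically; the class name CorefDemo and the sample sentence are just illustrations, assuming the CoreNLP 4.x API. Note that the heap flags still go on the java command that launches it, since memory is a JVM setting rather than a CoreNLP one:

    import java.util.Properties;
    import edu.stanford.nlp.pipeline.CoreDocument;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;

    public class CorefDemo {
        public static void main(String[] args) {
            Properties props = new Properties();
            // coref is the annotator whose model load ran out of heap in the trace above
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma,ner,parse,coref");

            // All models are loaded here, so this is where the OutOfMemoryError would appear
            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

            CoreDocument doc = new CoreDocument("Stanford is in California. It is a university.");
            pipeline.annotate(doc);
            System.out.println(doc.corefChains());
        }
    }

Compile and run it from the CoreNLP directory with the same flag, e.g. javac -cp "*" CorefDemo.java followed by java -Xms3056m -cp ".:*" CorefDemo.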