
Hardware setup is 64× 64-bit CPUs and 380 GB of RAM.

Java/Lein settings are:

#export JVM_OPTS=-Xmx254g -Xss2g

Running a large parallelised Clojure inference algorithm I get

java.lang.OutOfMemoryError: GC overhead limit exceeded

Yet the maximum memory usage by the process is around 30 GB.

What settings do I need to change? I do not understand why the GC insists on trying to free up memory - there should be plenty to go around!

When limited to 10 GB of memory and 10 CPUs, the algorithm does not encounter this problem.

  • You need to post your GC log. Otherwise it's hard to tell how the memory was used etc. Just the basic logging would be fine (e.g. -verbose:gc -XX:+PrintGCDetails -Xloggc:gc.log). Commented Apr 16, 2016 at 14:51
  • Okay, will do - but it might take me a couple hours to run into the issue again! Commented Apr 16, 2016 at 14:58

1 Answer


I suspect

#export JVM_OPTS=-Xmx254g -Xss2g

is a comment, so the variable is never exported and the JVM falls back to its default heap size of about 32 GB (as you have 128+ GB of RAM). Try removing the #. Additionally, if export JVM_OPTS= appears anywhere after this line, it will override this setting.
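One way to check is to ask the JVM what heap size it actually resolved. A minimal sketch (assumes a HotSpot JVM, where `-XX:+PrintFlagsFinal` prints the final value of `MaxHeapSize` in bytes):

```shell
# Remove the leading '#' so the variable is actually set and exported:
export JVM_OPTS="-Xmx254g -Xss2g"

# Verify the JVM picked up the setting; MaxHeapSize is reported in bytes:
java $JVM_OPTS -XX:+PrintFlagsFinal -version | grep -i MaxHeapSize
```

Running the same `grep` without `$JVM_OPTS` shows the ergonomic default, which is what your process was using while the line was commented out.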


2 Comments

Should I delete the question? It likely won't help anybody - I had #PBS directives above this and accidentally typed that hash in.
@Oxonon it's up to you. It might be useful to know what happens when no -Xmx has been set.
