I'm having the following issue with HBase.
I have a script that starts the HBase shell and inserts many rows into a table with a single column. I have tried inserting 10,000 rows, but after about 1,700 I get the dreaded "java.lang.OutOfMemoryError: unable to create new native thread" error. I have tried increasing the Java heap size from the 1000 MB default to 1800 MB, but this doesn't let me insert any more than the ~1,700 rows.
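Roughly, the script is just a loop that pipes `put` commands into a single shell session (the table name and column here are stand-ins, not my real schema):

```shell
#!/bin/sh
# Generate N put commands for the HBase shell.
# 'mytable' and 'cf:col' are placeholder names.
gen_puts() {
    i=1
    while [ "$i" -le "$1" ]; do
        echo "put 'mytable', 'row$i', 'cf:col', 'value$i'"
        i=$((i + 1))
    done
}

# Feed all the generated commands into one HBase shell session:
# gen_puts 10000 | hbase shell
```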
However, I've noticed that I can insert 1,000 rows, exit the shell, restart it, insert 1,000 more into the same table, exit again, and so on. I don't understand enough about the JVM to see why it allows this across several sessions but won't let me batch insert the same total in a single session.
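Since the error is about creating native threads rather than exhausting the heap, I assume watching the shell JVM's thread count during the run would show whether something is accumulating per insert. Something like this, where the `pgrep` pattern is only a guess at how the shell process shows up:

```shell
# Print the number of native threads (NLWP column) for a given PID.
count_threads() {
    ps -o nlwp= -p "$1" | tr -d ' '
}

# Watch the HBase shell's thread count once per second while inserting,
# e.g.:
# watch -n1 "ps -o nlwp= -p \$(pgrep -f 'org.jruby.Main.*hbase')"
```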
Can someone please explain to me what is going on here, and what I might do about it?
EDIT:
I am now using a 64-bit machine running Red Hat Linux 5 with Java 1.6. I'm giving HBase a heap size of 20 GB (the machine has ~32 GB of memory total) and a stack size of 8 MB. The default stack size on 64-bit is 2 MB, I believe; with 2 MB I got this same error, and increasing it to 8 MB did not help at all (I was able to insert only the same ~1,700 rows regardless of stack size).
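For what it's worth, the rough stack-reservation arithmetic (assuming roughly one native thread per row, which would match the ~1,700-row ceiling) looks like this:

```shell
# Total virtual memory reserved for thread stacks, in whole GB.
stack_gb() {
    # $1 = thread count, $2 = per-thread stack size in MB
    echo $(( $1 * $2 / 1024 ))
}

stack_gb 1700 2   # about 3.3 GB reserved with 2 MB stacks
stack_gb 1700 8   # about 13.3 GB reserved with 8 MB stacks
# Hitting the same ~1700 ceiling at both stack sizes suggests a
# thread-count limit rather than an address-space or memory limit.
```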
I have read that decreasing the heap size could make this error go away, but that did not help either. Below are the JVM options I'm setting (everything is default except for the stack size).
HBASE_OPTS="$HBASE_OPTS -ea -Xss8M -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode"
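Since the error concerns native threads rather than heap, I'm also listing the OS-level limits I know of that cap thread creation on Linux, in case they matter (a diagnostic sketch, not output from my box):

```shell
# Per-user limit on processes/threads; native thread creation counts
# against this, independent of heap size.
ulimit -u

# System-wide ceiling on threads (Linux only).
if [ -r /proc/sys/kernel/threads-max ]; then
    cat /proc/sys/kernel/threads-max
fi
```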