This kind of exception is often misleading, as its actual cause can be quite different from what the message suggests.
The first thing to try from the checklist is reducing the stack size. The JVM has an interesting implementation here, the design of which I don’t completely understand, but the implication is that the more memory is reserved for the heap (not necessarily used by the heap), the less memory is left over for thread stacks. Since each thread needs its own stack, in practice more “memory” in the heap sense (which is usually what people mean) results in fewer threads being able to run concurrently.
To reduce the stack size, add “-Xss64k” to the JVM options. I suggest you start with 64k, try the application, and if it doesn’t work (it will fail with a java.lang.StackOverflowError), increase the stack to 128k, then 256k, and so on. The default stack size depends on the platform and JVM (commonly 512k–1024k on 64-bit HotSpot), so there’s a wide range to test.
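As a concrete illustration, the trial-and-error loop above could look like the following command lines (the jar name is hypothetical; adapt the flags to however you launch your application):

```shell
# Hypothetical application jar; start with a small stack and grow it on failure.
java -Xss64k -jar myapp.jar    # if this dies with java.lang.StackOverflowError...
java -Xss128k -jar myapp.jar   # ...double the stack and try again
java -Xss256k -jar myapp.jar
```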
At the operating system level, Linux users can control the amount of resources (in particular, file descriptors) a process may use, either in the limits.conf file or with the ulimit command:
# vi /etc/security/limits.conf
testuser soft nofile 4096
testuser hard nofile 10240
# ulimit -Hn
In either case, if the number of file descriptors you are allowed to use is less than the number of threads you are going to create, this could be the issue.
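To check where you stand, you can compare the soft and hard limits with the descriptors a process actually has open. This is a Linux-only sketch using /proc; $$ here is the current shell’s PID, standing in for your application’s:

```shell
ulimit -Sn                  # soft limit on open files for this session
ulimit -Hn                  # hard limit (the ceiling a non-root user may raise to)
ls /proc/$$/fd | wc -l      # file descriptors currently open by this shell
```

Run the same `ls /proc/<pid>/fd | wc -l` check against your JVM’s PID to see how close it is to the limit.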
One more thing you could finally investigate is the maximum number of processes per user:
# ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 515005
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 4096
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 1024
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
If the “max user processes” value is lower than the number of threads you need (on Linux each thread counts against this limit), raise it:
# ulimit -u 4096
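Since this limit counts threads and not just full processes, compare it against the user’s total lightweight-process (LWP) count rather than a plain process list; a quick sketch:

```shell
ulimit -u                                   # current max user processes
ps -L -u "$(id -un)" --no-headers | wc -l   # threads (LWPs) the user is running now
```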