"OSError: [Errno 23] Too many open files in system" when importing functions on all engines


April 2019


I am trying to process a list of objects in parallel on a cluster using ipyparallel, but I am getting an error saying too many files are open. I am using a Jupyter notebook and can start 230 engines on the cluster. When I try to import the functions on the engines, I get the following error:

`OSError: [Errno 23] Too many open files in system:`
The common solution found on Google is to increase the resource limit, e.g. `ulimit -Su 20000`.

`ulimit -a` gives me:


`core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 515236
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 65536
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) unlimited
cpu time               (seconds, -t) unlimited
max user processes              (-u) 4096
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited`
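
For reference, here is a minimal sketch (assuming the engines run CPython on a Linux node with the standard `resource` module) of how I check the open-file limit that each engine process actually sees:

`import ipyparallel as ipp

def open_file_limit():
    # soft and hard RLIMIT_NOFILE as seen by the calling process
    import resource
    return resource.getrlimit(resource.RLIMIT_NOFILE)

rc = ipp.Client()
print(open_file_limit())                      # limit on the notebook node
print(rc[:].apply_sync(open_file_limit)[:5])  # limits on the first few engines`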

Here is the code

`import ipyparallel as ipp

rc = ipp.Client()
dview = rc[:]                      # direct view on all engines
lbview = rc.load_balanced_view()

with dview.sync_imports():         # run the import on every engine
    from foo import bar`

The code works when I decrease the number of engines to ~130. I also tried loading the functions on half of the engines first and then on the other half (roughly as in the sketch below), but the error still occurs while importing on the second half. It seems like files remain open on the node the Jupyter notebook is running on.
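
Roughly what that two-stage attempt looks like (a sketch, shown here with blocking `sync_imports` for simplicity; the exact split in my run may have differed slightly):

`half = len(rc.ids) // 2
first_half, second_half = rc[:half], rc[half:]

with first_half.sync_imports():    # this half imports fine
    from foo import bar

with second_half.sync_imports():   # the OSError is raised here
    from foo import bar`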

BND

0 answers