Running multiple headless instances in parallel

I have 100+ files and want to run segmentation on all of them at once in headless mode. I could open a new terminal and run headless per file (or per group of files), but I am wondering whether there is a way to run all files at the same time (loop over the files and submit background jobs). I have tried running multiple headless instances in parallel in the same bash terminal using "&", but it throws this error message:
fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5f.pyx", line 78, in h5py.h5f.open
OSError: Unable to open file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')
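The loop I'm using is roughly the following sketch (the command and file names are placeholders; `echo` stands in for the actual headless call so the sketch is self-contained):

```shell
# Hypothetical stand-in for the real headless segmentation command.
run_headless() {
    echo "segmenting $1"
}

# Launch one background instance per file with "&", then wait for all.
for f in file1.h5 file2.h5 file3.h5; do
    run_headless "$f" &
done
wait    # block until every background job has exited
```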

Thoughts?

Looks like the model file gets locked by the first instance. It works if I make a temporary copy of the model file for each headless instance and run them with "&".

Oh, I thought this had been answered already. You'll have to add the --readonly command-line flag. The error occurs because h5py allows only one instance with write access to a file.

I didn't know about the --readonly flag. I don't think it's in the headless mode operation instructions. Thanks.