HDF5 flush problem



Hi all,

I’m trying to run CP on a Linux workstation. CP seems to load fine and the pipeline works in test mode, but when I went to analyze, after the first image set I get the following error:

Traceback (most recent call last):
File "/home/johnfuller/CellProfiler/cellprofiler/analysis_worker.py", line 422, in do_job
File "/home/johnfuller/CellProfiler/cellprofiler/pipeline.py", line 1969, in run_image_set
File "/home/johnfuller/CellProfiler/cellprofiler/measurements.py", line 384, in flush
File "/home/johnfuller/CellProfiler/cellprofiler/utilities/hdf5_dict.py", line 343, in flush
File "/usr/lib/python2.7/dist-packages/h5py/_hl/files.py", line 167, in flush
File "h5f.pyx", line 105, in h5py.h5f.flush (h5py/h5f.c:1876)
RuntimeError: unable to flush file's cached information (File accessability: Unable to flush data from cache)

Originally I tried writing to a database, and then tried writing to a batch file, but I still got the same error. Thanks!



Hi John,
I’m guessing a bit here, but I’m wondering whether you have the proper access privileges for the temp folder designated by CP. If you look under File > Preferences, you’ll see a preference called “Temporary folder”. Can you confirm that you can read from and write to that folder? Also, what version of CP are you using?
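A quick way to check this from Python is sketched below. The path used here is just the system default temp directory; substitute whatever the “Temporary folder” preference actually shows in your CP install.

```python
import os
import tempfile

# Assumption: substitute the folder shown under
# File > Preferences > "Temporary folder" in CellProfiler.
temp_folder = tempfile.gettempdir()

# os.access checks the current user's permissions on the directory.
readable = os.access(temp_folder, os.R_OK)
writable = os.access(temp_folder, os.W_OK)
print("readable:", readable, "writable:", writable)

# A stricter test: actually try to create and remove a file there.
try:
    with tempfile.NamedTemporaryFile(dir=temp_folder) as f:
        f.write(b"test")
    print("write test passed")
except OSError as e:
    print("write test failed:", e)
```

If either check fails, the folder's owner or permissions need fixing (or point the preference at a folder you do own).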


I am facing the same issue. I checked the temporary folder and I can see that I don’t have read/write permissions; what should I do now?
I have been using CellProfiler for more than a month, and this is the first time I have tried the batch processing module, processing around 2000 images. It worked fine until about 800 images, then it crashed and gave me an error. I am uploading the error from my terminal in the attached file; kindly have a look and tell me what I should do. I need urgent help, please.


cellprofiler error.txt (194.9 KB)


It looks from the error message you posted like you ran out of space in your temporary directory. You either need to process in smaller batches, clear out some disk space in the temporary directory (or use a larger disk), or, if you’re using ExportToSpreadsheet, try ExportToDatabase, which uses the same amount of storage space but not in the temporary directory.
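You can check how much free space is left before starting a batch. A minimal sketch using Python 3's standard library (the path is an assumption; use your configured temp folder):

```python
import shutil
import tempfile

# Assumption: the system temp dir; substitute CP's "Temporary folder".
temp_folder = tempfile.gettempdir()

# shutil.disk_usage reports total/used/free bytes for the filesystem
# that contains the given path.
usage = shutil.disk_usage(temp_folder)
free_gb = usage.free / (1024 ** 3)
print("free space in %s: %.1f GB" % (temp_folder, free_gb))

# Rough rule of thumb: if free space is low, process in smaller
# batches or point the temporary folder at a larger disk.
if free_gb < 5:
    print("warning: less than 5 GB free; consider a larger disk")
```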


Okay, I will try out your suggestions. One more issue I am facing: when I run images in small batches, the spreadsheets get overwritten, and in the end I get only a single spreadsheet for the final batch; for the rest of the batches I cannot see any spreadsheet. I am using the -o option to give the output folder destination, but it does not seem to be working.
Kindly give me some suggestions on this as soon as possible.
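One way to avoid the overwriting is to give every batch its own output folder. The sketch below only builds the command lines rather than running them; the pipeline filename, batch size, and folder names are hypothetical, and you should check `cellprofiler --help` for the exact flags your version supports (here I assume the usual `-p` pipeline, `-o` output folder, and `-f`/`-l` first/last image set options).

```python
import os

pipeline = "Batch_data.h5"   # hypothetical batch file from CreateBatchFiles
batch_size = 200             # image sets per batch (assumption)
total_sets = 2000

commands = []
for first in range(1, total_sets + 1, batch_size):
    last = min(first + batch_size - 1, total_sets)
    # A separate output folder per batch, so spreadsheets are not overwritten.
    out_dir = os.path.join("output", "batch_%04d_%04d" % (first, last))
    commands.append(
        "cellprofiler -c -r -p %s -o %s -f %d -l %d"
        % (pipeline, out_dir, first, last)
    )

for cmd in commands:
    print(cmd)
```

Each batch then writes its spreadsheets into its own `output/batch_NNNN_NNNN` folder, and nothing is clobbered between runs.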



Are you giving the same output destination for each batch, or separate ones? If the latter, can you report the commands you used so we can attempt to debug it?