Limiting memory consumption with large image sets

Hi all,
I’m processing a very large set (tens of thousands) of low-resolution images (175x175 px) and I keep running into memory limitations. (I’m currently restricted to processing on a 32-bit Windows system, but should hopefully be moving to a 64-bit system in a few months.) I already have the pipeline dump each image from memory after processing it, but somewhere around the 6000th image the measurements table seems to become too large and brings down the pipeline. I’m wondering whether it’s possible to write each image’s measurements out to file on every pipeline iteration, instead of aggregating all the measurements in memory for one large write at the end.
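Roughly, what I have in mind is something like the sketch below (not actual CellProfiler code, just an illustration of the idea; measure_image and the column names are made-up stand-ins for the per-image measurement step):

```python
# Sketch only: flush each image's measurements to disk as soon as they are
# computed, so nothing accumulates in memory between iterations.
# measure_image and COLUMNS are hypothetical placeholders, not CellProfiler API.
import csv
import os

COLUMNS = ["image_name", "object_count", "mean_intensity"]  # made-up example columns

def append_measurements(csv_path, row):
    """Append one image's measurements to a CSV, writing the header only once."""
    write_header = not os.path.exists(csv_path)
    with open(csv_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

def process_images(image_paths, measure_image, csv_path="measurements.csv"):
    """measure_image stands in for the per-image measurement step and should
    return a dict keyed by COLUMNS."""
    for image_path in image_paths:
        append_measurements(csv_path, measure_image(image_path))
        # the row is on disk now; nothing is retained for a final bulk write
```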
Thanks

Sorry it’s taken so long to get back to you on this. I’ve added this to our list of features to consider adding in the future.
-Mark

Hi,

We have recently made fairly extensive changes to the way measurements are handled, namely by using a Python interface to the HDF5 file format to create temporary file-backed storage for them.
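Just to give a rough picture of the approach (this is only an illustrative sketch using h5py as the Python HDF5 interface; the file name, dataset name, and layout are assumptions, not our actual implementation), per-image measurements can be appended to a resizable on-disk dataset rather than held in RAM:

```python
# Illustrative sketch, not the real CellProfiler code: store one row of
# measurements per image in a resizable, chunked HDF5 dataset on disk.
import h5py
import numpy as np

def open_measurement_store(path, n_features):
    """Create an HDF5 file with a resizable dataset, one row per image."""
    f = h5py.File(path, "w")
    f.create_dataset(
        "measurements",                 # assumed dataset name
        shape=(0, n_features),          # start empty
        maxshape=(None, n_features),    # unlimited number of images
        dtype="float64",
        chunks=(256, n_features),       # chunked so appends stay cheap
    )
    return f

def append_image_measurements(f, values):
    """Append one image's measurement vector to the on-disk dataset."""
    ds = f["measurements"]
    new_n = ds.shape[0] + 1
    ds.resize(new_n, axis=0)
    ds[new_n - 1, :] = np.asarray(values, dtype="float64")

# Usage sketch with placeholder values:
store = open_measurement_store("measurements.h5", n_features=3)
append_image_measurements(store, [1.0, 2.0, 3.0])
store.close()
```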

These changes will be included in the next release, but if you’re daring, you can try them out with our latest public build from source code here. I should note that while the changes are essentially complete, we have not fully assessed their stability, so please keep that in mind and take note of the caveats mentioned on the linked page. Let us know if it works for you.

Regards,
-Mark