Memory error due to cache build-up?

Hi,

I keep running into a memory error during analysis of large image datasets.

I use imaging mass cytometry, which generates large .mcd files that I convert into .tiffs (a set of 39 TIFFs) with a plugin in Fiji. These .tiff files are then used in a CellProfiler pipeline for cell segmentation and extraction of single-cell data using the cell mask.
My pipeline has been automated to process files straight from the machine as .mcd files and produce a cell mask output and a .csv. The pipeline runs fine with .mcd files under 6.24 GB (<43.5 MB per .tiff), but with a 7.28 GB .mcd (101 MB per .tiff) the pipeline gets stuck at IdentifySecondaryObjects (one of the last few modules) with a memory error.
Is it possible that the pipeline is caching data from previous modules, and is there a way to clear this cached memory?
I read that there used to be a ConserveMemory module, but I cannot find it in CellProfiler 3.1.8.

It would be great if anyone has any advice about this!

Thanks,
Karishma

Hi Karishma,

You can do a couple of things, but it depends on what the memory issue is:

1- Limit the number of worker processes (i.e. if your desktop has 4 cores and 16 GB RAM, running fewer workers leaves more memory available to each one; see the sketch after this list)
2- Increase the default Java memory that CellProfiler is allowed to use
3- Change the default temp directory location
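
For point 1, a rough back-of-the-envelope calculation helps decide how many workers to allow. The numbers below (39 channels, 101 MB per TIFF, a 4x overhead factor for intermediate images held between modules, and a 16 GB desktop) are assumptions taken from the figures in this thread, not values measured from CellProfiler; treat it as a minimal sketch only.

```python
import os

# Rough per-worker memory estimate for one CellProfiler image set.
# All figures are assumptions from the post, not measured values.
N_CHANNELS = 39      # one TIFF per channel
TIFF_MB = 101        # size of each TIFF in the large dataset
OVERHEAD = 4         # guessed factor for intermediate images/masks kept in RAM
TOTAL_RAM_GB = 16    # example desktop from point 1

per_worker_gb = N_CHANNELS * TIFF_MB * OVERHEAD / 1024
print(f"Estimated memory per worker: ~{per_worker_gb:.1f} GB")

# Leave a couple of GB for the OS and for CellProfiler's Java process.
usable_gb = TOTAL_RAM_GB - 2
max_workers = max(1, int(usable_gb // per_worker_gb))
print(f"Suggested worker count: {min(max_workers, os.cpu_count() or 1)}")
```

With these numbers a single image set is already close to 4 GB of raw pixel data, and the estimate suggests only one worker fits in 16 GB, which would be consistent with the pipeline failing only on the 7.28 GB dataset. If one worker still runs out of memory, points 2 and 3 are the next things to look at.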

Best
Lee