Thresholding in batch mode in ImageJ

CCViva1-F25-NR(500ml).zip (10.5 MB) Results(37,175).csv (25.9 KB)

Hi,

I have realised that when I run "Analyze Particles" on individual images, I normally get different results than when I analyse a set of images with a macro in batch mode, even using the same analysis conditions. One of the problems I have noticed, and I think the main one, is that the most adequate threshold values may differ for each image, even if they were captured at the same exposure time. As an example, I attach a set of pictures from the same sample (a mixture of organic particles and microplastics dyed with Nile Red), as well as the Results table I got. Before running the code, I always check the threshold on a few pictures to obtain the most adequate values. For this set of samples, threshold limits of 50 and 175 seemed to work well for most images, but in the Results table I realized that, for example, the biggest particle in picture 7 was not analyzed as one particle but as a set of smaller ones. I then changed the minimum threshold value to 37, as a compromise between "filling" that particle and not increasing the noise. I hoped that the "Include holes" option would fill the hole left in the particle, but the results still show a bunch of small particles. Here is my code:

run("Close All");
run("Clear Results");
roiManager("Reset");

//Set input folder
input = "/my folder/";

//Set output folder
output = "/my folder/Results/";

//Set batch mode
setBatchMode(true);
list = getFileList(input);
for (i = 0; i < list.length; i++)
	action(input, output, list[i]);
setBatchMode(false);

function action(input, output, filename) {
	open(input + filename);
	run("Set Scale...", "distance=0.3875 known=1 pixel=1 unit=µm global");
	// distance = pixels/µm
	run("Subtract Background...", "rolling=1500");
	run("8-bit");
	//run("Threshold...");
	setThreshold(37, 175);
	setOption("BlackBackground", false);
	run("Set Measurements...", "area mean perimeter fit shape stack limit display redirect=None decimal=3");
	run("Convert to Mask");
	run("Analyze Particles...", "size=500-Infinity show=[Masks] display exclude include add");
}

//Save results as a table
saveAs("Results", output + "Results.csv");

Unless I am doing something else wrong: (1) I wonder whether it is really a good idea to use batch mode when your images' optimal thresholds may differ quite a bit. Might it be better to analyse just some of the images individually and extrapolate the results, or to remove the pictures with very different optimal threshold values (e.g. image 7 in my case)? (2) I have also noticed that when I use batch mode, the "Mean" values tend to be close to 255, while this does not happen when I apply the same analytical conditions to individual images. Might there be a reason for that?

Any help will be very appreciated. Thank you very much!

I would strongly recommend analyzing everything in batch, using settings that are as consistent as possible across your dataset. Otherwise you negate the advantages of quantitative image analysis, such as speed and unbiased measurements.

That the "Mean" values are 255 indicates that you are measuring on the binary mask and not on the measurement channel. That is a bug in the macro. Please specify the image you want to use for the measurement under Analyze > Set Measurements, and use the "Redirect to" option to point to the channel you want to use for the intensity measurement.
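As a minimal sketch of how that could look in the macro (the window title "orig", the duplicate step, and the rolling ball radius of 50 are my own assumptions for illustration, not part of your macro):

```ijm
open(input + filename);
rename("orig");                     // keep the unmodified intensities under a known title
run("Duplicate...", "title=work");  // do all processing on a copy
run("Subtract Background...", "rolling=50"); // assumed example radius
run("8-bit");
setAutoThreshold("Default");
run("Convert to Mask");
// redirect the intensity measurements to the untouched original:
run("Set Measurements...", "area mean perimeter fit shape limit display redirect=orig decimal=3");
run("Analyze Particles...", "size=500-Infinity display exclude include add");
```

With the redirect set, "Mean" is taken from the grayscale values of "orig" inside each particle outline, rather than from the 0/255 mask.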

For the thresholding problems, have you tried automatic thresholds?
Go to Image > Adjust > Threshold… and select an automatic intensity thresholding method.
These methods pick the threshold value from each image's histogram, which can overcome the problem of varying intensities between images.
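In a macro this is a single call; "Otsu" below is just one example method I picked for illustration:

```ijm
// Let the method derive the threshold from the histogram of each image;
// "dark" means the background is darker than the objects.
setAutoThreshold("Otsu dark");
```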

For getting a better segmentation I would also apply a filter before the background subtraction. This smooths out the signal and suppresses noise, but can also be used to smooth out inhomogeneities within your objects.
You could use a Median filter if you want to preserve the edges of your objects: Process > Filters > Median…
NOTE: After applying a non-linear filter like the median filter, do not use the filtered image for measurements!
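A sketch of that step (radius=2 is an assumed example value, and the duplicate is there so the unfiltered original stays available for measurements):

```ijm
run("Duplicate...", "title=filtered"); // keep the original for intensity measurements
run("Median...", "radius=2");          // edge-preserving noise suppression; assumed radius
```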

I would also use a sensible radius for the background subtraction. You have currently set a radius of 1500 pixels. The rolling ball radius should be about as large as your largest object; with a much larger radius the background subtraction is not effective. This matters especially since your images are rich in background and, at first glance, also have uneven illumination, which will also be a problem for the intensity thresholding.
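As a sketch, if your largest particles were on the order of 100 pixels across (an assumption on my part; measure this in your own images), something like:

```ijm
// rolling ball radius roughly matching the largest object, in pixels
run("Subtract Background...", "rolling=100");
```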

Mask of CCViva1-25-500ml_x2.5_20ms_field9(29,175).tif (1.4 MB) Mask of CCViva1-25-500ml_x2.5_20ms_field9(autothreshold).tif (1.4 MB)

Many thanks for your response and all the advice, Schmied.

I have modified the code and set the default auto-threshold, but the results are much worse now. As you can see in the mask of image 9, a big black spot now appears, compared to the clearly separated particles in the mask for the threshold values 29, 175 (both masks are attached). As a consequence, I get a much higher number of unwanted particles in my results table. I have tested a similar code on different samples and the same thing happens. Maybe I am not using the correct code for the automatic threshold? This is the part of the code I have modified:

function action(input, output, filename) {
	open(input + filename);
	run("Set Scale...", "distance=0.3875 known=1 pixel=1 unit=µm global");
	run("Subtract Background...", "rolling=775");
	run("8-bit");
	setAutoThreshold("Default");
	setOption("BlackBackground", false);
	run("Set Measurements...", "area mean perimeter fit shape stack limit display redirect=filename decimal=3");
	run("Convert to Mask");
	run("Analyze Particles...", "size=500-Infinity show=[Masks] display exclude include add");
}

Again, thanks very much!

If you use the drop-down menu you can select from different auto-thresholding methods. This lets you pick a thresholding method that better segments your objects.
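To compare the methods quickly, the Auto Threshold plugin has a "Try all" option that renders a montage of every available method applied to the current image, so you can see at a glance which one separates your particles best:

```ijm
// Preview all available auto-threshold methods side by side
run("Auto Threshold", "method=[Try all] white");
```

Once you find a method that works, use its name in `setAutoThreshold()` in the batch macro.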