Why is the Analyze Particles total area not consistent with the color-thresholded area in the image?

I have the following two images:

Image1: patient-01-005-minus4wks

Image2: patient-01-005-Day0

What I want to do is detect the redness area and quantify it.
For that I use the following steps:

run("Duplicate...", " ");
run("HSB Stack");
setThreshold(131, 255);
run("Analyze Particles...", "display exclude include summarize stack");

The detected redness area looks like this:

Detected area image: patient-01-005-minus4wks

Detected area image: patient-01-005-Day0

And the stats look like this:

As you can see in the stats, the total area in the minus4wks data is much larger than in the Day0 data: about 12 times larger for the Hue slice and 4.6 times larger for the Saturation slice.
But when we look at the detected area by eye, the Day0 area clearly covers at least as much area as, if not more than, the minus4wks area.

What is the cause of the discrepancy and how can I resolve that?

You are only looking at one of three slices. If you include the scrollbar in your screenshot, you should see that you can change between slices. I am not sure why you want to do what you are doing, but you probably want to look for another way to quantify the area, since you are treating the hue, saturation, and brightness as different channels and getting the area in each. Essentially, you are treating it like a fluorescence image, with the channels independent of each other.
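As a rough sketch (the slice index and threshold here are assumptions carried over from your posted values, not a recommendation), measuring the Hue slice on its own would look something like this:

run("Duplicate...", " ");
run("HSB Stack");
setSlice(1);                       // slice 1 = Hue, 2 = Saturation, 3 = Brightness
run("Duplicate...", "title=Hue");  // pull out only the current (Hue) slice
setThreshold(131, 255);            // placeholder range; the "red" hue band needs tuning per image
run("Analyze Particles...", "display summarize");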

*I realize this doesn’t answer the question, but it looks like there may be more important issues with your method to resolve first, which might end up solving your problem.

**I take that back, somewhat. It looks like you have “exclude” selected, which is going to eliminate the majority of the measurement for the Day0 patient, since it is all one big connected piece that touches the edge.
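For example (a sketch that keeps the rest of the posted macro unchanged), dropping "exclude" from the options string keeps particles that touch the image edge:

run("Analyze Particles...", "display include summarize stack");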

One last thought: these images don't seem to have been taken under exactly the same conditions, since one is fuzzier/more out of focus than the other. That means the camera distance was likely not the same for each, and if not, the area measurements do not mean the same thing in each image. You can imagine that if you moved back a foot, the area would magically be smaller!

Even if the distance is approximately the same, being slightly out of focus will change the hue, saturation and brightness of each pixel, making the thresholds different for each image.

Dear @Peverall_Dubois
could you please post the screenshots with the settings you are using of:

  • Analyse Particles
  • Set Measurements

Maybe it is one of the options there.

Ciao,
emanuele

Hi Emanuele,

Here are the screenshots of those:

P.D.

In addition to the "exclude" already mentioned, based on your image, you probably don't want "include" either. That will count healthy tissue that is surrounded by unhealthy tissue. I would recommend "show"ing the results to yourself, at least, so that you can see what you are measuring; "Masks" would probably work.
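A hedged variant of that call, with "exclude" and "include" both dropped and the masks shown so you can check what is actually being measured, might be:

run("Analyze Particles...", "display summarize show=Masks stack");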

I really think the problem you are having is related to “include holes” in Analyze Particles menu.

Check this example:


where I created a donut with a value of 255 outside the hole and 0 inside the hole.

If you check "include holes", the holes in your image will be counted as part of each particle.
So the area will increase, because the hole inside is included, and the mean intensity will decrease, since there is a hole inside with a value of 0.
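Something like this small macro (a sketch, not your exact donut, but it reproduces the effect) shows the difference:

newImage("Donut", "8-bit black", 200, 200, 1);
setForegroundColor(255, 255, 255);
makeOval(50, 50, 100, 100);        // outer disc, value 255
run("Fill", "slice");
setForegroundColor(0, 0, 0);
makeOval(80, 80, 40, 40);          // inner hole, value 0
run("Fill", "slice");
run("Select None");
setThreshold(128, 255);
run("Analyze Particles...", "display summarize");           // without "include": ring area only
run("Analyze Particles...", "display include summarize");   // with "include holes": larger area, lower mean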

I never understood the reason for having that "include holes" option there, but that is what it does, and I think this is what is impacting your measurements.

Removing "include" would decrease their areas, but their area measurement in one image was below 1%. The dominant factor still looks like the "exclude" in their code.


Yes @Research_Associate, I didn't see it checked… but I fully agree with you: "exclude on edges" will remove wide regions of interest in their images.
