CellProfiler Granularity Measurements

Hello @karhohs
My team has the same question about what these Granularity features mean. I see a lot of similar questions on the forum, but I couldn't find an answer.

What is the “range of granular spectrum”, and how does it relate to the “radius of the structuring element”? In our task CP gives amazing results, especially the Granularity features, but we still can't understand how GS1, GS2, etc. actually relate to physical sizes.
Does GS1 say something about small dots, GS6 about medium-sized dots, and GS16 about larger speckles? If so, what is the physical range of the spectrum?
Is this range of the granularity spectrum universal for all images in a dataset, or is it unique to each image?

The papers recommended for this module in the manual also couldn't help us understand. Probably the main article on this topic is Ravkin & Temov, “Bit representation techniques and image processing” (1988, in Russian), which we couldn't find anywhere online. I even asked the author himself, and I also asked the journal, but neither answered.

Our last hope is to find an answer on the forum…

Thank you very much in advance!

The first thing I would say is that you should definitely check out the slides linked in the documentation for a biological example:

The recommended numerical value cannot be determined in advance; an analysis as in this reference may be required before running the whole set. See this pdf, slides 27-31 and 49-50.

In general, MeasureGranularity works like this:

  1. It downsamples the image if you tell it to (i.e., it might make a 512x512 image 256x256 if you tell it to downsample by 0.5).

  2. It does some background subtraction, removing anything LARGER than the radius you give it, in pixels (i.e., if you tell it 20, anything larger than 20 pixels in the downsampled image, which was 40 pixels in your full-size image, is treated as background).

  3. For as many times as you tell it, it (and I'm simplifying here) gets rid of bright areas that are only 1 pixel across, reports how much signal was lost by doing that, then repeats. I.e., if you tell it to do that 10 times, the first time it will shrink every bright area in the image by one pixel, deleting those that are only 1 pixel in size (aka 2 pixels in your original image), and then tell you what % of the signal was lost. It then takes the first-iteration image and does it again: now everything that was 2 pixels across in the downsized image (4 pixels across in your original image) is gone, and it tells you what % of the signal was lost between the first and second iterations. The third time, you'll lose all the signal from structures that were originally 6 pixels across in your image, and it tells you what % of the signal was lost between the second and third iterations. This continues for as many iterations as you want.
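The three steps above can be sketched in code. This is NOT CellProfiler's actual implementation, just a minimal illustration of the same idea using numpy/scipy: estimate and subtract the background with a grayscale opening, then repeatedly erode, reconstruct, and record the percentage of total signal lost at each iteration.

```python
import numpy as np
from scipy import ndimage

def disk_footprint(radius):
    # Boolean disk-shaped structuring element of the given radius.
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return x * x + y * y <= radius * radius

def grey_reconstruct(seed, mask, footprint):
    # Morphological reconstruction by dilation: grow `seed` inside `mask`
    # until nothing changes, so objects that survived the erosion recover
    # their full intensity, while objects erased by it stay gone.
    current = seed
    while True:
        grown = np.minimum(
            ndimage.grey_dilation(current, footprint=footprint), mask)
        if np.array_equal(grown, current):
            return current
        current = grown

def granularity_spectrum(image, background_radius=20, n_iterations=10):
    # Step 2: background = grayscale opening (erosion then dilation) with a
    # large structuring element; structures bigger than the radius survive
    # the opening and are subtracted away as background.
    big = disk_footprint(background_radius)
    background = ndimage.grey_dilation(
        ndimage.grey_erosion(image, footprint=big), footprint=big)
    img = np.clip(image - background, 0.0, None)

    # Step 3: erode by ~1 pixel per iteration; objects too small to survive
    # the erosion cannot be reconstructed, and their share of the total
    # signal is reported for that iteration.
    start_mean = img.mean()
    if start_mean == 0:
        return [0.0] * n_iterations
    small = np.ones((3, 3), dtype=bool)
    eroded = img.copy()
    prev_mean = start_mean
    spectrum = []
    for _ in range(n_iterations):
        eroded = ndimage.grey_erosion(eroded, footprint=small)
        reconstructed = grey_reconstruct(eroded, img, small)
        new_mean = reconstructed.mean()
        spectrum.append(100.0 * (prev_mean - new_mean) / start_mean)
        prev_mean = new_mean
    return spectrum
```

On a toy image containing a single-pixel dot and a 3x3 blob, the dot's signal shows up at the first iteration (GS_1) and the blob's at the second (GS_2), which is exactly the "remove, report, repeat" behavior described above.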

So granularity scale * 1/downsampling factor ROUGHLY corresponds to object size, but not exactly, and the output measurement is a relative one (i.e., the GS_3 in the example above is "what percent of the pixel intensity, once I've deleted all the bright areas with a diameter in pixels >20 or <4, came from signal between 5 and 6 pixels in diameter"). It gives you a relative idea of how many things of roughly that size are present in your image, in the sense that it will be high for "lots" and low for "not much".
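That rough rule of thumb can be written down directly. The helper below is hypothetical (not part of CellProfiler), just the arithmetic from the paragraph above: GS index divided by the downsampling factor approximates the diameter, in original-image pixels, of the structures whose signal is reported at that iteration.

```python
def approx_diameter(gs_index, subsample_factor=0.5):
    """Rough diameter (in original-image pixels) of structures reported
    at GS_<gs_index>, per the rule of thumb: scale * 1/downsampling."""
    return gs_index / subsample_factor

# With 0.5 downsampling, GS_3 corresponds to structures ~6 pixels across,
# and GS_10 to structures ~20 pixels across.
```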

I’ve attached here a PDF of a jupyter notebook that shows steps 2 and 3 (doing a radius of 20 pixels, then 10 iterations). I hope that helps explain!

Granularity.pdf (1.1 MB)


Thank you very much!
Now I understand which features I should choose.

I think many users are still searching for the answer; thanks to this detailed explanation, now everyone can understand.

Hello there! Thanks for the explanations, but I am still unclear about a few things.

What exactly is an iteration? There is no mention of iterations in the help text. Is it the range of the spectrum? Also, what if I don't want to reduce any background (clean image) and I don't want to downsample either? Do I choose 1 for both subsampling factors (for granularity measurements and for background reduction)? Or do I choose a very small number for the subsampling factor for background measurement, for example 0.0000001? By the way, the module produces output when both factors are 0: it shows a warning message but still produces output. How should that be interpreted?

Iterations are, essentially, the range of the spectrum, yes. If you look at the posted Jupyter notebook (which is also now excerpted in the CellProfiler 4 documentation for MeasureGranularity), it becomes a bit more obvious what each iteration does.

If you don’t want to downsample, set the sampling size to 1; if you don’t want anything subtracted from your background, set the radius value very large (see quoted text below)

It does some background subtraction, removing anything LARGER than the radius you give it, in pixels (i.e., if you tell it 20, anything larger than 20 pixels in the downsampled image, which was 40 pixels in your full-size image, is treated as background).
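To see concretely why a very large radius means nothing gets subtracted, here is a tiny demo (plain scipy, not CellProfiler's own code) of the opening-based background estimate: when the structuring element dwarfs every object in the image, the estimated background is all zeros, so the subtraction is a no-op.

```python
import numpy as np
from scipy import ndimage

img = np.zeros((64, 64))
img[20:24, 20:24] = 1.0  # one 4x4 bright blob

r = 30  # radius much larger than any object in the image
y, x = np.ogrid[-r:r + 1, -r:r + 1]
footprint = x * x + y * y <= r * r

# Opening = erosion then dilation. No object survives an erosion this big,
# so the estimated background is all zeros and subtracting it changes nothing.
background = ndimage.grey_dilation(
    ndimage.grey_erosion(img, footprint=footprint), footprint=footprint)
```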

I'm honestly surprised that it returns anything with 0 for either of those factors; we should probably set the minimum to 1 so that it doesn't just warn the user. My advice for interpreting those values would be "don't": it is almost certainly overriding them with something else, likely the module defaults, but I'd have to check the source code to be sure.
