Blur measure




I am doing some simple trials with the Blur measure from the MeasureImageSaturationBlur module. I selected 3 images of the same cells, but with very different focus quality. Let’s say image1 = good, image2 = normal and image3 = bad.

When I measure the blur in the original images, the results are:

image1: 0.041 (good)
image2: 0.035 (normal)
image3: 0.029 (bad)

I am happy with these numbers because they make sense: the better the focus, the higher the value.

But, somewhat surprisingly to me, when I run RescaleIntensity first (because these are 12-bit CCD images stored in 16-bit format) and try the same thing, the results are:

image1Rescaled: 0.011 (good)
image2Rescaled: 0.013 (normal)
image3Rescaled: 0.015 (bad)

which means not only that there is a bias in the measure once the images are rescaled, but also that the score is somehow inverted, giving higher values to the worst-focused images. Any idea what might be happening? Maybe I should just apply the blur measure to the original images and forget about this subject?

Thanks a lot for your help.



The Blur measure is not invariant to intensity scaling such as that performed by the RescaleIntensity module, but rescaling should multiply each image’s score by the same factor if you’re moving from 12-bit to 16-bit. Did you use the “Enter min/max” rescaling method, and with what values? It’s also possible that the rescale is saturating more pixels in some of the images, and those pixels are getting clamped to 1.
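For intuition, here is a toy sketch of that behavior. The focus_score below is a hypothetical normalized-variance measure on synthetic data, not CellProfiler’s actual blur computation; it just illustrates how a score can scale cleanly under a pure linear rescale yet degrade once pixels saturate:

```python
import numpy as np

def focus_score(img):
    # Hypothetical focus measure: normalized variance (variance / mean).
    # Under a clean linear rescale img -> k * img, the score becomes
    # (k^2 * var) / (k * mean) = k * score, so image rankings are preserved.
    return img.var() / img.mean()

rng = np.random.default_rng(0)
# Synthetic "12-bit data in a 16-bit container": intensities in [0, 0.0625]
img = rng.uniform(0.0, 0.0625, size=(64, 64))

# A clean x16 stretch multiplies this score by exactly 16:
ratio = focus_score(img * 16) / focus_score(img)
print(ratio)  # -> 16.0

# But if the rescale clips values above its assumed maximum, the variance
# in the bright regions is destroyed, lowering the score and potentially
# reordering the images:
over = img * 24                    # pretend this image is brighter than assumed
clipped = np.clip(over, 0.0, 1.0)  # pixels above 1 are saturated
print(focus_score(clipped) < focus_score(over))  # -> True
```

The point is that a multiplicative rescale alone cannot invert the ranking; clamping can.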

I can probably figure out what’s wrong if you send me your original images, and the pipeline you’re using for rescaling and blur checking. Let me know if that’s acceptable. My address is

Thouis Jones



I used the recommended “Enter min/max” method with the values 0, 0.0625, 0, 1.


Something is definitely odd, then. The blur measures should all change by the same factor (a factor of 16 upward, in fact, which none of them do).
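As a sanity check on those settings, here is a sketch of a generic min/max linear rescale (not necessarily RescaleIntensity’s exact code): mapping [0, 0.0625] onto [0, 1] is a x16 stretch, since 12-bit data in a 16-bit file tops out at 4095/65535, roughly 0.0625, and any pixel above that input maximum gets clamped to 1:

```python
import numpy as np

def rescale_min_max(img, in_min, in_max, out_min, out_max):
    # Generic "Enter min/max" style linear rescale (a sketch, not
    # necessarily CellProfiler's exact implementation): map the input
    # range onto the output range and clip everything outside it.
    scaled = (img - in_min) / (in_max - in_min)
    scaled = scaled * (out_max - out_min) + out_min
    return np.clip(scaled, out_min, out_max)

# With the settings 0, 0.0625, 0, 1 this is a pure x16 multiplication
# for in-range pixels:
img = np.array([0.0, 0.02, 0.04, 0.0625])
print(rescale_min_max(img, 0.0, 0.0625, 0.0, 1.0))  # same as img * 16

# ...except that any pixel brighter than 0.0625 (hot pixels, slight
# exposure differences between images) is clamped to 1, which is where
# per-image distortions of the blur score can creep in:
print(rescale_min_max(np.array([0.07, 0.10]), 0.0, 0.0625, 0.0, 1.0))
```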

Can you send me your example images? I’ll try to figure out what has gone wrong.