I am running some simple trials with the Blur measure from the MeasureImageSaturationBlur module. I selected 3 images of the same cells, but with very different focus quality. Let's say image1 = good, image2 = normal and image3 = bad.
When I measure the blur in the original images, the results are:
image1: 0.041 (good)
image2: 0.035 (normal)
image3: 0.029 (bad)
I am happy with these numbers because they make sense: the better the focus, the higher the value.
But, somewhat surprisingly to me, when I apply RescaleIntensity (because these are 12-bit CCD images stored in 16-bit format) and repeat the measurement, the results are:
image1Rescaled: 0.011 (good)
image2Rescaled: 0.013 (normal)
image3Rescaled: 0.015 (bad)
which means not only that rescaling introduces a bias in the measure, but also that the score is somehow inverted, giving higher values to the worse-focused images. Any idea what might be happening? Maybe I should just apply the blur measure to the original images and forget about this?
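For what it's worth, I tried to convince myself that a focus score can depend on intensity scaling at all. I don't know the module's exact formula, so the `focus_score` below (variance of the Laplacian divided by mean intensity) is only an assumed stand-in for a typical focus measure, not CellProfiler's actual code. It shows that this kind of score scales linearly with a multiplicative intensity factor, so if RescaleIntensity stretches each image by a different per-image factor, the ranking across images can change:

```python
import numpy as np

def laplacian(img):
    # 4-neighbour discrete Laplacian via periodic shifts (edge handling
    # is crude, but fine for illustrating the scaling behaviour)
    return (np.roll(img, 1, 0) + np.roll(img, -1, 0)
            + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)

def focus_score(img):
    # Hypothetical focus measure: variance of the Laplacian, normalised
    # by mean intensity. Variance scales as c**2 under img -> c*img,
    # the mean scales as c, so the score scales as c: NOT scale-invariant.
    return laplacian(img).var() / img.mean()

rng = np.random.default_rng(0)
img = rng.random((64, 64))

s_orig = focus_score(img)
s_scaled = focus_score(3.0 * img)  # simulate a 3x intensity stretch
print(s_scaled / s_orig)           # ratio equals the stretch factor, 3.0
```

Since each of my three images presumably has a different min/max, RescaleIntensity would apply a different stretch factor to each one, which could plausibly reorder the scores the way I observed.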
Thanks a lot for your help.