I noticed that my quantification changes when the orientation of my input images is different.
I am using the IdentifyPrimaryObjects module to identify objects above a certain threshold and within a certain size range. I thought the quantification should be the same for the same image regardless of its orientation. However, I notice slight differences when the image is rotated 180 degrees, and large differences when it is rotated by e.g. 20 or 45 degrees. Why is that? Isn't the module simply looking for above-threshold pixels, and shouldn't those be the same regardless of orientation? What can I do to normalize for this effect?
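To illustrate what I mean, here is a minimal sketch (using numpy and scipy rather than CellProfiler itself, and a synthetic image, so the exact numbers are only illustrative): a 180-degree rotation is a lossless pixel remapping, so the above-threshold pixel count is unchanged, but an arbitrary-angle rotation requires interpolating new pixel values, which changes the count.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, rotate

# Synthetic smooth "intensity" image standing in for a real micrograph.
rng = np.random.default_rng(0)
img = gaussian_filter(rng.random((200, 200)), sigma=5)

thresh = img.mean()  # simple global threshold for the demonstration

count_orig = int(np.sum(img > thresh))

# 180 degrees: pure pixel remapping, no interpolation -> identical count.
count_180 = int(np.sum(np.rot90(img, 2) > thresh))

# 20 degrees: spline interpolation resamples every pixel value,
# and with reshape=False the corners are clipped -> count changes.
count_20 = int(np.sum(rotate(img, 20, reshape=False, order=3) > thresh))

print(count_orig, count_180, count_20)
```

This matches what I see: the lossless rotation agrees exactly, while the 20-degree rotation gives a different above-threshold count.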
Thanks for the help,