I noticed some strange behaviour when viewing noisy images with Fiji.
I was able to reproduce the issue on a second machine as well using simulated noise.
The image below was produced by creating a new empty image (single-channel, 16-bit, 3838x3710 px) and then applying noise (“Add Specified Noise” with a standard deviation of 250).
Even though the noise is i.i.d., when I zoom out I see a grid pattern with four maxima:
The pattern looks different when I use a different image size.
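In case anyone wants to reproduce the image outside Fiji, here is a minimal NumPy sketch of the same setup (the mean offset of 32768 is my addition to keep the Gaussian values inside the unsigned 16-bit range, and tifffile is just one convenient way to get the result back into Fiji):

```python
import numpy as np
import tifffile  # assumption: tifffile is installed (pip install tifffile)

rng = np.random.default_rng(seed=0)

# Single-channel 16-bit image, 3838x3710 px, filled with i.i.d. Gaussian
# noise of standard deviation 250; the offset keeps values in uint16 range
height, width = 3710, 3838
noise = rng.normal(loc=32768, scale=250, size=(height, width))
img = np.clip(noise, 0, 65535).astype(np.uint16)

# Save, open in Fiji, and zoom out to see the grid pattern
tifffile.imwrite("noise.tif", img)
```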
I’m thinking this might be a strange optical illusion/visualisation artifact. I’ve just done the same thing you did and had the same impression, but then I plotted the profile of a line drawn diagonally through the image:
I’m not sure if I see anything untoward there. To double-check, I’ve tried fitting a high-order polynomial to the profile and got what was essentially a horizontal line, with every coefficient other than the constant term being very small. It’s a strange one!
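For completeness, this is roughly what I did, sketched in Python rather than with Fiji’s line tool (NumPy only; the polynomial degree of 10 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(32768, 250, size=(3710, 3838))  # same i.i.d. noise as above

# Profile along the main diagonal (the square part of the image)
n = min(img.shape)
profile = img[np.arange(n), np.arange(n)]

# High-order polynomial fit; centring x keeps the fit well conditioned.
# For i.i.d. noise the fit comes out essentially flat: the constant term
# sits near the mean and every other coefficient is tiny by comparison.
x = np.linspace(-1.0, 1.0, n)
coeffs = np.polynomial.polynomial.polyfit(x, profile, deg=10)
print(coeffs)
```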
I opened the screenshot from above in Fiji and looked at the histograms in the center and at one of the maxima. I do not think this is an optical illusion. I think it is a display problem involving the zoom, as schmiedc suggested.
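In code the check would look something like this (a sketch assuming imageio for loading; the file name and the ROI coordinates are placeholders for one of the bright maxima and a point between them):

```python
import numpy as np
import imageio.v3 as iio  # assumption: imageio is installed

shot = iio.imread("screenshot.png").astype(float)  # placeholder file name
if shot.ndim == 3:
    shot = shot[..., :3].mean(axis=-1)  # collapse RGB to grayscale

def roi_stats(img, cy, cx, r=50):
    """Mean and standard deviation of a square ROI centred on (cy, cx)."""
    box = img[cy - r:cy + r, cx - r:cx + r]
    return box.mean(), box.std()

# Placeholder coordinates: one ROI at a maximum, one between maxima.
# If the statistics/histograms differ, the pattern really is in the
# displayed pixels and not an optical illusion.
print("at a maximum:", roi_stats(shot, 400, 400))
print("in between:  ", roi_stats(shot, 600, 600))
```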
The original image is a checkerboard where the grayscale
alternates between two values from pixel to pixel.
The next image is the checkerboard downscaled by an
exact factor of eight to 512x512. In this case there is no
moiré pattern (or, if you will, the length scale of the moiré
pattern goes off to infinity).
The following three images are the original image downscaled by successively smaller factors slightly less than eight. They show moiré patterns of (as is to be expected) decreasing length scale.
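The checkerboard experiment is easy to replay; here is a sketch (NumPy/Matplotlib; the 4096x4096 original size is my assumption, chosen so that a factor of eight gives exactly 512x512, and nearest-neighbour subsampling stands in for a naive resize):

```python
import numpy as np
import matplotlib.pyplot as plt

# Checkerboard whose grey value alternates between two values from pixel
# to pixel; 4096x4096 is an assumed original size, so that an exact
# factor of eight gives 512x512
side = 4096
yy, xx = np.indices((side, side))
board = np.where((yy + xx) % 2 == 0, 100, 150).astype(np.uint8)

def subsample(img, factor, out_size=512):
    """Nearest-neighbour downscale -- the naive resize a viewer might do."""
    idx = (np.arange(out_size) * factor).astype(int)
    return img[np.ix_(idx, idx)]

# Exact factor 8: every sample lands on the same checker parity, so the
# result is flat. Factors just under 8: the sampled parity drifts slowly
# across the image, giving moiré checkers of decreasing length scale.
fig, axes = plt.subplots(1, 4, figsize=(14, 4))
for ax, factor in zip(axes, [8.0, 7.98, 7.9, 7.5]):
    ax.imshow(subsample(board, factor), cmap="gray", vmin=0, vmax=255,
              interpolation="nearest")
    ax.set_title(f"factor {factor}")
    ax.axis("off")
plt.show()  # the figure itself gets resampled on screen; view at 100%
```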
So when you zoom out of an image (or Fiji zooms out for you to fit the image on your screen), resize() (or some near equivalent) is called, potentially producing this sort of moiré pattern. It’s less obvious in the “noise” image, but the regular pixel grid underlying the noise is enough to produce one.
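To see that i.i.d. noise really is enough, here is one more sketch (SciPy’s bilinear zoom is my stand-in for whatever interpolation the viewer actually uses): with a non-integer factor, the bilinear weights change slowly across the image, so the local contrast of the downscaled noise is modulated in a coarse grid, and that grid is the pattern the eye picks up.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import zoom, uniform_filter  # assumption: SciPy available

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 250.0, size=(3710, 3838))

# Bilinear downscale; a factor just under 8 keeps the modulation coarse
small = zoom(noise, 1 / 7.98, order=1)

# Where an output sample lands between input pixels, the four bilinear
# weights average more strongly and the local variance drops; where it
# lands near a pixel centre, the full variance survives. Mapping the
# local standard deviation makes the resulting grid explicit.
local_mean = uniform_filter(small, size=25)
local_var = uniform_filter(small**2, size=25) - local_mean**2
local_std = np.sqrt(np.clip(local_var, 0.0, None))

plt.imshow(local_std, cmap="gray")
plt.title("local std of bilinearly downscaled i.i.d. noise")
plt.show()
```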