Histogram and bit-size

Greetings,

I am not sure if this is a trivial question or not:

Essentially I have two images, one of which appears brighter on average than the other. There are two ways of exporting each image: the first is as an 8-bit color TIFF, and the second is as a 32-bit CSV (text) image. I then convert the 8-bit color TIFF to an 8-bit TIFF in ImageJ.

The problem I am having is that when I compare the histograms of the two images, the average pixel intensity in the 8-bit TIFF versions is higher for the image which appears brighter (as it should be). However, in the 32-bit versions of the same two images the ordering is reversed, i.e. the image which appears darker shows the higher average pixel intensity.

My question is whether this discrepancy is due to the difference in pixel format (i.e. 8-bit versus 32-bit), or perhaps to how the histogram is constructed. Or might it be due to differences in the way the software that generates these images exports the data (i.e. 8-bit color TIFF versus 32-bit CSV)?
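To make the question concrete, here is a small Python/NumPy sketch of the kind of effect I am wondering about. The arrays are synthetic stand-ins for my two images, and the assumption that the 8-bit conversion rescales each image's own min-max range to 0-255 is mine, not something I have confirmed about the export software or ImageJ:

```python
import numpy as np

# Synthetic stand-ins for the two 32-bit images (hypothetical values).
a = np.full((100, 100), 100.0)  # "image A": mostly 100
a[0, 0] = 1000.0                # a single bright outlier stretches A's range
b = np.full((100, 100), 50.0)   # "image B": mostly 50
b[0, 0] = 60.0
b[0, 1] = 0.0

def to_8bit(img):
    """Scale this image's own min-max range linearly onto 0-255
    (an assumed model of an 8-bit conversion, not verified)."""
    lo, hi = img.min(), img.max()
    return np.round((img - lo) / (hi - lo) * 255).astype(np.uint8)

# In 32-bit, A has the higher mean; after per-image scaling to
# 8 bits, B's values land near the top of 0-255 while A's are
# compressed near the bottom, so the ordering of the means flips.
print(a.mean(), b.mean())                  # A > B in 32-bit
print(to_8bit(a).mean(), to_8bit(b).mean())  # A < B in 8-bit
```

If each image really is scaled independently like this, the 8-bit means compare relative position within each image's own range rather than absolute intensity, which would explain the reversal I see.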

Thank you.