I need a script that finds the maximum brightness in an image stack and then sets the Brightness/Contrast display range to a minimum display value of 0 and a maximum display value of, e.g., 0.9 times that maximum brightness.
Currently I do this manually: I have an image stack (16-bit grayscale images), I run Histogram on the stack (Ctrl+H), switch the histogram to log scale (so I can see the few pixels at the maximum values), and then adjust the Brightness/Contrast settings by hand. I have hundreds of stacks, each with a different maximum brightness. Usually only a few pixels sit near the maximum, so I visually decide which display maximum to use (e.g. 0.9 of the max intensity, i.e. 10% below the maximum brightness).
Is there a way to automate this? Ideally it would be a script that I could run on all open stacks, with a max-intensity factor that I can change manually.
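A minimal sketch of what I have in mind, as an ImageJ 1.x macro (untested; the `factor` variable is the adjustable max-intensity factor, and I assume `Stack.getStatistics` and `setMinAndMax` behave as described in the macro function reference, i.e. they report statistics over the whole stack and set the display range without altering pixel data):

```
// Adjustable factor: display max = factor * stack maximum
factor = 0.9;

// Loop over all open images/stacks
for (i = 1; i <= nImages; i++) {
    selectImage(i);
    // Statistics over the entire stack, not just the current slice
    Stack.getStatistics(count, mean, min, max, std);
    // Set display range: min 0, max at factor * brightest pixel
    setMinAndMax(0, factor * max);
}
```

Something along these lines would replace the manual histogram inspection, if `setMinAndMax` indeed only changes the display range (not the underlying 16-bit values).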