I’m analyzing 16-bit TIFF files acquired by transmittance mode scanning (B/W) of western blots on PVDF, visualized with chromogenic HRP substrate (Opti-4CN), which gives dark grey bands on a white background.
After opening the image file in ImageJ, I calibrate it against measurements of a neutral-density step tablet (OD 0.05–2.8; the Epson V550 scanner is linear over this range) to convert the grayscale values to OD values. Plot Profile analyses along the length of a typical lane show a background OD of ~1.4 and peak ODs up to ~2.2, so the bands rise up to ~0.8 OD above background.
When I run Subtract Background with the rolling-ball radius set to ~50 pixels, the background is removed very nicely and the baseline OD is near zero. However, the peak ODs now reach only ~0.12, about 15% of the actual value. Larger rolling-ball radii give similar results; smaller radii decrease the peak heights further.
My guess is that Subtract Background operates on the raw grayscale values even after calibration to OD, and the grayscale difference between baseline and peak is only afterwards converted to OD by the calibration equation. Because the calibration is logarithmic, a grayscale difference sitting against a near-white (low-OD) baseline maps to a much smaller OD difference than the same band measured against the original dark background, which would explain the compressed peaks.
Can anyone suggest a way to subtract the background from the peaks directly in OD units?
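To illustrate what I'm after, here is a rough sketch in Python rather than an ImageJ macro: calibrate the pixel values to OD first, and only then estimate and subtract the baseline. The pure -log10 calibration stands in for my actual step-tablet calibration, and the 1-D min/max opening is only a crude analogue of ImageJ's rolling ball; the numbers are synthetic.

```python
import math

# Sketch: subtract background in OD space rather than in raw gray values.
# The -log10 calibration below is a stand-in for a real step-tablet
# calibration, and the 1-D opening is a crude rolling-ball analogue.

WHITE = 65535.0  # full-scale value of a 16-bit transmittance scan

def gray_to_od(gray):
    """Convert a raw gray value to optical density: OD = -log10(T)."""
    return -math.log10(max(gray, 1) / WHITE)

def opening_baseline(profile, radius):
    """Grayscale opening (sliding-window minimum, then maximum) as a
    simple 1-D stand-in for the rolling-ball baseline."""
    n = len(profile)
    eroded = [min(profile[max(0, i - radius):i + radius + 1]) for i in range(n)]
    return [max(eroded[max(0, i - radius):i + radius + 1]) for i in range(n)]

# Synthetic lane: background OD ~1.4 with one band peaking ~0.8 OD above it.
true_od = [1.4 + 0.8 * math.exp(-((i - 50) / 6.0) ** 2) for i in range(100)]
gray_profile = [round(WHITE * 10.0 ** -od) for od in true_od]

# Calibrate first, then subtract the baseline in OD units.
profile_od = [gray_to_od(g) for g in gray_profile]
baseline = opening_baseline(profile_od, radius=20)
corrected = [p - b for p, b in zip(profile_od, baseline)]

print(f"peak height above baseline: {max(corrected):.2f} OD")
```

With the subtraction done after calibration, the recovered peak height comes out at ~0.8 OD, the full band amplitude, rather than the compressed value I see when the subtraction happens on the gray values.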