Subtracting background - Time-Lapse Fluorescence Analysis

Hello CellProfiler Staff,
I am new to CellProfiler; I have been using ImageJ up to now. I need to analyze fluorescence intensity at the single-cell level over time. The cells do not move during the experiment. Thanks to this forum, I have managed to set up the pipelines required for this analysis, and the analysis seems to work. However, the intensity values I obtain are completely different from what I would obtain using ImageJ (manual selection of cells). This is most likely because I did not subtract the background intensity in my image sequences.
FYI, we stain the cells with PI to mark their location. This is an example:

After reading the manual and many relevant posts on this forum, I have managed to come up with a few different pipelines to correct for the background fluorescence. I’d like to share these pipelines with you and ask for your opinion. Am I approaching this the right way? Which pipeline do you think tackles this problem best? Would you suggest something different?

I hope you guys can help me solve this issue as well. In particular, I’d like to understand how to judge whether or not the background intensity of an image has been properly removed.

Thank you in advance for your time, I really appreciate it.

Giammarco

Hi @GiammarcoNebbioso,

Great work generating so many options for background correction! A few comments / answers:

1. Your masked approach is not changing your image because every pixel is included in the foreground (value 1). As a result, it won’t change anything if applied as a mask to another image. You may need to adjust your thresholding method. That said, if you can identify your objects from the background, you may not need to do background subtraction at all.

2. Both of the backgrounds you created with the CorrectIlluminationCalculate method look reasonable to me. I would probably try increasing the size of the smoothing filter using the manual mode to get a smoother background image, which will result in a less pixelated output when it is applied (see the sketch below).
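
For illustration, here is a minimal Python/scikit-image sketch of what a smoothing-based background estimate does. This is only the general idea, not CellProfiler’s CorrectIlluminationCalculate code; the sigma values and file name are placeholders you would tune to your own images.

```python
# General idea of a smoothing-based background estimate (NOT CellProfiler's
# implementation); sigma values and the file name are placeholders.
import numpy as np
from skimage import filters, io

img = io.imread("frame_0001.tif").astype(float)  # one frame of the time-lapse

# A small sigma follows the cells too closely and gives a "pixelated"
# background; a sigma much larger than the cell diameter gives the smooth,
# slowly varying surface you actually want to subtract.
background_small = filters.gaussian(img, sigma=5, preserve_range=True)
background_large = filters.gaussian(img, sigma=50, preserve_range=True)

# Subtracting the heavily smoothed background leaves cell-scale structure intact.
corrected = np.clip(img - background_large, 0, None)
```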

Your last question is probably the hardest to answer. In general, I look for:

  • background-corrected images have even illumination across the entire image (see the sketch after this list for a rough quantitative check)
  • dim objects that I think are real in the original image are not lost in the background-corrected image
  • borders of objects appear to be the same in the pre-correction and post-correction images (another way to say this: objects don’t appear to shrink in background-corrected images)
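
If you want a rough quantitative version of that first check, something like the following works. This is just a sanity-check sketch, not a CellProfiler module; the tile count, percentile, and the raw_img / corrected_img names are arbitrary placeholders.

```python
# Rough check for "even illumination": compare a low percentile (a proxy for
# the local background level) across image tiles before and after correction.
import numpy as np

def tile_background_levels(img, n_tiles=4, pct=10):
    """Return the pct-th percentile of each tile in an n_tiles x n_tiles grid."""
    h, w = img.shape
    levels = []
    for i in range(n_tiles):
        for j in range(n_tiles):
            tile = img[i * h // n_tiles:(i + 1) * h // n_tiles,
                       j * w // n_tiles:(j + 1) * w // n_tiles]
            levels.append(np.percentile(tile, pct))
    return np.array(levels)

# After a good correction, the spread of these per-tile levels should be much
# smaller than in the raw image, i.e. the background is flat:
#   tile_background_levels(raw_img).std()  >>  tile_background_levels(corrected_img).std()
```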

I think those are some good guidelines to start, although others may have additional thoughts. Good luck!
Pearl


Can I dig into this a bit more? Are you also background-subtracting before manually selecting in ImageJ? If not, I don’t think you need to do it in CellProfiler either to get the two to “match”. I would suggest doing background subtraction in your case if it looks like the background is actually changing over time; otherwise, your trends shouldn’t be affected by the background (and in fact might be HARMED by background subtraction if the calculated background itself were inconsistent; at a minimum, I’d be sure to measure intensity in both raw and subtracted images, just in case!).


Hello @bcimini
Thank you for your suggestion.
Yes, when I use ImageJ I subtract the background before manually selecting the cells. I simply subtract the background using the Rolling Ball Radius algorithm.

I do not expect the values to ‘match’ completely, of course. However, I get very different results when I use the data from ImageJ and CellProfiler. I can confirm that, in my case, the background is changing over time and I need to take this into account.

Ah, cool, makes more sense then!

The functions you’ve posted for the latter two methods seem reasonable; I might smooth them a BIT more, but they certainly look good. Otherwise, @pearl-ryder's advice on “how do I know when I’m doing it right” seems excellent!

Thank you @bcimini !
I just had another quick question.
Could you please help me understand the following image obtained from “CorrectIlluminationApply”?

I am interpreting it as follows:
The image in the middle is the calculated background, and it is subtracted from the original image (the one on the left) to obtain the final, background-corrected image. However, the most fluorescent spots in the middle image seem to appear where the actual cells are located. If this is true, wouldn’t I also be wrongly subtracting fluorescence from the actual cells?

In short, if I do this, am I actually subtracting mainly the background?

Based on the image you’re showing, no, your current parameters are NOT actually grabbing mostly background, at least in this image. I would suggest larger block sizes and smoothing.
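
To make that concrete: conceptually, the Subtract option does something like the sketch below (a rough illustration, not CellProfiler’s actual code; `original`, `illumination_function`, and `cell_mask` are stand-in arrays). Comparing the background image’s mean inside versus outside your cells is a quick way to see how much real signal it is grabbing.

```python
# Rough illustration of "Subtract" plus a leak check; the arrays below are
# stand-ins, not real data, and this is not CellProfiler's implementation.
import numpy as np

rng = np.random.default_rng(0)
original = rng.random((256, 256))                     # stand-in for a raw frame
illumination_function = np.full_like(original, 0.1)   # stand-in for the calculated background
cell_mask = np.zeros_like(original, dtype=bool)       # stand-in for a PI-based cell mask
cell_mask[100:150, 100:150] = True

# "Subtract": corrected = original - background, clipped at zero.
corrected = np.clip(original - illumination_function, 0, None)

# If the background image is much brighter inside the cells than outside,
# it is grabbing real cell signal rather than background.
inside = illumination_function[cell_mask].mean()
outside = illumination_function[~cell_mask].mean()
print(f"background inside cells: {inside:.3f}, outside cells: {outside:.3f}")
```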

Nice catch!

Is the option to launch an ImageJ plugin from CellProfiler no longer available?
I remember this option used to exist, and in this case I think it could be useful for @GiammarcoNebbioso, since it would let him simply run the rolling ball algorithm from ImageJ.

It actually came back recently! Check out our wiki page on it.

skimage also recently implemented a rolling-ball, so it’s on our to-do list to pull that into CellProfiler.
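
For reference, here is a minimal sketch of that skimage rolling-ball (available in recent scikit-image versions as `skimage.restoration.rolling_ball`); the radius and file name are placeholders and should match what you use in ImageJ.

```python
# Minimal rolling-ball background subtraction with scikit-image; the radius
# and file name are placeholders (match the radius you use in ImageJ).
import numpy as np
from skimage import io, restoration

img = io.imread("frame_0001.tif")
background = restoration.rolling_ball(img, radius=50)
corrected = np.clip(img.astype(float) - background, 0, None)
```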


Thank you for bringing this up, @emartini!
I was completely unaware of the existence of such a plugin!
