Inconsistent Nuclei count

Hi,

I posted a question some time ago with no response, and I guess I was just not very clear. I have attached my pipeline, along with images that vary in illumination and stain intensity (attached).
I was able to correct the illumination to the point where the images look similar, but my nuclei counting is still very inconsistent, excluding some clearly stained nuclei and including some really poorly stained ones.
The attached pipeline is as follows:
CorrectIlluminationCalculate (Regular, All: First cycle, filter “Median”, size 300)
Morph (invert)
CorrectIlluminationApply (Divide)
CorrectIlluminationCalculate (Background, block size 25, Each, filter “Median”, size 100)
CorrectIlluminationApply (Subtract)
EnhanceOrSuppressFeatures (Enhance, Speckles, size 25); this module is optional
IdentifyPrimaryObjects (Global, MoG, 0.05, automatic smoothing)
I have tried all of these modules across the size range I need. The methods can be adjusted, but they still allow too many false positives and false negatives.
Should I use a smoothing filter for the threshold, given that I already smoothed during the illumination correction?
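For what it’s worth, the divide-style correction in the pipeline above can be sketched outside CellProfiler with NumPy/SciPy. This is only an illustration of the idea; the image, the filter size, and all the variable names are invented, and CellProfiler’s actual implementation differs:

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic example: a flat "tissue" signal under a left-to-right
# illumination gradient (purely illustrative values).
illum = np.linspace(1.0, 2.0, 64)        # uneven illumination, 1x to 2x
img = np.ones((64, 64)) * illum          # observed image = signal * illum

# Estimate the illumination function with a large median filter
# (loosely analogous to CorrectIlluminationCalculate with a median
# smoothing filter), then divide it out (CorrectIlluminationApply, Divide).
illum_est = ndi.median_filter(img, size=31)
corrected = img / illum_est
```

After division, the interior of the image is flat again (values near 1), which is the whole point of the Divide mode.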

I also tried some filtering for the false positives (which is in the pipeline), but the difficulty in at least capturing all the true positives bugs me.

Any suggestions?



cell_proj.cpproj (82.5 KB)

Also,

One thing I still need to do with the images, and did not put in the pipeline, is to crop them after IdentifyPrimaryObjects is done.
I was wondering whether I should crop them BEFORE everything else, and even calculate the illumination correction from the cropped images.
The problem is, I can’t think of a way to do that and still run CorrectIlluminationCalculate in “Regular” mode set to “All: First cycle” without splitting into two pipelines, because that module pulls the images from the input modules. And I am really trying to avoid that.

Hi cesar.coelho,

Sorry for the lack of response before, but a long post demands a lot of our “spare” time! :smile:

OK, I am side-stepping a lot of your questions with what I believe is a simpler and hopefully better pipeline approach. See my attached pipeline. The basic idea is to not use any CorrectIllumination modules, but rather a tophat filter to highlight the nuclei while suppressing almost everything else (a “cheap” form of illumination correction!).
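The tophat idea can be illustrated with SciPy’s `white_tophat` (a sketch of the concept only, not the module’s internals; the gradient, speckle, and element size below are invented):

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic field: a smooth illumination gradient plus one small
# bright "nucleus" (all sizes and values are illustrative).
img = np.linspace(0.0, 1.0, 128) * np.ones((128, 128))
img[60:68, 60:68] += 1.0                  # an 8-px bright speckle

# White tophat = image minus its morphological opening: structures
# smaller than the 25-px element survive, the slow gradient does not.
tophat = ndi.white_tophat(img, size=25)
```

The speckle keeps its full contrast while the background gradient is flattened to roughly zero, so no separate illumination-correction step is needed.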

A few notes:
(0) Avoid JPG; it is a lossy format. Your images are very big, so I understand why you opted for it, but if you can, try PNG, which is lossless and comparable in size to JPG.
(1) CorrectIllumination across cycles is usually not a good idea with histology. It averages over all the slices, so any variation in the slice boundaries, etc., will throw off the illumination function. The “Each” setting can be OK, as in your second CorrectIlluminationCalculate.
(2) You are loading your color images as “Grayscale” in NamesAndTypes. You should change to “Color”.
(3) Try the UnmixColors module; it was made for histology! Even better, especially given your large images, would be to image directly in grayscale: rather than loading all three color channels, your grayscale image will take up much less memory.
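On note (3): if re-imaging in grayscale is not an option, a color image can still be collapsed to one channel in software (CellProfiler’s ColorToGray module does this). A minimal NumPy sketch using the common Rec. 601 luminance weights; treat the weights and data here as illustrative, since the module’s exact behavior depends on its settings:

```python
import numpy as np

# Toy 2x2 RGB image with values in [0, 1] (illustrative data).
rgb = np.array([[[0.8, 0.2, 0.1], [0.1, 0.9, 0.2]],
                [[0.3, 0.3, 0.3], [1.0, 1.0, 1.0]]])

# Rec. 601 luminance weights for R, G, B (they sum to 1.0).
weights = np.array([0.299, 0.587, 0.114])
gray = rgb @ weights                      # weighted sum over the color axis
```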

Re: cropping first, yes, that would be advisable, if only to make the processing faster. And note that with my attached pipeline, which has no CorrectIllumination* modules, you wouldn’t run into those issues.

Hope that helps!
David
DLpipe.cppipe (6.45 KB)

Hi David,

Thank you very much for the new pipeline and the advice. As for the .jpg pictures, I realized that after a while; I will not make that mistake again.
The processing of the image is good, but I am running into the same vital problem: lots of false positives (cells that are too weakly stained) and false negatives (mostly undeclumped cells).
Using your pipeline, I tried RobustBackground, which sounds pretty good to me and comes close to what I want regarding the false positives. But I can’t make the declumping work for my cells for some reason. I have tried basically every combination of the declumping methods, even some manual parameters. It works well in some cases, if there are 2-4 cells roughly lined up, but if things get a bit more “crowded” I lose a whole bunch of activated cells.
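As context for the declumping discussion: shape-based declumping rests on finding local maxima of a distance transform and using them as seeds. A toy sketch of that idea with SciPy (the disc sizes, footprint, and names are invented for illustration; this is not CellProfiler’s actual code):

```python
import numpy as np
from scipy import ndimage as ndi

# Two overlapping discs as a stand-in for a clump of two nuclei.
yy, xx = np.mgrid[0:100, 0:100]
mask = (((yy - 50)**2 + (xx - 35)**2) <= 18**2) | \
       (((yy - 50)**2 + (xx - 65)**2) <= 18**2)

# Distance to the background: each nucleus center becomes a peak.
dist = ndi.distance_transform_edt(mask)

# Local maxima of the distance map act as declumping seeds.
local_max = ndi.maximum_filter(dist, size=21)
seeds = (dist == local_max) & mask
n_seeds = ndi.label(seeds)[1]             # one seed per nucleus
```

The two touching discs yield two separate seeds even though they form a single connected blob, which is what makes the clump separable.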

Do you have any suggestions for that?

Thanks again,

Cesar

David,

After testing some manipulations with EnhanceOrSuppressFeatures, I realized that my problem with the bad declumping is due to the feature size in that module.
The way I understand the module (correct me if I’m wrong), it is supposed to enhance the particles relative to the background. However, if you have a bunch of clumped particles, they become bigger than the feature size and their intensity is diminished. And if I just increase the feature size, I reduce the extent to which they are enhanced over the background and possible non-nuclei particles. In the end, the feature size necessary to keep the clumped particles at a good intensity doesn’t produce a significant modification of the overall image.
For reference, my particles are 10-30 px in diameter; I was trying to use Enhance Speckles with feature size 25 (while the size that maintains clumped particles is 100).
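That reading of the module is consistent with how a tophat filter behaves: the opening keeps anything the structuring element fits inside, so features larger than the element are subtracted away. A toy demonstration with SciPy (the Speckles option is tophat-based, but the disc and element sizes below are invented, not the module’s internals):

```python
import numpy as np
from scipy import ndimage as ndi

# One "clump": a flat disc about 50 px across on a dark background,
# a stand-in for several merged nuclei (illustrative sizes).
yy, xx = np.mgrid[0:200, 0:200]
clump = (((yy - 100)**2 + (xx - 100)**2) <= 25**2).astype(float)

# Element smaller than the clump: the opening keeps the clump, so
# the tophat (image minus opening) nearly erases it.
small = ndi.white_tophat(clump, size=25)

# Element larger than the clump: the opening removes the clump, so
# the tophat preserves its full intensity.
large = ndi.white_tophat(clump, size=101)
```

This is exactly the trade-off described above: a small element suppresses clumps, and only an element larger than the clump keeps their intensity.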

I know you are trying to avoid CorrectIllumination, but I compared an illumination-corrected image (Background, block size 50, median filter size 250) with an enhanced-speckles image (feature size 25), and the former seemed more similar to the original image with regard to the intensity of the clumped particles.

Any suggestion for that problem?
Do you think the Background IllumCorrection can be a way out?

Thanks in advance
Cesar

Hi David,

I actually must apologize. After a few more runs, I realized that by setting my Enhance Speckles feature size to 100 (with particles ranging from ~10-25 px), without any illumination correction (just as you said), I can get a satisfactory enhancement and keep all the clumped particles. In this way they are actually well declumped. What I needed was simply a feature size that maintains their pixel intensity.

Thanks for the help. You guys are awesome.
Cesar Coelho

Hi Cesar,

Glad you got it working. We recommend an EnhanceOrSuppress feature size for Speckles just a little larger than the objects you are enhancing. Much larger can be OK, as you are doing, but it may slow down your processing. Clumped objects, however, are a problem for this type of enhancement. It sounds like you are doing as well as you can with these tools.

Best of luck,
David

Hi David,

Indeed. I realized that the processing time I ‘save’ by removing all those steps from my first pipeline, I lose to the large Enhance Speckles filter. For a future experiment I will try to incorporate some other strategy to make it faster.

Anyway, Thank you so much.
If there’s anything I, as a user, can ever do to help just let me know.

Cesar Coelho