Tips for Losing Less Data when Binarizing and for Reducing False Positives

Hi all,

First time poster here.
I’d like to reduce my error rate when trying to automatically count cells with the Analyze Particles function.

Each time I binarize the photo using Adjust Threshold, I tend to lose some cells.

One issue I am aware of is that the photo quality is rather poor; namely, the lighting is uneven, so there are usually dark spots shading some cells, which means that I either lose the cells in that shady patch or have a large black blotch with poorly defined cells within it. If I then Fill Holes, for example, sometimes this whole blotch is filled, losing a number of cells. The Watershed function isn’t always enough to define the cells and retrieve that data.

Moreover, binarizing creates a fair bit of noise in the form of blotchy black patches, which cannot easily be removed with the Remove Outliers function without simultaneously losing some cells. When, in these instances, I Fill Holes and Watershed, these blotches often get divided up and create a number of false positives when I Analyze Particles. I have tried raising the circularity filter as high as 0.70-1.00, because these blotches tend not to be particularly round, but that has not entirely solved the issue.
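In macro form, the workflow I am describing looks roughly like this (a sketch only; the parameter values for the outlier radius, size and circularity limits are placeholders that I have been adjusting by eye, not settled values):

    run("8-bit");                        // work on a grayscale copy
    setAutoThreshold("Default dark");    // Image > Adjust > Threshold
    run("Convert to Mask");              // binarize
    run("Remove Outliers...", "radius=5 threshold=50 which=Bright");   // despeckle; Bright vs. Dark depends on mask polarity
    run("Fill Holes");
    run("Watershed");                    // split touching objects
    run("Analyze Particles...", "size=30-Infinity circularity=0.70-1.00 show=Outlines summarize");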

I take some consolation in the fact that these two tendencies may partly balance each other out, since my lost cells may be offset by my false positives, but I’d still like to do better. I do not need 100% accuracy; I’m only trying to get fairly good estimates, so anything I can do to reduce my error would be great.

Any suggestions for how I may lose less data and limit false positives?

Thanks!
Herm

Good day,

the short and most important answer is:

Redo the image acquisition because the best image enhancement is no image enhancement.

Sometimes image processing may compensate for flaws in sample preparation and image acquisition, but at a very high cost, and sometimes not at all.

What you describe has been discussed about a hundred times before on this Forum, and sometimes the problems could be overcome with specific solutions that then turned out not to work even with very similar images …

Counting may be done without explicit thresholding. Did you have a look at “Find Maxima…”?
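As a minimal sketch (the tolerance value is only an example and must be chosen for your images):

    run("Find Maxima...", "prominence=25 output=Count");   // older ImageJ versions call this parameter "noise" instead of "prominence"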

Regards

Herbie


Hi Herbie,

Thanks for your reply. Yes, I am well aware that I need better images. Part of what I am exploring is what one can do when all one has is poor-quality images. It seems like not very much!

What do you mean when you say “at high cost”?

Would you be able to link any other threads that may have helpful information? I have not found much help thus far in my searches.

I have not looked much into “Find Maxima…”. I did a while back, but I will revisit it and see if it helps!

Thanks very much for your time,
Herm

Herm!

What do you mean when you say “at high cost”?

In many cases it turns out that the effort of developing and implementing a digital image-processing approach to at least partly compensate for bad image quality is greater than the effort or cost of redoing the sample preparation and image acquisition, i.e. of generating optimum image quality.

Would you be able to link any other threads that may have helpful information?

There are hundreds of similar cases discussed on this Forum that deal with thresholding (and segmentation) of problematic images.

Here is just the latest case:
http://forum.image.sc/t/problem-to-measure-adjusted-threshold/10674?u=herbie

Perhaps you could post a typical raw image in the original TIF or PNG format. No JPG format though, because JPG compression introduces artifacts! You may also post images as ZIP archives.

Regards

Herbie

Hi Herbie,

Thanks for taking the time to respond. Your aid is essential!

I see now what you mean about cost. Thanks for spelling that out.

I read through the thread between you and Nasir and found it interesting, although I’m not sure how it may help.

Unfortunately, my images were taken with a digital camera of limited capability, in the field, during ecological surveys. Thus I have only .JPG files, which is my first problem, and poor image quality, which is my second! Haha.

I sadly lack comprehensive familiarity with ImageJ and feel as though I am groping in the dark to suss out insight from the user manual, which I have studied (rather) thoroughly. I will go back to Find Maxima in the guide and see if any methods can be employed that take advantage of this function.

A follow-up question:

Do you know of any way, other than the Watershed function, to better separate two cells that have been counted as one but are in fact two? If there were some way to better distinguish these cells, with a tool perhaps more sophisticated than Watershed (that is, one with variables I can adjust), I could likely reduce double counting and also false positives.

It seems that one of my issues could be resolved if the dark blotches in my images weren’t broken up into many cell-sized pieces by the Watershed function. Fill Holes helps to some extent, but not enough, it seems.

I eagerly await your response,

Herm

EDIT: I see that Find Maxima can segment my particles for me through a different algorithm than the Watershed function, which uses the EDM. I will explore this presently.
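To make that concrete, this is the kind of thing I have in mind, as far as I understand it (a sketch only: the prominence value is a guess I will have to tune, and details such as foreground polarity and the “light background” option will need checking against my images):

    // assumes the original grayscale image is open and active
    original = getTitle();
    run("Duplicate...", "title=gray");
    run("Find Maxima...", "prominence=20 output=[Segmented Particles]");   // creates a window named "gray Segmented"
    selectWindow(original);
    setAutoThreshold("Default dark");
    run("Convert to Mask");                                                 // binary mask of the cells
    rename("mask");
    imageCalculator("AND create", "mask", "gray Segmented");                // cut the mask along the grayscale watershed lines
    run("Analyze Particles...", "size=30-Infinity summarize");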

Good day Herm!

Poor image quality can be avoided, even in the field!
It is just a matter of planning and experience, and if the latter is lacking, you have to ask colleagues who are experienced.

This Forum is full of cases where people think that image processing/enhancement can remedy sloppy sample preparation and image acquisition. My impression is that all the (dirty) work away from the (clean) computer is regarded as a necessary burden that needs to be minimized. This, however, is simply not true. Today, the interface between the physical world and the virtual world of informatics represents perhaps the most important part of empirical science.

re: A follow-up question

Automatic image analysis/processing is especially useful if you have a lot of images to process. Hence, all approaches that use hand-adjusted parameters are of no help: they need to be set individually for each image, which makes automatic processing impossible.
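For illustration only, a bare-bones batch sketch that applies the same fixed parameters to every image (the folder is chosen interactively, it is assumed to contain only image files, and the prominence value is just an example):

    input = getDirectory("Choose an input folder");
    list = getFileList(input);
    setBatchMode(true);
    for (i = 0; i < list.length; i++) {
        open(input + list[i]);
        run("Find Maxima...", "prominence=25 output=Count");   // identical parameters for every image; counts go to the Results table
        close();
    }
    setBatchMode(false);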

The more important aspect of automatic image analysis/processing is objectivity and this is what is required for any kind of scientific work!

If you try to get better results by hand-set parameters, you will leave scientific grounds.
Don’t do it!

Please be aware that visual inspection and judgement often cannot be replicated by machine operations; for example, what are clear borders for you may not be clear borders for a machine …

Good luck

Herbie

Herbie,

I see. I understand. I agree with you about the importance of connecting informatics with the physical world!

I also agree with your statements about the importance of objectivity and the elimination of human error in the image-analysis process. For some purposes, methods that require each image to be individually adjusted have very limited relevance. That said, even manually adjusting parameters for each image is much faster than many other methods, which rely on various kinds of manual counting.

My aspiration, if not already explicit, is to see what can be done in the case that one has poor images. What is salvageable? What is too far gone? What techniques can be employed to make the most of what one has, etc.

The more time I spend on this forum and tinkering with this software, the more I feel that image acquisition must be a fastidious and rigorous process. It seems that little can be done otherwise.
Thus we return to your short and simple answer, with which we began. That, in itself, is highly useful information for me, now that I know for myself the truth of your statement.

Thanks again for all your time. I’ll follow up with any successes or any other questions as they arise.

Best wishes,
Hermes