When using the Solidity measure in CellProfiler, we get values greater than 1. By definition, Solidity cannot be greater than 1. Has anyone had this problem before? Any way to solve it other than "fix the source code"?
Yes, by definition the Solidity measure should not be greater than one. I have not faced any such problem with 3.1.8 (or with earlier versions). Could you provide a little more information, such as which version of CellProfiler you are using? Nevertheless, it is worth visiting this discussion page.
Fujifilm Wako Automation (Consultant)
Hi Lakshmi, thanks for your reply - it's the official download version from ~5 weeks ago. I can look it up if it helps. The definition of Solidity is clear to me. We looked at the problem a bit more, and it seems to affect smaller objects more than bigger ones.
It would be great if you could provide an example pipeline and image where this happens so we can reproduce the issue and report a bug!
I will ask our student whether she can upload the pipeline and a 2D image that shows the described problem here, OK? Just give us a moment to get it ready…
OK, here it comes…would be great if this helps you look into it, thanks!
I am first uploading the pipeline:
Modified_ExampleSpeckles-15Sep2019.cpproj (160.0 KB)
Here is the raw image data to be analyzed:
Fish_Ctrl_2A_002-s2.tif (12.7 MB)
Here is an example image output where you can see the solidity greater than 1 indicated:
And here are the segmented objects underlying this quantification:
Thanks for sharing. Yes, as you mentioned, it affects the smaller objects. I am not sure whether this is a bug in the code, but your problem could be overcome by increasing the smoothing scale (try 2). Keep in mind that this might also remove a few more small objects, but the values no longer exceed 1.
Note, however, that the number of objects is reduced.
solidity.pdf (159.7 KB)
Hope this helps for now.
Our immediate solution is to set everything greater than 1 to 1 in the further processing of the analysis results; your approach also seems to work.
In any case, the solidity calculation is a mathematically clearly defined operation. It cannot yield a value greater than 1, so yes, something is off. It would be a lot easier if we knew what is going on…currently I am a bit hesitant to accept our results as, well, solid.
Let’s see if someone from the CellProfiler Development team can get to it, would be great!
I agree that mathematically, solidity should never be >1; this is definitely a bug. I've checked that it's present in our newest version, and I have therefore reported it here.
Unfortunately, what you are looking at is indeed a "fix the source code" error, though if you wanted, you could use FilterObjects to simply remove the offending objects if you think they will further contaminate your downstream results.
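For anyone working outside the pipeline, the same idea can be applied downstream: drop any object whose Solidity exceeds the theoretical maximum of 1. This is a sketch with made-up measurement values, not CellProfiler output; it just mimics what FilterObjects would do inside the pipeline.

```python
import numpy as np

# Hypothetical per-object measurements (values made up for illustration).
solidity = np.array([0.87, 1.0003, 0.95, 1.12])
object_ids = np.array([1, 2, 3, 4])

# Keep only objects whose solidity is physically plausible (<= 1).
keep = solidity <= 1.0
filtered_ids = object_ids[keep]  # objects 2 and 4 are discarded

print(filtered_ids.tolist())  # [1, 3]
```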
I of course understand your hesitance to trust the rest of the measurements, but looking at the objects where this happens, it appears to be a tiny corner case of pretty weirdly shaped objects. All I can say is that even the best, well-tested libraries have bugs - our math library has been used for ~10 years and formed a lot of the basics of scikit-image, so while there are certainly still a few bugs left (you've just ID'ed one!), it is in most cases very reliable.
We will get to work on a fix; I will attempt to remember to update here when we have one, but in case my memory is less than 100% the link I posted is the best place to check for progress updates.
Thank you very much for your response; I agree with everything.
My suspicion would be that different functions are used to calculate (i) the number of pixels in the convex hull and (ii) the number of pixels in the object. If the convex hull algorithm, for example, counts pixels at corners as half-pixels (not uncommon for, e.g., perimeter calculations), while the area calculation counts all pixels fully, that would explain it. In this scenario, smaller objects would also be expected to be more strongly affected: errors at the object boundary have more influence when the ratio of boundary to area is larger, as it is in small objects.
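To illustrate this suspicion with a toy calculation (this is not CellProfiler's actual code): if the object area counts whole pixels while the convex hull area is computed as a polygon through pixel centers (e.g., with the shoelace formula), a tiny L-shaped object already pushes the ratio far above 1.

```python
import numpy as np

def shoelace_area(vertices):
    """Polygon area via the shoelace formula; vertices are (x, y) pairs."""
    x, y = vertices[:, 0], vertices[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# A tiny L-shaped object made of three pixels (their center coordinates).
pixels = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])

pixel_area = len(pixels)  # whole-pixel count: 3

# All three centers lie on the convex hull; the hull polygon is a right
# triangle with legs of length 1, so its polygon area is only 0.5.
hull_area = shoelace_area(pixels)

solidity = pixel_area / hull_area  # 3 / 0.5 = 6.0, far above 1
print(solidity)  # 6.0
```

The same mismatch shrinks toward 1 for large objects, which matches the observation that small objects are the ones affected.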
I would go and hunt for this myself, but I have never looked into the CP code. Plus, for such core functionality it is better if a core developer looks into it.
Thank you very much in any case!
We are currently already handling the "offending" >1.0 objects along the lines you suggest, by clamping them down to a value of 1.0 in downstream Python scripts.
Your suspicion is entirely correct; the implementation is here. We will certainly look into it ourselves and see if the convex hull area implementation needs to be updated and try to identify how we’d better handle it. We’ll also plan to add a mathematical test to cover precisely this issue in the future.
\O/ Thank you so much!!!