Equal sign in identifyprimaryobjects.py


I’m not sure if this really is a bug or is caused by my specific data and pipeline. But for me it does not make sense…

Since I’m working on bright field images I have to invert the original image to use IdentifyPrimaryObjects. As threshold method I’m using “Binary Image”. This binary image is produced by my own module. For some images I recognized that there are objects identified at positions where my binary image is black. After debugging I found out that this is caused by the following piece of code:

binary_image = (blurred_image >= local_threshold) & mask
blurred_image is the original image inverted, the mask is black, and local_threshold appears to be the inverted binary image (objects black, background white). The original image contains a black frame whose pixels have value 0, so the inverted image, i.e. the blurred_image, has value 255 there. Because of the ">=", the newly computed binary_image is white at positions where the original input binary image is not, which leads to the falsely identified objects.
Vice versa, if there were pixels with value 0 in the background of the blurred_image, which might be possible for fluorescence images, and also in the input binary image, the new binary_image would be white there too, which wouldn't make sense at all.
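To make the failure mode concrete, here is a small NumPy sketch of the comparison in question. The 5x5 arrays are made up for illustration (not Linda's actual data), and it assumes, as described above, that local_threshold is the inverted binary image:

```python
import numpy as np

# Toy data mimicking the report: an inverted brightfield image whose black
# frame becomes maximal values (the "255" frame, here 1.0), and a binary
# image that is black (False) along that frame.
blurred_image = np.full((5, 5), 0.4)
blurred_image[0, :] = 1.0          # inverted black frame -> maximal values
binary = np.zeros((5, 5), bool)
binary[2, 2] = True                # the only real object pixel
mask = np.ones((5, 5), bool)

# "Binary Image" thresholding: the inverted binary image acts as a
# per-pixel threshold (objects -> 0.0, background -> 1.0).
local_threshold = np.where(binary, 0.0, 1.0)

with_eq = (blurred_image >= local_threshold) & mask    # the shipped code
without_eq = (blurred_image > local_threshold) & mask  # equals sign removed

# The frame row reaches the background threshold of 1.0, so ">=" wrongly
# promotes it to foreground; ">" does not.
print(with_eq[0].any(), without_eq[0].any())   # True False
```

The real object pixel at (2, 2) is kept by both variants; only the frame row differs.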
The error does not occur when omitting the equals sign:

binary_image = (blurred_image > local_threshold) & mask
Is this a bug or what’s the point in that equal sign?

Thank you and best wishes,

Hi Linda,

Thanks for bringing this issue up, and for taking the time to delve into the code. Would you mind posting your pipeline, an original image and your binary image for a case where this occurs? I think I see what your explanation is getting at, but having a concrete example would be really helpful.


Hi Mark,

I wanted to use the last released version of CP (r11710) to create the pipeline for you, because I wasn't sure whether you would be able to use a pipeline created by the version I'm working on (the bug_fix branch). Apparently the bug was not present in the released version. So I ran the pipeline on the version I'm working on, and the error occurred: there are objects identified in the upper left corner, while the binary image is completely black in that region.

Here is my pipeline:
bug.cp (5.64 KB)

Here is the original image:

And here is the binary image:

Unfortunately I wasn't able to look at the code of the released version, since it's an .app file without source code. But I guess there must be a difference in the piece of code I posted above. It's part of the run method in identifyprimaryobjects.py.

I hope this helps!

I think this qualifies as a legitimate bug, but I don't think the fix is as easy as simply removing the equals sign. If you did so, and someone had a binary image which was all foreground (all white), the areas where the image value was 0 would no longer get identified; basically the reverse of your problem.
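To illustrate that reverse case, here is a small NumPy sketch with made-up toy arrays (not CellProfiler's actual code path), again assuming the inverted binary image serves as the per-pixel threshold:

```python
import numpy as np

# Reverse case: an all-white (all-foreground) binary image and a
# fluorescence-style image containing true zeros.
img = np.array([[0.0, 0.3],
                [0.5, 0.0]])
binary = np.ones(img.shape, bool)             # everything is foreground
mask = np.ones(img.shape, bool)
local_threshold = np.where(binary, 0.0, 1.0)  # inverted binary as threshold

strict = (img > local_threshold) & mask      # equals sign removed
inclusive = (img >= local_threshold) & mask  # shipped behaviour

# With ">", the zero-valued pixels drop out of the all-foreground binary
# image; with ">=" all four pixels are kept, as the binary image demands.
print(strict.sum(), inclusive.sum())   # 2 4
```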

I’ve filed this as a bug here: github.com/CellProfiler/CellProfiler/issues/774. In the meantime, if you like, you can download the 2.1 version from our trunk build page (subject to the caveats therein); it appears to work on your images, though I think the fundamental bug is still present.

For the upcoming release, we've factored out the thresholding code, and the binary-image code path never reaches the code that does the comparison (see below).
If you look at:
binary_image = (blurred_image >= local_threshold) & mask

the local_threshold may be a single value or a numpy array giving the threshold at each pixel. The comparison evaluates to a boolean array that is True for every pixel at or above the threshold. This is then AND-ed with the mask to pick only unmasked, above-threshold pixels.
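As a sketch of how that one line behaves in both cases (the 2x2 arrays below are made up for illustration): NumPy broadcasting handles the scalar and the per-pixel threshold identically, and the "&" keeps only unmasked pixels.

```python
import numpy as np

blurred_image = np.array([[0.2, 0.6],
                          [0.8, 0.1]])
mask = np.array([[True, True],
                 [False, True]])   # (1, 0) is masked out

# Scalar threshold: one global value, broadcast over the whole image.
scalar_result = (blurred_image >= 0.5) & mask

# Per-pixel threshold: an array the same shape as the image (adaptive case).
per_pixel = np.array([[0.1, 0.7],
                      [0.5, 0.05]])
adaptive_result = (blurred_image >= per_pixel) & mask

print(scalar_result)    # [[False  True] [False False]]
print(adaptive_result)  # [[ True False] [False  True]]
```

Note that (1, 0) is above threshold in both cases but is suppressed by the mask.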

    def threshold_image(self, image_name, workspace, wants_local_threshold=False):
        """Compute the threshold using whichever algorithm was selected by the user

        image_name - name of the image to use for thresholding
        workspace - get any measurements / objects / images from the workspace

        returns: thresholded binary image
        """
        # Retrieve the relevant image and mask
        image = workspace.image_set.get_image(image_name,
                                              must_be_grayscale = True)
        img = image.pixel_data
        mask = image.mask
        if self.threshold_scope == TS_BINARY_IMAGE:
            binary_image = workspace.image_set.get_image(
                self.binary_image.value, must_be_binary = True).pixel_data
            # (call name reconstructed; its first line was lost in the post)
            self.add_fg_bg_measurements(
                workspace.measurements, img, mask, binary_image)
            if wants_local_threshold:
                return binary_image, None
            return binary_image
        local_threshold, global_threshold = self.get_threshold(
            image, mask, workspace)
        if self.threshold_smoothing_choice == TSM_NONE:
            blurred_image = img
            sigma = 0
        else:
            if self.threshold_smoothing_choice == TSM_AUTOMATIC:
                sigma = 1
            else:
                # Convert from a scale into a sigma. What I've done here
                # is to structure the Gaussian so that 1/2 of the smoothed
                # intensity is contributed from within the smoothing diameter
                # and 1/2 is contributed from outside.
                sigma = self.threshold_smoothing_scale.value / 0.6744 / 2.0
            def fn(img, sigma=sigma):
                return scipy.ndimage.gaussian_filter(
                    img, sigma, mode='constant', cval=0)
            blurred_image = smooth_with_function_and_mask(img, fn, mask)
        if hasattr(workspace, "display_data"):
            workspace.display_data.threshold_sigma = sigma
        binary_image = (blurred_image >= local_threshold) & mask
        # (call name reconstructed; its first line was lost in the post)
        self.add_fg_bg_measurements(
            workspace.measurements, img, mask, binary_image)
        if wants_local_threshold:
            return binary_image, local_threshold
        return binary_image
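The scale-to-sigma conversion described in the comment above can be checked numerically. A minimal sketch, under the assumption that the comment is read in one dimension (the scale value 10.0 and the helper gaussian_mass_within are made up for illustration): 0.6744 is approximately the third-quartile z-score of the standard normal, so a Gaussian with sigma = radius / 0.6744 has half its mass within +/- radius, i.e. within the smoothing diameter.

```python
import math

def gaussian_mass_within(radius, sigma):
    # Fraction of a zero-mean Gaussian's mass inside [-radius, +radius],
    # i.e. CDF(radius) - CDF(-radius), computed via the error function.
    z = radius / (sigma * math.sqrt(2.0))
    return math.erf(z)

scale = 10.0                       # hypothetical smoothing scale (a diameter)
sigma = scale / 0.6744 / 2.0       # the module's conversion
print(round(gaussian_mass_within(scale / 2.0, sigma), 3))  # ~0.5
```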

This issue has been resolved in CellProfiler 2.1 and later, new releases of which can now be downloaded from http://cellprofiler.org.