IHC quantification of CD68-stained surface area

Hello all,

Beginner here! I am trying to quantify the area stained by CD68 in my slides. CD68 stains macrophages and foreign body giant cells (FBGCs), which are going crazy trying to surround my constructs (the white circles in the picture; the constructs dissolve during paraffin embedding). My construct is around 100 µm thick, and sometimes it is so completely surrounded by FBGCs that it is not possible to tell whether it is just one giant cell or several. Staining is with NovaRED, not DAB.

I have tried Fiji (manual color deconvolution, then thresholding and measuring the surface area), and I also tried the Trainable Weka Segmentation tool. But I was wondering whether there is a way to do it with QuPath in an automated way.

Can you help me? I still have to take the images, so I can also acquire tiles; that’s why I am very interested in QuPath.

Please find some images from several time points in the Google Drive folder linked below. Thank you.
https://drive.google.com/drive/folders/1VoisaECij8DpsOO-aChl1yOrOHWoq4DP?usp=sharing

That doesn’t look too bad, though there are many different ways of characterizing the differences, so I am not really sure what you are looking for. Here are some options.


Cell counting (my settings were not great, but I didn’t have your actual pixel size, so it ended up being very fudgy).

Positive pixel count

SLICs for percent area, or area measurements (colored by DAB intensity).

Any of those could be automated fairly easily provided the images came with pixel size metadata, and some of them could be done with greater difficulty even without pixel sizes.
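
In case it helps later: each of those commands gets recorded in QuPath’s Workflow tab, and the Create script button there turns the recorded steps into a Groovy script that can then be applied to every image with Run for project in the script editor. A very rough skeleton of what such a script looks like (the stain names/values below are placeholders for NovaRED, to be replaced with whatever the Estimate stain vectors command gives you, and the detection line should be copied from your own recorded workflow):

```groovy
// Rough skeleton of a recorded QuPath workflow script (Groovy).
// The stain vectors are placeholders for NovaRED – estimate them in QuPath
// and copy the line the Workflow tab records. Likewise for the detection
// step: run it once from the menu, then copy the exact runPlugin(...) line
// QuPath generates for you.
setImageType('BRIGHTFIELD_OTHER')
setColorDeconvolutionStains('{"Name" : "H-NovaRED estimated", "Stain 1" : "Hematoxylin", "Values 1" : "0.65 0.70 0.29", "Stain 2" : "NovaRED", "Values 2" : "0.50 0.70 0.50", "Background" : "255 255 255"}')
selectAnnotations()
// runPlugin('<detection class recorded in the Workflow tab>', '{<its parameters>}')
```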

Hello Mike,

Thank you very much for your reply. Sorry that I forgot to include the scale. It is 0.086 µm/pixel.

The problem with positive cell counting is that I get a huge variation in cell size, from single macrophages to FBGCs with tens of nuclei, in addition to the problem of cells overlapping one another. So I believe the positively stained area is the best measure in my case.

Analyze > Region Identification > Positive Pixel Count seems to be the best option. I will need to run this across 10 HPFs per sample, across 120 samples… so automation is very important for me.

Is there a recommended way to edit the scale of an image in QuPath? ZenPro (Zeiss) doesn’t save the scale in TIFF files.

For my application, do you recommend that I stick with 0.1.2 or go for 0.2?

Thank you for your help.

0.1.2 or 0.1.3 should be fine for that. If you have yet to take your images, I would recommend setting the acquisition settings in Zen to 8-bit color per channel so that you can simply open the CZI files directly.

If you are dead set on exported TIFFs, then you will probably want to set up a Fiji macro to adjust Image → Properties in bulk to add the pixel sizes back in. QuPath does not yet really allow you to do this, although Pete may have a hacky workaround.
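
For the bulk Image → Properties adjustment, something along these lines would do it (a rough Groovy sketch for Fiji’s Script Editor rather than a classic .ijm macro; the folder path is a placeholder, and it’s worth checking that QuPath actually picks up the calibration from the re-saved TIFFs):

```groovy
// Set the pixel size on every exported TIFF in a folder and re-save it.
// Run from Fiji's Script Editor with the language set to Groovy.
import ij.IJ

def dir = new File('/path/to/exported/tiffs')   // placeholder folder
double pixelSizeMicrons = 0.086                 // the value from your microscope

dir.listFiles().findAll { it.name.toLowerCase() ==~ /.*\.tiff?/ }.each { f ->
    def imp = IJ.openImage(f.absolutePath)
    if (imp == null) return                     // skip anything that will not open
    def cal = imp.getCalibration()
    cal.pixelWidth  = pixelSizeMicrons
    cal.pixelHeight = pixelSizeMicrons
    cal.setUnit('micron')
    imp.setCalibration(cal)
    IJ.saveAsTiff(imp, f.absolutePath)          // overwrites the exported file
    imp.close()
}
```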

Oh great. I didn’t know that you can work with .CZI images directly. That’s really handy.

I have another question:
I want to make an ROI, say a rectangle that will be the same size in all of my images, but be prompted before each image is automatically analyzed so I can move the rectangle to the correct position and then click OK or something. As you have seen in my images, the location of the stained region changes, and if I measure the whole HPF the result will be biased. Thank you.

You can use Objects → Specify annotation, or there’s more detail (with a script) at https://petebankhead.github.io/qupath/scripting/2018/03/09/script-create-fixed-size-region.html
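
The linked post has the details, but the core of it looks roughly like this (a sketch against the 0.1.x API, with placeholder sizes; a couple of method and constructor names changed slightly in 0.2.0):

```groovy
// Create a rectangle annotation of fixed physical size (values are placeholders).
// Drag it into position over the stained region before running the analysis.
import qupath.lib.objects.PathAnnotationObject
import qupath.lib.roi.RectangleROI

double widthMicrons  = 500
double heightMicrons = 500

def imageData = getCurrentImageData()
def server = imageData.getServer()
double w = widthMicrons  / server.getPixelWidthMicrons()
double h = heightMicrons / server.getPixelHeightMicrons()

// Start it in the image centre; it can be moved afterwards
double x = server.getWidth()  / 2 - w / 2
double y = server.getHeight() / 2 - h / 2

def annotation = new PathAnnotationObject(new RectangleROI(x, y, w, h))
imageData.getHierarchy().addPathObject(annotation, false)
```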

A couple of things I didn’t think of in the middle of the night: 0.1.3 and 0.2.0 have a slightly different version of positive pixel detection than 0.1.2. The changes are listed here: https://petebankhead.github.io/qupath/2018/03/19/qupath-updates.html
It might be that the newer version will be more stable/helpful for you.

Also, I brought up the 8-bit setting because the default on most Observer systems is 14-bit, which will not work with QuPath by default. You can use Zen to down-convert old files to 8-bit, but the results are pretty terrible and the display settings get really squished. It looks like it treats the 14-bit image as a 16-bit image when converting it to the 0–255 range, or something like that; if so, a 14-bit maximum of 16383 scaled as if it were 16-bit ends up around 16383/65535 × 255 ≈ 64, so the converted image only uses the bottom quarter of the 8-bit range.

Thank you. I will give it a go, although I still think I will have to manually position the annotation, depending on the direction of my construct.

And BTW, thanks for this awesome application in the first place.

Thanks again. I think I will go with 0.2.0. I had no idea that there was even a 0.1.3.

I hate Zen, but I am stuck with it because all the microscopes in our lab are from Zeiss.

I took several images yesterday, and I will give it a go today and update the post.

There wasn’t, really… while ‘between jobs’ I made some minimal-ish changes to v0.1.2, but they were never released as a proper, easily installable version. Some people started using them, and in the absence of a better name this started being called v0.1.3.

Since rejoining academia in September last year, my efforts have been focused on getting the work back on track with v0.2.0. It remains a work in progress, though… there are lots of changes and improvements, and it’s likely to take a few more months to get it into shape.

Until then, best consider v0.1.2 to be the current stable version and v0.2.0 to be the ‘test it if you’re feeling brave’ version.

Yep, same here! That was why I figured out the 8-bit workaround for the brightfield images. Though there was a Bio-Formats problem with tiled 8-bit CZI files that I am not 100% sure has been resolved… I haven’t revisited it in a while. Individual fields of view should be fine to open directly.

I would like to apologize for the delay in my reply. I wasn’t able to get it to work in a satisfactory way, and because of time restrictions I resorted to doing colour deconvolution with Fiji, thresholding to create a mask, and then measuring the positively stained area within an ROI. I am sure that if I had invested more time in QuPath it would have been possible, but I needed the data ASAP.
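
In case it is useful to anyone else, the measuring step was roughly along these lines (a minimal Groovy sketch for Fiji’s Script Editor; it assumes colour deconvolution has already been run with your own NovaRED vectors, that the resulting stain channel is the active image with the ROI drawn on it, and that the threshold value is just a placeholder):

```groovy
// Minimal sketch: measure the positively stained area fraction inside the
// active ROI of a deconvolved (8-bit) stain channel. Threshold is a placeholder.
import ij.IJ
import ij.measure.Measurements

def imp = IJ.getImage()                      // deconvolved stain channel, ROI drawn on it
IJ.setThreshold(imp, 0, 180)                 // darker pixels = more stain; adjust per experiment
def stats = imp.getStatistics(Measurements.AREA | Measurements.AREA_FRACTION)
println "ROI area: ${stats.area}, positive area fraction: ${stats.areaFraction} %"
```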

That’s not the end, though; I will investigate more and update here if I am able to get it to work as well as it did with the Fiji script. In the next set of experiments I will be using images from a slide scanner, so QuPath will really come in handy.

Thank you @Research_Associate and @petebankhead for everything.
