Extract cell detection data from segmented areas/annotations in TMA (IF images) - QuPath

I am trying to extract immune cell detection data from a multiplex-stained TMA. I am getting all the detection data as shown in the screenshot, with the total numbers together. But I have also segmented all the cores into tumor and stromal regions through manual annotation. How do I get the detection numbers for those annotated regions (tumor/stroma) separately?

Hi @Arif00002,

Aren’t these present in Show annotation measurements? (Right under Show TMA measurements, which I believe is where your screenshot is from)

Hi Melvin,

In the annotation measurements, I get something like this: detection numbers for each separate annotation.

But I want the detection numbers for each type of annotated area as a whole, meaning the total tumor or stromal area combining all of the respective annotations, and also these data for each core separately. I hope that makes clear what I am trying to get.

So it sounds like you want to cycle through all of the cores, select all annotations per class within a core (selectObjectsByClass), and then "Merge selected annotations"?

That will require a script but sounds fairly straightforward. Though, I see a lot of unclassified annotation objects. How do you expect those to be handled?
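
For a single class, the basic building blocks would look something like this rough sketch, which reuses the same calls as the full script below and assumes, purely for illustration, a classification named "Tumor":

// Rough sketch for one classification, assuming a class named "Tumor"
// Select every "Tumor" annotation in the image and merge them into a single object
def tumorAnnotations = getAnnotationObjects().findAll{ it.getPathClass() == getPathClass("Tumor") }
getCurrentHierarchy().getSelectionModel().setSelectedObjects(tumorAnnotations, null)
mergeSelectedAnnotations()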

Also, the resolveHierarchy() at the beginning of @melvingelbard’s next script will fill in the TMA-Core portion of the Annotation measurements.
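
If you want to double-check that from a script rather than the measurement tables, a quick sketch along these lines (nothing project-specific assumed) should show each annotation reporting its TMA core as its parent once the hierarchy has been resolved:

// Quick check: after resolveHierarchy(), annotations within a core are inserted below that TMA core object
resolveHierarchy()
getAnnotationObjects().each{ ann ->
    print "${ann.getPathClass()} -> parent: ${ann.getParent()}"
}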

Adding in a script that does what I mentioned above. It may be fairly slow compared to @melvingelbard's script below, and WILL change your objects, merging all annotations of the same class, per TMA core, into one object. It currently does process all of your null (unclassified) annotations as well, though I am not sure whether that is the desired behavior, since the existence of the null annotations was not explained.

print "This may take a while depending on how many annotations and cells are present"
Set classes = []
classes = getAnnotationObjects().collect{it?.getPathClass()?.toString()}
getTMACoreList().each{ core ->
    classes.each{currentClass->
        currentObjectSet = getCurrentHierarchy().getObjectsForROI(qupath.lib.objects.PathAnnotationObject, core.getROI())
        //print currentObjectSet.size()
        currentClassObjects = currentObjectSet.findAll{it.getPathClass() == getPathClass(currentClass.toString())}
        //print currentClassObjects.size()
        getCurrentHierarchy().getSelectionModel().setSelectedObjects(currentClassObjects, null)
        mergeSelectedAnnotations()
    }
}
resolveHierarchy()
print "Done"

Or, if you just want to count the total number of detections separated by the PathClass of the parent annotation without modifying your objects, I think this is what you want:

resolveHierarchy()
getTMACoreList().forEach { tma ->
    def totalDetections = [:]
    // For each annotation directly under the core, count its child detections,
    // keyed by the annotation's classification
    tma.getChildObjects().forEach { child ->
        if (!child.isCell()) {
            if (totalDetections.containsKey(child.getPathClass()))
                totalDetections[child.getPathClass()] += child.getChildObjects().size()
            else
                totalDetections.put(child.getPathClass(), child.getChildObjects().size())
        }
    }
    print "TMA (" + tma.getName() + "): " + totalDetections
}

Hi Melvin,

Thanks a million for your help. I have run the script. The first problem I am facing is that I am only getting the number of detections for one type of annotation or class. For the following image, I am only getting the number of detections in the stroma (actually the total area), but the tumor region I annotated (Tumor) as a descendant object of the stromal area is not getting detected.

[screenshot]

And another major issue is that I am only getting the cell detection values; if I use object classifiers for different cell types (in my case CD4, CD8), I am not getting that detection information, like we get in the annotation section of QuPath. [screenshot]

I am guessing this needs more scripting? It would be of great help if you could offer a solution for this. Many thanks in advance.