Cell Density Map

Hi,

I’m looking for a way to compute a cell density map based on already-segmented cells (in 2D). My goal is to visually display the density map and to quantify the locations in the tissue with a high concentration of specific cells, e.g. based on distances from other objects.

@ThomasBoudier: How does 3D Density of the 3D ImageJ Suite work? What exactly is it calculating, and what do the different parameters (Radius, NbNeighbors) control?
Should I use it with a labeled image, a binary image of objects, or an image of cell centers (generated by ultimate points)?

Is there another Fiji plugin that calculates local density of objects?
@petebankhead: Is there a way to do it in QuPath?

@haesleinhuepf: any way to do this in CLIJ?

Thanks
Ofra


Hey @Ofra_Golani,

would you mind sharing an example image? I’m asking because I wonder if you have a label map, a binary image and/or centroid positions in a table…
I would define density as the number of cells per area, or rather the number of center pixels per area… How would you define it based on distances? The average distance to touching neighbors, or the average distance to the closest 5, for example? If so, I’m happy to help in case you want to try #clij. This notebook could be a starting point. All the operations shown also do their job in 2D :wink:
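For example, a minimal Groovy sketch for the Fiji script editor (assuming CLIJ2 is installed, that the open image has one non-zero pixel per cell centre, and that the Java API exposes countNonZeroPixels2DSphere as the macro does):

import ij.IJ
import net.haesleinhuepf.clij2.CLIJ2

def clij2 = CLIJ2.getInstance()

def imp = IJ.getImage()
IJ.run(imp, "32-bit", "")                  // so the counts are not clipped by an 8/16-bit type

def spots = clij2.push(imp)
def density = clij2.create(spots)          // output image, same size/type as the input

// number of non-zero (centre) pixels within the given radius around every pixel
def radius = 100
clij2.countNonZeroPixels2DSphere(spots, density, radius, radius)

clij2.pull(density).show()
clij2.clear()

The result is an image whose pixel values are the number of cell centres within that radius, i.e. a local density.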

Cheers,
Robert


Hi @Ofra_Golani, here’s a QuPath script that will generate an ImageJ image as a starting point (the script itself appears, with a few additions, in a later reply below).

Basically, after defining the resolution of the map, each pixel in the map should give the number of centroids falling within that pixel. You can then proceed with filtering it (e.g. with Mean or Gaussian filters) depending upon how you want to define the density.
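For instance, in ImageJ terms the filtering step could be as simple as this quick sketch (the radius/sigma are in map pixels and effectively define the neighbourhood over which density is measured; fp stands in for the counts image):

import ij.process.FloatProcessor
import ij.plugin.filter.GaussianBlur
import ij.plugin.filter.RankFilters

// stand-in for the counts image (number of centroids per downsampled map pixel)
def fp = new FloatProcessor(512, 512)
fp.setf(256, 256, 1f)

new RankFilters().rank(fp, 5, RankFilters.MEAN)   // mean filter, radius 5 map pixels
new GaussianBlur().blurGaussian(fp, 2, 2, 0.01)   // or a Gaussian, sigma 2 map pixels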

(The entire map could also be generated in a slightly longer script… but I’ve written such scripts a few times now, losing them each time, which suggests it really ought to be integrated directly into QuPath sooner rather than later.)


Hi @Ofra_Golani,

The details of the computation are here. Basically, for each pixel we look for the NbNeighbors closest neighbors and sum their Gaussian contributions based on their distances.
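As a rough illustration only (not the plugin’s actual code), the value at one position is obtained along these lines, with sigma standing in for the scale set by the Radius parameter and NbNeighbors limiting how many objects contribute:

// Illustration: Gaussian-weighted density at position p, given a list of object centres
double densityAt(double[] p, List<double[]> centres, int nbNeighbors, double sigma) {
    def distances = centres.collect { c ->
        Math.sqrt((c[0]-p[0])*(c[0]-p[0]) + (c[1]-p[1])*(c[1]-p[1]) + (c[2]-p[2])*(c[2]-p[2]))
    }.sort()
    // sum the Gaussian contributions of the NbNeighbors closest objects
    return distances.take(nbNeighbors).sum { d -> Math.exp(-(d*d) / (2*sigma*sigma)) }
}

// example: three object centres, density evaluated at the origin
def centres = [[1,1,0], [5,0,0], [20,0,0]].collect { it as double[] }
println densityAt([0,0,0] as double[], centres, 2, 10)

So a larger Radius spreads each object’s contribution further, and NbNeighbors caps how many objects can contribute at any position.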

Best,

Thomas


Thanks @haesleinhuepf,

Indeed, I would define the density as the number of cells per area or the number of center pixels per area, so Count Non Zero Pixels in a Sphere is a good solution, maybe with an additional smoothing step. I did, however, encounter a strange edge effect when using Count Non Zero Pixels in a Sphere (2D).

Here is a cropped labeled image after removing label boundaries, LabelImage_nottouching.tif (6.0 MB), and the related ultimate points image, LabelImage_UltimatePoints.tif (3.0 MB). The output of Count Non Zero Pixels (radius = 100 pixels) is count_non_zero_pixels.tif (3.0 MB).

Thanks
Ofra


Hey @Ofra_Golani,

I’d say the edge effect comes from labels touching the image border.

Could you use excludeLabelsOnEdges before analysing the dataset?
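A minimal sketch of that step (same CLIJ2 assumptions as in my earlier sketch):

import ij.IJ
import net.haesleinhuepf.clij2.CLIJ2

def clij2 = CLIJ2.getInstance()

def labels = clij2.push(IJ.getImage())     // the label map
def labelsInside = clij2.create(labels)
clij2.excludeLabelsOnEdges(labels, labelsInside)

clij2.pull(labelsInside).show()            // then derive the centre points / counts as before
clij2.clear()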

Cheers,
Robert


Thanks @ThomasBoudier

So for my case I should use it with a Radius that reflects the desired size of the local environment and with a high NbNeighbors (greater than the maximal cell count per local environment), right?

Ofra

Hi @haesleinhuepf

excludeLabelsOnEdges solved the edge effect!

Thanks
Ofra


Hi @Ofra_Golani,

Yes, you are perfectly right :slight_smile:. To elaborate on the solution by @haesleinhuepf: counting non-zero pixels in an area is like a linear version of the density map, whereas the density map weights the counting by the distances of these non-zero pixels. But I guess in your case you should get similar results.
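To make the contrast concrete, a tiny illustration with made-up neighbour distances:

// uniform count within a radius R vs. Gaussian distance weighting (illustration only)
def neighbourDistances = [5.0, 12.0, 40.0, 90.0, 150.0]
double R = 100
def uniformCount = neighbourDistances.count { it <= R }                          // "linear" count
def weightedSum  = neighbourDistances.sum { Math.exp(-(it * it) / (2 * R * R)) } // distance-weighted
println "count = ${uniformCount}, weighted = ${weightedSum}"

Both measures increase with local crowding; the weighted one simply discounts the more distant objects.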

Best,

Thomas


Thanks @petebankhead

I’ll try it. I’m still hesitating over which platform to use for this project: Fiji, QuPath, or some integration :thinking:
Integrating this into QuPath would be great :slight_smile:

Ofra


Hi @petebankhead

Sorry for reviving an old thread, but I thought someone else might like this “hack”: once you have the count array (fp), you can assign those counts as a new measurement for each cell (e.g. ‘Positive Density’) and then display a positive-cell density heatmap right inside QuPath (this works better with filled cells, toggled with the F key).

I just added a few lines to your code:

/**
 * Create a 'counts' image in QuPath that can be used to compute the local density of specific objects.
 *
 * This implementation uses ImageJ to create and display the image, which can then be filtered as required.
 * 
 * Written for QuPath v0.2.0.
 *
 * @author Pete Bankhead
 */

import ij.ImagePlus
import ij.process.FloatProcessor
import qupath.imagej.gui.IJExtension
import qupath.lib.objects.PathObjectTools
import qupath.lib.regions.RegionRequest

import static qupath.lib.gui.scripting.QPEx.*
import qupath.imagej.tools.IJTools

// Define the resolution at which the image should be generated
double requestedPixelSizeMicrons = 200

// Get the current image
def imageData = getCurrentImageData()
def server = imageData.getServer()
// Set the downsample directly (without using the requestedPixelSize) if you want; 1.0 indicates the full resolution
double downsample = requestedPixelSizeMicrons / server.getPixelCalibration().getAveragedPixelSizeMicrons()
def request = RegionRequest.createInstance(server, downsample)
def imp = IJTools.convertToImagePlus(server, request).getImage()

// Get the objects you want to count
// Potentially you can add filters for specific objects, e.g. to get only those with a 'Positive' classification
def detections = getDetectionObjects()
def positiveDetections = detections.findAll {it.getPathClass() == getPathClass('Positive')}

// Create a counts image in ImageJ, where each pixel corresponds to the number of centroids at that pixel
int width = imp.getWidth()
int height = imp.getHeight()
def fp = new FloatProcessor(width, height)
for (detection in positiveDetections) {
    // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
    def roi = PathObjectTools.getROI(detection, true)
    int x = (int)(roi.getCentroidX() / downsample)
    int y = (int)(roi.getCentroidY() / downsample)
    fp.setf(x, y, fp.getf(x, y) + 1 as float)
}

// Send back as measurement...
for (detection in detections) {
    // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
    def roi = PathObjectTools.getROI(detection, true)
    int x = (int)(roi.getCentroidX() / downsample)
    int y = (int)(roi.getCentroidY() / downsample)
    detection.getMeasurementList().putMeasurement('Positive Density', fp.getf(x, y))
}

and here’s what the measurement map looks like:

Maybe I could do some filtering on fp or play with smoothed object features some more. The point is that this gives me a simple way to show a positive-cell density heatmap in QuPath, and it’s also really fast to compute.

(Ki-67 slide, ROI2 eyed by a pathologist from the raw image… I know it’s their job, but aren’t they amazing? :wink: )


Yep! One of my first studies involved comparing QuPath’s Ki67 results with a pathologist’s visual estimates of the % positive tumour cells made while looking down a microscope at hundreds of small samples.

In the cases that we reviewed in some detail (and where we were pretty confident the cell detection & classification looked good), I found the level of agreement pretty astounding… leading me to wonder if/how I could learn superhuman counting skills like that.

Since you brought up the thread and smoothed object features, I figured I would point out that this is how I usually go about looking for these sorts of things (plus the hotspot scripts described elsewhere).

// This script takes the values generated by the Add smoothed features plugin to create a cell density map that can be viewed in Measure->Measurement Maps
// For example, Tumor cells would show up in the Measurement List as "Nearby cells - Tumor"
// Local density will be based on the radius used during Add smoothed features.

//////////CHANGE THIS //////
String smoothedRadius = 15
////////////////////////////

//Alternatively, remove the two lines below and run Analyze -> Calculate features -> Add smoothed features manually, with "Smooth within classes" checked.
selectAnnotations()
runPlugin('qupath.lib.plugins.objects.SmoothFeaturesPlugin', '{"fwhmMicrons": '+smoothedRadius+',  "smoothWithinClasses": true}');

classList = getCurrentHierarchy().getDetectionObjects().collect{it.getPathClass()} as Set

//Find the smoothed count


classList.each{c->
    cellList = getCellObjects().findAll{it.getPathClass() == c}
    cellList.each{
        it.getMeasurementList().putMeasurement("Nearby cells - "+ c.toString(), measurement(it,"Smoothed: "+smoothedRadius+" µm: Nearby detection counts"))
    }
    //getCellObjects().findAll{it.getPathClass() != c}.each{it.getMeasurementList().putMeasurement("Nearby cells - "+ c.toString(), 0)}
}
    
print "Done"

While this does require manually entering the smoothing value, that value could probably also be picked up from the measurement list; I did not want to deal with what might happen if multiple smoothings were run.
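If you did want to pick it up automatically, something along these lines might work (untested, and it assumes the measurement naming pattern used above):

// recover the radius from measurement names such as "Smoothed: 15 µm: Nearby detection counts"
def names = getCellObjects()[0].getMeasurementList().getMeasurementNames()
def smoothedNames = names.findAll { it.startsWith("Smoothed: ") && it.endsWith("Nearby detection counts") }
def radii = smoothedNames.collect { (it =~ /Smoothed: (\S+) /)[0][1] }.unique()
if (radii.size() == 1)
    print "Smoothing radius found: " + radii[0]
else
    print "Found " + radii.size() + " candidate radii: " + radii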

Results end up looking like this with the line near the end left commented out.


And with the line un-commented, you can black out the rest of the cells.


Hi @Research_Associate,

I couldn’t get the “smooth object features” dialog or your script to blur just the measurement I’m interested in (in this case, my newly created “Positive Density”). Every measurement seems to be blurred (and assigned a new name in the process), and this unfortunately can take some time.

Instead, I managed to apply ImageJ’s GaussianBlur on the FloatProcessor fp in place, before assigning the values back to my “Positive Density” measurement. Basically, two lines strategically placed between fp being assigned the 2D histogram of centroids and the “Positive Density” measurements being written:

/**
 * Create a 'counts' image in QuPath that can be used to compute the local density of specific objects.
 *
 * This implementation uses ImageJ to create and display the image, which can then be filtered as required.
 * 
 * Written for QuPath v0.2.0.
 *
 * @author Pete Bankhead
 */

import ij.ImagePlus
import ij.process.FloatProcessor
import ij.plugin.filter.GaussianBlur;
//import ij.plugin.filter.PlugInFilter;

import qupath.imagej.gui.IJExtension
import qupath.lib.objects.PathObjectTools
import qupath.lib.regions.RegionRequest

import static qupath.lib.gui.scripting.QPEx.*
import qupath.imagej.tools.IJTools

// Define the resolution at which the image should be generated
double requestedPixelSizeMicrons = 100
double sigma = 1.5
double accuracy = 0.01

// Get the current image
def imageData = getCurrentImageData()
def server = imageData.getServer()
// Set the downsample directly (without using the requestedPixelSize) if you want; 1.0 indicates the full resolution
double downsample = requestedPixelSizeMicrons / server.getPixelCalibration().getAveragedPixelSizeMicrons()
def request = RegionRequest.createInstance(server, downsample)
def imp = IJTools.convertToImagePlus(server, request).getImage()

// Get the objects you want to count
// Potentially you can add filters for specific objects, e.g. to get only those with a 'Positive' classification
def detections = getDetectionObjects()
def positiveDetections = detections.findAll {it.getPathClass() == getPathClass('Positive')}

// Create a counts image in ImageJ, where each pixel corresponds to the number of centroids at that pixel
int width = imp.getWidth()
int height = imp.getHeight()
def fp = new FloatProcessor(width, height)
for (detection in positiveDetections) {
    // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
    def roi = PathObjectTools.getROI(detection, true)
    int x = (int)(roi.getCentroidX() / downsample)
    int y = (int)(roi.getCentroidY() / downsample)
    fp.setf(x, y, fp.getf(x, y) + 1 as float)
}

//Here we blur fp. Increase sigma for more blurring.
def g = new GaussianBlur();
g.blurGaussian(fp, sigma, sigma, accuracy);

for (detection in detections) {
    // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
    def roi = PathObjectTools.getROI(detection, true)
    int x = (int)(roi.getCentroidX() / downsample)
    int y = (int)(roi.getCentroidY() / downsample)
    detection.getMeasurementList().putMeasurement('Positive Density', fp.getf(x, y))
}

And here’s the result:

Super nice, and fast! Once we do some testing, I’ll update this thread if we find better values for requestedPixelSizeMicrons, sigma and accuracy.

Cheers,
Egor


Yes, as far as I know, you would have to temporarily remove the measurement lists to target only certain measurements… which could be rough over millions of cells.

I do like the look of that quite a bit better, though, so I slightly adapted it to create a measurement per full class of object. I suppose it could also be made per ROI or TMA core to allow finer sampling.

/**
 * Create a 'counts' image in QuPath that can be used to compute the local density of specific objects.
 *
 * This implementation uses ImageJ to create and display the image, which can then be filtered as required.
 * 
 * Written for QuPath v0.2.0.
 *
 * @author Pete Bankhead
 */


// Define the resolution at which the image should be generated
double requestedPixelSizeMicrons = 50
double sigma = 1.5
double accuracy = 0.01

classList = getCurrentHierarchy().getDetectionObjects().collect{it.getPathClass()} as Set
// Get the current image
def imageData = getCurrentImageData()
def server = imageData.getServer()
// Set the downsample directly (without using the requestedPixelSize) if you want; 1.0 indicates the full resolution
double downsample = requestedPixelSizeMicrons / server.getPixelCalibration().getAveragedPixelSizeMicrons()
def request = RegionRequest.createInstance(server, downsample)
def imp = IJTools.convertToImagePlus(server, request).getImage()

// Get the objects you want to count
// Potentially you can add filters for specific objects, e.g. to get only those with a 'Positive' classification
def detections = getDetectionObjects()

classList.each{c->
    cellList = getCellObjects().findAll{it.getPathClass() == c}
    // Create a counts image in ImageJ, where each pixel corresponds to the number of centroids at that pixel
    int width = imp.getWidth()
    int height = imp.getHeight()
    def fp = new FloatProcessor(width, height)
    for (detection in cellList) {
        // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
        def roi = PathObjectTools.getROI(detection, true)
        int x = (int)(roi.getCentroidX() / downsample)
        int y = (int)(roi.getCentroidY() / downsample)
        fp.setf(x, y, fp.getf(x, y) + 1 as float)
    }
    
    //Here we blur fp. Increase sigma for more blurring.
    def g = new GaussianBlur();
    g.blurGaussian(fp, sigma, sigma, accuracy);
    
    
    for (detection in detections) {
        // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
        def roi = PathObjectTools.getROI(detection, true)
        int x = (int)(roi.getCentroidX() / downsample)
        int y = (int)(roi.getCentroidY() / downsample)
        detection.getMeasurementList().putMeasurement(c.toString()+' Density', fp.getf(x, y))
    }
    
}    


import ij.ImagePlus
import ij.process.FloatProcessor
import ij.plugin.filter.GaussianBlur;
//import ij.plugin.filter.PlugInFilter;

import qupath.imagej.gui.IJExtension
import qupath.lib.objects.PathObjectTools
import qupath.lib.regions.RegionRequest

import static qupath.lib.gui.scripting.QPEx.*
import qupath.imagej.tools.IJTools


Very nice @Research_Associate!

EDIT: I’ll come back to this; I think what I described only works with the right combination of dialogs open (Measurement maps), and possibly with “Update map” clicked once to set the min/max values, because I don’t do it programmatically. Sorry!

Now add Pete’s code from Problem running script from publication - #6 by Research_Associate to automatically display the measurement map when you run the script :slight_smile:

Fair enough, we may still need to adjust the min/max values of the range and the parameters we talked about, and maybe the colormap… but otherwise, a one-click cell density map!

Put this at the top:

import qupath.lib.gui.tools.*

Put this at the bottom (taking into account which measurement map you want to display):

// Print the names (just to check which you want)
println MeasurementMapper.loadDefaultColorMaps()

// Choose one of them
def colorMapper = MeasurementMapper.loadDefaultColorMaps().find {it.getName() == 'Viridis'}

// Create a measurement mapper
//def detections = getDetectionObjects()
def measurementMapper = new MeasurementMapper(colorMapper, 'Positive Density', detections)

// Show the measurement mapper in the current viewer
def viewer = getCurrentViewer()
def overlayOptions = viewer.getOverlayOptions()
overlayOptions.setMeasurementMapper(measurementMapper)
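(About the min/max issue in the EDIT above: rather than clicking “Update map”, the display range can presumably also be set on the mapper directly, e.g. by adding something like the lines below; the two setters are the ones used again in the script further down.)

// assumption: MeasurementMapper exposes these setters in this QuPath version
def densityValues = detections.collect { measurement(it, 'Positive Density') }
measurementMapper.setDisplayMinValue(0)
measurementMapper.setDisplayMaxValue(densityValues.max())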

And here’s the code as promised. It creates a toggle button in the toolbar, which can be clicked and unclicked to show or hide the cell density map. Since the calculation is quite fast, I didn’t try to optimise it further.

As per @Research_Associate’s suggestion, objectClass really should be user input, to allow checking the density of things other than positive cells (see his example above). So if you right-click on the button, a context menu lets you change the detection class and colormap on the fly. Classes are updated each time the density map button is right-clicked, to take into account any newly created classes:
(screenshots: the Class and Colormap context menus)

If the selected object class is a derived class (e.g. Tumor: Positive), the map is computed over all detections sharing the same parent class (here, Tumor: Positive and Tumor: Negative).

NOTE For the detection classes, do let me know if this is how you expect it to work. All the colormaps work except Jet. No idea what’s going on there…

In the meantime, I will drop this on our unsuspecting colleagues and gather some feedback. Let me know what you think and thanks again for all the help :slight_smile:

// A unique identifier to tag the items we will add to the toolbar
def btnId = "densitymap"

// This is the base64 encoded 32x32 GIF file
String imgString = "R0lGODlhIAAgAIQAMRQOUCWOfZS+NJ/KXPwC/FmLnxRVbKDLu05afUyuZJWOoRBydNDS0ZexvDCidlSulNbKH4x6hPn39BkzadK3t1V0lkymZDw2bHK9pd/UYlqeoy9zhK7CGayWpGy5ajRWeiH5BAEAAAQALAAAAAAgACAABAX+4FYEwaNtm7GM2qIFDvbMQaFNFKHvyLaQGEalgNo0NJpHzGLBIBuA3E4XQQU0BQa2smgwMDGMxeNxBEQUyVSnqFQ2yMCx0KiTHgdHwjM7NCgIawQUFwpEBSMBPy0PATIOeTFHClJrABOIC0aKchpgMAceCWYBFR8RggQRFQgVDUQoKi8we3t5Z5QNgh0frQUGH24jQTF6B8ceZxKnghQArRWGb1oqJLQD2GZ+HR2CF6zBbm4NBtaOGAIQHB5OGhWVO0IaBsAfExUncG8YCQMQ6mXOdFvTAUG+D8EKfNjgaoMCDQ78/eMwAEMpQQ3udRnSZsKHAgo+NMAQigOEDCf+DwT4kOrDPBEFWE240EBYkn7/MmQQgGEBPB2+XCDq5WsDDD0DTE70YOAnAQQGsIgYgiBoiVAekqrLcMCnIFMfEbGKKcJRqGMYPAgQUHGBAkEKALwZwnDIDz2k0tqq6GCgJQQx6TE06skBnhJK9vDD4HTQt5iwyl7xZM3wgTEOKqQaFK5FjQI/uqgs4aCYBTONVU2Ag8IGzBUvGkFyIMrBBzWpeoBWRKRBVyRgwFxm1yhQKgWYfHi+AgNiYVrsFhhPdSF55T4ySEA6ILzL5kGrXTQorcTaAssHEiRo0fS7XBGGySPRUKfYgRnu3m5WgMAljPINTFZCCRY9sNF3ziBERFhPV6igkmEkINKAfqk444ZRTlxBxAkkiHcAAyB999QQBiCCARwr/PLKAQVgIoQu3ylwQQeYNLBAVCIY4Ep/4kTDQAgAOw=="
ByteArrayInputStream inputStream = new ByteArrayInputStream(imgString.decodeBase64())
Image btnImg = new Image(inputStream,QuPathGUI.TOOLBAR_ICON_SIZE, QuPathGUI.TOOLBAR_ICON_SIZE, true, true)

preferredMapperName = PathPrefs.createPersistentPreference("measurementMapperLUT", "Viridis");
preferredClassName = PathPrefs.createTransientPreference("measurementClass", "Positive");

// Here we compute a measurement mapper
def DensityMap(viewer, objectClass='Positive', mapperName='Viridis', requestedPixelSizeMicrons=50, sigma=2, accuracy=0.01)
{
    def imageData = viewer.getImageData()
    if (imageData == null)
        return
        
    def server = imageData.getServer()
    
    // Set the downsample directly (without using the requestedPixelSize) if you want; 1.0 indicates the full resolution
    double downsample = requestedPixelSizeMicrons / server.getPixelCalibration().getAveragedPixelSizeMicrons()
    def request = RegionRequest.createInstance(server, downsample)
    def imp = IJTools.convertToImagePlus(server, request).getImage()

    // If we have an object of type "Stroma: positive" then it's a derived class and
    // the total detections should be that of the parent class
    def detections = getQuPath().getImageData().getHierarchy().getDetectionObjects()

    pathClass = getPathClass(objectClass)
    if (pathClass.getParentClass() == null) {
        positiveDetections = detections.findAll {it.getPathClass() == getPathClass(objectClass)}
    } else {
        positiveDetections = detections.findAll {it.getPathClass() == pathClass}
        def filteredDetections = detections.findAll {it.getPathClass().getParentClass() == pathClass.getParentClass()}
        detections = filteredDetections
    }

    // Get the objects you want to count
    // Potentially you can add filters for specific objects, e.g. to get only those with a 'Positive' classification

    // TODO Do we have an annotation selected? we can limit to the cells inside that object (maybe)

    // Create a counts image in ImageJ, where each pixel corresponds to the number of centroids at that pixel
    int width = imp.getWidth()
    int height = imp.getHeight()
    def fp = new FloatProcessor(width, height)

    for (detection in positiveDetections) {
        // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
        def roi = PathObjectTools.getROI(detection, true)
        int x = (int)(roi.getCentroidX() / downsample)
        int y = (int)(roi.getCentroidY() / downsample)
        fp.setf(x, y, fp.getf(x, y) + 1 as float)
    }

    // Here we blur fp. Increase sigma for more blurring.
    def g = new GaussianBlur()
    g.blurGaussian(fp, sigma, sigma, accuracy)

    for (detection in detections) {
        // Get ROI for a detection; this method gets the nucleus if we have a cell object (and the only ROI for anything else)
        def roi = PathObjectTools.getROI(detection, true)
        int x = (int)(roi.getCentroidX() / downsample)
        int y = (int)(roi.getCentroidY() / downsample)
        detection.getMeasurementList().putMeasurement(objectClass+' Density', fp.getf(x, y))
    }

    // Choose one of them
    def colorMapper = MeasurementMapper.loadDefaultColorMaps().find {it.getName() == mapperName}

    // Create a measurement mapper
    def measurementMapper = new MeasurementMapper(colorMapper, objectClass+' Density', detections)
    def minValue = fp.getMin()
    def maxValue = fp.getMax()
    measurementMapper.setDisplayMinValue(minValue)
    measurementMapper.setDisplayMaxValue(maxValue)
        
    // Show the images
    //IJExtension.getImageJInstance()
    //imp.show()
    //new ImagePlus(imp.getTitle() + "-counts", fp).show()

    return measurementMapper
}

// Remove all the additions made to the toolbar based on the id above
def RemoveToolItems(toolbar, id) {
    while(1) {
        hasElements = false
        for (var tbItem : toolbar.getItems()) {
            if (tbItem.getId() == id) {
                toolbar.getItems().remove(tbItem)
                hasElements = true
                break
            }
        }
        if (!hasElements) break
    }
}

// Create a Submenu from entries and handle the actions
def MakeSubMenu(itemNames, selectedName) {
    def menuItems = []
    itemNames.each{
        CheckMenuItem item = new CheckMenuItem(it)
        menuItems << item
        if (it == selectedName)
            item.setSelected(true)
        else
            item.setSelected(false)

        item.setOnAction( event -> {
            def viewer = gui.getViewer()
            def overlayOptions = viewer.getOverlayOptions()
            if (overlayOptions == null)
                return
            
            // Here we retrieve the preferred option stored as userdata in the (sub)menu entry
            def preferredOption = event.getTarget().getParentMenu().getUserData()
            def itemString = event.getSource().getText()

            // We check all the items one by one to find which one was selected.
            // It gets a tick and is used to store the entry name into the preferred option
            // Tick is removed from the other entries
            menuItems.each{
                if (it.getText() == itemString) {
                    it.setSelected(true)
                    preferredOption.set(itemString)

                    // User selected an option, so let's toggle the map
                    // Remove if you don't want this behaviour
                    btnCustom.setSelected(true)

                    if (btnCustom.isSelected()) {
                        def className = preferredClassName.get()
                        def mapperName = preferredMapperName.get()

                        measurementMapper = DensityMap(viewer, className, mapperName, 50, 2, 0.01)

                        // Show the measurement mapper in the current viewer
                        overlayOptions.setMeasurementMapper(measurementMapper)
                        overlayOptions.setFillDetections(true)
                        overlayOptions.setOpacity(0.5)
                    }
                } else
                    it.setSelected(false)
            }
        })
    }
    return menuItems
}

Platform.runLater {
    gui = QuPathGUI.getInstance()
    toolbar = gui.getToolBar()

    // First we remove the items already in place    
    RemoveToolItems(toolbar,btnId)

    // Reset the display
    def viewer = gui.getViewer()
    def overlayOptions = viewer.getOverlayOptions()
    if (overlayOptions == null)
        return
        
    //Reset the annotations
    overlayOptions.resetMeasurementMapper()
    overlayOptions.setFillDetections(false)
    overlayOptions.setOpacity(1)

    // Here we add a separator
    sepCustom = new Separator(Orientation.VERTICAL)
    sepCustom.setId(btnId)
    toolbar.getItems().add(sepCustom)    
        
    // Here we add a toggle button
    btnCustom = new ToggleButton()
    btnCustom.setId(btnId)
    toolbar.getItems().add(btnCustom)
    
    // The button is given an icon encoded as base64 above
    ImageView imageView = new ImageView(btnImg)
    btnCustom.setGraphic(imageView)
    btnCustom.setTooltip(new Tooltip("Overlay a density map"))

    // Add the context menu with the colormaps (right click)
    ContextMenu contextMenu = new ContextMenu()

    Menu item1 = new Menu("Class")
    Menu item2 = new Menu("Colormap")

    //Here we set the preference paths as userdata
    item1.setUserData(preferredClassName)
    item2.setUserData(preferredMapperName)

    contextMenu.getItems().addAll(item1, item2)
        
    // Here we regenerate the context menu from getAvailablePathClasses()
    // every time the context menu is clicked.
    contextMenu.setOnShowing((event) -> {
        def classNames = ["Positive","Negative"]+QuPathGUI.getInstance().getAvailablePathClasses()*.toString()
        classNames.unique()

        //Dialogs.showInfoNotification("Custom button", "menu event")
        item1.getItems().clear();
        def classItems = MakeSubMenu(classNames, preferredClassName.get())
        item1.getItems().addAll(classItems)
    });

    def colorMappers = MeasurementMapper.loadColorMappers()*.getName()
    def cmapItems = MakeSubMenu(colorMappers, preferredMapperName.get())
    item2.getItems().addAll(cmapItems)

    // Button context menu and click action
    btnCustom.setContextMenu(contextMenu)
    btnCustom.setOnAction {
        viewer = gui.getViewer()
        overlayOptions = viewer.getOverlayOptions()
        if (overlayOptions == null)
            return
        
        if (btnCustom.isSelected()) {
            def className = preferredClassName.get()
            def mapperName = preferredMapperName.get()
            measurementMapper = DensityMap(viewer, className, mapperName, 50, 2, 0.01)
            // Show the measurement mapper in the current viewer
            overlayOptions.setMeasurementMapper(measurementMapper)
            overlayOptions.setFillDetections(true)
            overlayOptions.setOpacity(0.5)
        } else {
            overlayOptions.resetMeasurementMapper()
            overlayOptions.setFillDetections(false)
            overlayOptions.setOpacity(1)
        }
    }
}

import javafx.application.Platform
import javafx.stage.Stage
import javafx.scene.Scene
import javafx.geometry.Insets
import javafx.geometry.Pos
import javafx.geometry.Orientation
import javafx.scene.control.*
import javafx.scene.layout.*
import javafx.scene.input.MouseEvent
import javafx.beans.value.ChangeListener
import javafx.scene.image.Image
import javafx.scene.image.ImageView

import ij.ImagePlus
import ij.process.FloatProcessor
import ij.plugin.filter.GaussianBlur
import qupath.imagej.gui.IJExtension
import qupath.imagej.tools.IJTools
import qupath.lib.objects.PathObjectTools
import qupath.lib.regions.RegionRequest
import qupath.lib.gui.tools.MeasurementMapper
import qupath.lib.gui.prefs.PathPrefs
import qupath.lib.gui.QuPathGUI

import static qupath.lib.gui.scripting.QPEx.*

EDIT1: Small edit to the code so that it works when no image is present but the button is clicked.
EDIT2: Added a context menu. Hopefully the class selection works as people expect it to.
EDIT3: The parent class is now taken into account for derived classes. Now we need a better way to create the list of classes available in the image, possibly based on promptToPopulateFromImage() (these lines) rather than relying on getAvailablePathClasses() like I’m doing now.
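(An interim option, sketched here and untested, would be to build the list from the classes actually present among the detections, much as the earlier scripts in this thread do:)

// run in the QuPath script editor; lists the classes actually present among detections
def detectionClasses = getCurrentHierarchy().getDetectionObjects().collect { it.getPathClass() }
def presentClassNames = detectionClasses.findAll { it != null }*.toString().unique()
println presentClassNames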


That is one impressive imgString :slight_smile:

And thanks for including the stackoverflow link. I had no idea at first. Putting it all together now.


That is one impressive imgString :slight_smile:

Thanks, I could’ve broken it into nice 80-character-wide chunks, but it’s not that bad, is it? Especially compared to the first rocket string I found and used, from here. That made me realise that for 32x32-pixel icons, 32 colours are most likely enough, and GIF files are much less verbose than PNGs, so they make for shorter base64-encoded strings.
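For what it’s worth, producing such a string is a one-liner in Groovy (the file path below is just a placeholder):

// hypothetical path; prints the base64 string to paste into imgString
def iconBytes = new File('/path/to/icon.gif').bytes
println iconBytes.encodeBase64().toString()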

The image itself is just a square region from one of my Ki-67 slides, with pink pixels added manually to mark the transparent background. I wasn’t sure what else could have suggested “density maps” in the same way :sweat_smile:


Thanks again for your help!
