Exporting Detection Labels from QuPath

Exporting label masks

Background

  • I would like to export labels per detection for an entire whole slide image (WSI). This script, found on the forums, works well: it creates a mask and outlines the boundaries between cells. The image above shows the result:
import qupath.lib.regions.*
import ij.*
//import java.awt.Color
import java.awt.*
import java.awt.image.BufferedImage
import javax.imageio.ImageIO

// Create a grayscale mask image at the chosen downsample (displayed in ImageJ below)
double downsample = 1.0
def server = getCurrentImageData().getServer()
int w = (server.getWidth() / downsample) as int
int h = (server.getHeight() / downsample) as int
def img = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY)


def g2d = img.createGraphics()
g2d.scale(1.0/downsample, 1.0/downsample)
g2d.setColor(Color.WHITE)
for (detection in getDetectionObjects()) {
    def roi = detection.getROI()
    def shape = roi.getShape()
    g2d.setPaint(Color.WHITE)
    g2d.fill(shape)
    g2d.setStroke(new BasicStroke(4)) // 4-pixel wide pen for the boundary
    g2d.setPaint(Color.BLACK)
    g2d.draw(shape)
}

g2d.dispose()
new ImagePlus("Mask", img).show()
def name = getProjectEntry().getImageName() //+ '.tiff'
def path = buildFilePath(PROJECT_BASE_DIR, 'mask')
mkdirs(path)
def fileoutput = new File(path, name + '-mask.tiff')
ImageIO.write(img, 'tiff', fileoutput)
println('Mask exported to ' + fileoutput)

However, I would like to write a full-resolution OME-TIFF and use labels or multichannel output. It seems something like this is the way to go (https://qupath.readthedocs.io/en/latest/docs/advanced/exporting_annotations.html):

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// Define output path (relative to project)
def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-labels.ome.tif")

// Define how much to downsample during export (may be required for large images)
double downsample = 1

// Create an ImageServer where the pixels are derived from annotations
def labelServer = new LabeledImageServer.Builder(imageData)
 .backgroundLabel(0, ColorTools.WHITE) // Specify background label (usually 0 or 255)
 .downsample(downsample)    // Choose server resolution; this should match the resolution at which tiles are exported
 .useCells()
 .multichannelOutput(false) // If true, each label refers to the channel of a multichannel binary image (required for multiclass probability)
 .build()

// Write the image
writeImage(labelServer, path)

This does not seem to work as I have scripted it. I just get a blank ome.tiff exported. What am I missing here?

Also, can I use the builder to create binary masks with a 1 px boundary between cells (as shown in the image above) instead of labels?

Analysis goals

  • Export a full-resolution OME-TIFF of cell detection labels/masks with boundaries.

Challenges

  • Not able to export detection labels using the script above.

Thanks in advance for any guidance.


I am not extremely good with labelServers, but it looks like you should get a completely black image, since you aren’t passing it any classes labeled as anything other than background. The code you link to shows setting various classes to a given label value, but you have none set.
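For example, a minimal sketch of that kind of mapping (“Tumor” and “Stroma” here are placeholder class names; without at least one .addLabel() call, every pixel ends up as background):

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// With no useCells() call, labels are taken from classified annotations (the default)
def labelServer = new LabeledImageServer.Builder(imageData)
    .backgroundLabel(0, ColorTools.WHITE)  // anything not mapped below -> 0
    .downsample(1.0)
    .addLabel('Tumor', 1)                  // annotations classified as Tumor -> pixel value 1
    .addLabel('Stroma', 2)                 // annotations classified as Stroma -> pixel value 2
    .multichannelOutput(false)
    .build()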


Thanks for the suggestion. I was under the impression I could use “.useCells()” instead of specifying the classes:

.useCells()

I classified the cell detections as “Tumor”, then added the label definition to the script. Still no luck.

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// Define output path (relative to project)
def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-labels.ome.tif")

// Define how much to downsample during export (may be required for large images)
double downsample = 1

// Create an ImageServer where the pixels are derived from annotations
def labelServer = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE) // Specify background label (usually 0 or 255)
  .downsample(downsample)    // Choose server resolution; this should match the resolution at which tiles are exported
  //.useCells()
  .addLabel('Tumor', 1) 
  .multichannelOutput(false) // If true, each label refers to the channel of a multichannel binary image (required for multiclass probability)
  .build()

// Write the image
writeImage(labelServer, path)

This is the image I get (viewed in IrfanView):

[screenshot of the exported image]

Essentially a blank image. Do I have to account for the parent annotation somehow?

Thanks again.


Ah, well, you commented out/unselected cells, essentially, so again it should be black unless you have annotations labeled as Tumor. Annotations are the default; it only checks cells for the Tumor class if you have .useCells() chosen.

Thanks for including your scripts each time!

Ah, thanks for catching that! Almost there!

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// Define output path (relative to project)
def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-labels.ome.tif")

// Define how much to downsample during export (may be required for large images)
double downsample = 1

// Create an ImageServer where the pixels are derived from annotations
def labelServer = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE) // Specify background label (usually 0 or 255)
  .downsample(downsample)    // Choose server resolution; this should match the resolution at which tiles are exported
  .useCells()
  .addLabel('Tumor', 1)
  .lineThickness(2) 
  .multichannelOutput(false) // If true, each label refers to the channel of a multichannel binary image (required for multiclass probability)
  .build()

// Write the image
writeImage(labelServer, path)

I see that the builder has an option to outline borders:

 .lineThickness(2) 

Is this option available for annotations only, or can we add it to cell-level labels? It does not seem to apply. Here is the output:

[screenshot of the exported labels]

Solved. I had to add a line mapping the boundary to its own label. Thanks for the help!

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// Define output path (relative to project)
def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-labels.ome.tif")

// Define how much to downsample during export (may be required for large images)
double downsample = 1

// Create an ImageServer where the pixels are derived from annotations
def labelServer = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE) // Specify background label (usually 0 or 255)
  .downsample(downsample)    // Choose server resolution; this should match the resolution at which tiles are exported
  .useCells()
  .addLabel('Tumor', 1)
  .lineThickness(2)
  .setBoundaryLabel('Ignore', 2) 
  .multichannelOutput(false) // If true, each label refers to the channel of a multichannel binary image (required for multiclass probability)
  .build()

// Write the image
writeImage(labelServer, path)

[screenshot of the exported labels with boundaries]
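As a follow-up to the earlier question about binary masks with a 1 px boundary between cells: one plausible variant (a sketch only, assuming the cells are classified as “Tumor”) is to map the cell class to a single foreground value and map the boundary back to the background value, so touching cells are separated by background rather than by a separate label:

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-binary-mask.ome.tif")

// Binary mask: cells -> 1, background and the boundary between cells -> 0
def maskServer = new LabeledImageServer.Builder(imageData)
    .backgroundLabel(0, ColorTools.WHITE)
    .downsample(1.0)
    .useCells()
    .addLabel('Tumor', 1)              // every cell classified as Tumor -> 1
    .lineThickness(1)                  // 1-pixel boundary
    .setBoundaryLabel('Boundary*', 0)  // draw the boundary with the background value 0
    .multichannelOutput(false)
    .build()

writeImage(maskServer, path)

The 'Boundary*' name follows the convention used in the QuPath docs; whether a boundary label equal to the background value reproduces exactly what the earlier ImageJ script does is worth verifying on a small region first.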