QuPath ome.tiff export tiling artifact

Sample image showing tiling artifact

I am running QuPath StarDist on an ome.tiff WSI. It works great; there are no tiling artifacts in the actual segmentation or in the raw image. I am now trying to export a mask of this segmentation. I use the following script:

import qupath.lib.images.servers.LabeledImageServer

def imageData = getCurrentImageData()

// Define output path (relative to project)
def outputDir = buildFilePath(PROJECT_BASE_DIR, 'export')
mkdirs(outputDir)
def name = GeneralTools.getNameWithoutExtension(imageData.getServer().getMetadata().getName())
def path = buildFilePath(outputDir, name + "-mask4.ome.tif")


// Convert output resolution to a downsample factor
double downsample = imageData.getServer().getPixelCalibration().getAveragedPixelSize()


// Create an ImageServer where the pixels are derived from annotations
def labelServer = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE) // Specify background label (usually 0 or 255)
  .downsample(downsample)    // Choose server resolution; this should match the resolution at which tiles are exported
  .useCells()
  .addLabel('Tumor', 1)
  .lineThickness(2)
  .setBoundaryLabel('Ignore*', 0) 
  .multichannelOutput(false) // If true, each label refers to the channel of a multichannel binary image (required for multiclass probability)
  .tileSize(512, 512)
  .build()

// Write the image
writeImage(labelServer, path)

Challenges

The image is created, but there seem to be tile stitching artifacts. Is it possible to define a tile overlap or some other parameter in the LabeledImageServer builder to mitigate this issue?

Thank you in advance.

The artifacts are occurring because of some rounding trouble. It looks like you might be upsampling the image for export, and this results in gaps being introduced.

I don’t think this line is what you really want – if you intend to export at the full resolution, use

double downsample = 1
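
That is, the rest of the builder can stay exactly as it is; only the downsample line changes. A minimal sketch, reusing imageData and path from the script above:

double downsample = 1  // export at full resolution, no resampling

def labelServer = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE)
  .downsample(downsample)    // now matches the full-resolution pixel grid, so no rounding gaps between tiles
  .useCells()
  .addLabel('Tumor', 1)
  .lineThickness(2)
  .setBoundaryLabel('Ignore*', 0)
  .multichannelOutput(false)
  .tileSize(512, 512)
  .build()

writeImage(labelServer, path)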

Thank you, Pete. This worked.


Hi Pete,

Thanks again for the response. I have a question about how the QuPath polygons are generated from StarDist (or any other cell-level) segmentation. Is there a limit to the number of polygon sides for high-resolution objects? Or is there perhaps some label estimation from the polygon when exporting a full labeled image?
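
(In case it helps make the question concrete, this is how I am inspecting the polygons, using standard QuPath scripting calls; the printed summary is just a sanity check:)

// Count the boundary vertices of each detected cell ROI,
// to see how much detail the polygons actually carry
def counts = getCellObjects().collect { it.getROI().getNumPoints() }
println "Vertices per cell: min=${counts.min()}, max=${counts.max()}, mean=${counts.sum() / counts.size()}"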

This is the output from the above script with downsample=1

Note, the above mask has a boundary of size 2, so there may be some “erosion”, but the mask does appear more “pixelated” than what I see in the ImageJ mask/polygons (generated from the StarDist Fiji plugin):

Sample image showing the ImageJ/Fiji mask for comparison

Differences between the QuPath & Fiji implementation are summarized here:
https://qupath.readthedocs.io/en/latest/docs/advanced/stardist.html#differences-from-stardist-fiji

In your case, it looks like a matter of resolution. At the resolution of the image in the screenshots, I’d expect a boundary of 2 to have a substantial impact on how pixelated things look.

Note also that QuPath permits StarDist to be run at a different resolution, i.e. resampling the image prior to detection. This may be relevant, but probably isn’t the main thing.
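
For reference, the detection resolution is set on the StarDist builder itself, independently of the export resolution. A sketch assuming the qupath-extension-stardist API, with a placeholder model path and illustrative values:

import qupath.ext.stardist.StarDist2D

def stardist = StarDist2D.builder('/path/to/model.pb')  // placeholder model path
    .threshold(0.5)               // probability threshold (illustrative)
    .normalizePercentiles(1, 99)  // percentile normalization (illustrative)
    .pixelSize(0.5)               // requested pixel size in µm; the image is resampled to this before detection
    .build()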

It would be easier to compare with the boundaries turned off in Fiji (i.e. only the labelled image) and no boundary class in the QuPath export.
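
On the QuPath side, that just means dropping the boundary settings from the builder. A sketch reusing imageData, outputDir and name from the original script (the output filename is only a placeholder):

def labelServerNoBoundary = new LabeledImageServer.Builder(imageData)
  .backgroundLabel(0, ColorTools.WHITE)
  .downsample(1)             // full resolution, as above
  .useCells()
  .addLabel('Tumor', 1)
  // .lineThickness(...) and .setBoundaryLabel(...) deliberately omitted,
  // so the labels reflect the raw cell polygons
  .multichannelOutput(false)
  .tileSize(512, 512)
  .build()

writeImage(labelServerNoBoundary, buildFilePath(outputDir, name + '-mask-noboundary.ome.tif'))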