QuPath script with pixel classifier

dear @petebankhead and @Research_Associate,
Could you tell me whether it’s possible to export the probability map, and whether we can view or export the decision trees? I’d like to understand better how the classifier chooses between class 1 and class 2.

Thank you

When you save the classifier, it includes a JSON representation of an OpenCV classifier. It can be opened in a text editor; it isn’t very readable, but could potentially be parsed elsewhere.
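If you want to take a programmatic look, a minimal Groovy sketch for peeking inside the saved JSON (the file path below is just an example – adjust it to wherever your classifier was saved, e.g. inside the project’s classifiers folder):

```groovy
// Minimal sketch: inspect a saved pixel classifier JSON
// (the path below is an example – adjust to your own project)
import groovy.json.JsonSlurper

def file = new File('C:/MyProject/classifiers/pixel_classifiers/collagen.json')
def json = new JsonSlurper().parse(file)
// The top-level keys hint at the structure (model, metadata, features...)
println json.keySet()
```

From there you could drill into individual entries, or parse the same JSON in another tool.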

There is a Save prediction image option under the button when creating/applying a pixel classifier.


Dear @petebankhead,
I would like to implement in batch when I use the script editor:

  • pixels classifier for project
  • export a .png with class 1, 2, 3 etc
  • export a .png with the probabilities

Thank you for the JSON representation

This is the closest:

Whether the classifier outputs classes or probabilities is decided when it’s created – there’s no easy way to output probabilities from a pre-existing classifier that was created to output classes.
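For the batch request above, a hypothetical ‘Run for project’ sketch might look like the following – assuming your classifier is named "collagen" (an example) and your QuPath version provides writePredictionImage (added in more recent releases; check your version’s scripting commands):

```groovy
// Hypothetical batch sketch (verify method availability in your QuPath version)
def classifierName = "collagen"   // example classifier name

// Add measurements from the pixel classifier
addPixelClassifierMeasurements(classifierName, classifierName)

// Export one prediction image per project entry, named after the image
def name = GeneralTools.getNameWithoutExtension(getProjectEntry().getImageName())
def path = buildFilePath(PROJECT_BASE_DIR, name + '-prediction.png')
writePredictionImage(classifierName, path)

print 'Done!'
```

Note that, as described above, whether the exported image contains classes or probabilities depends on how the classifier was created.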


A huge thank you again for your time. I work in a microscopy platform and a lot of researchers like your software.


Dear @petebankhead, @Research_Associate
A researcher can’t find a way to use your script and receives this message. I think the images are on a server while the project is stored locally. Z:\ is our petabyte storage.

WARN: Unable to write image
ERROR: IOException at line 10: Unable to write Z:\_SHARE_\Research\CAR\CAR\Oury\Lapin\Palme\Histologie\Hearts\Test quantif AR\prediction.tif!  No compatible writer found.
ERROR: qupath.lib.images.writers.ImageWriterTools.writeImageRegion(ImageWriterTools.java:127)
    qupath.lib.scripting.QP$writeImageRegion$1.callStatic(Unknown Source)
    java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    java.base/java.lang.Thread.run(Unknown Source)

Could you help me?

Kind regards,

Dear all,
How can I transfer the QuPath project locally?

Once you have moved the image files, you can either break the path to the original files (rename the folder by one letter?), or edit the .qpproj file with a text editor, using find and replace. Breaking the path to the original files will cause QuPath to pop up a dialog when you open the project allowing you to point to the new location.

If you just want to move the project folder around, you should be able to zip it up or just move it, and then deal with the dialog “fixing” the image path after the folder has been moved.
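The find-and-replace approach can also be scripted. A small Groovy sketch (all paths are examples; .qpproj stores file URIs, so match the escaped form with forward slashes – and make a backup copy of the .qpproj first):

```groovy
// Sketch: rewrite image paths in a .qpproj file (back it up first!)
def f = new File('C:/local/MyProject/project.qpproj')   // example path
def text = f.getText('UTF-8')
// .qpproj stores URIs, so use forward slashes in the search strings
text = text.replace('Z:/_SHARE_/Research', 'C:/local/images')   // example prefixes
f.setText(text, 'UTF-8')
```

Reopen the project afterwards; if any path still doesn’t resolve, QuPath’s dialog will let you fix it interactively.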


I’m really sorry, I had a bug with QuPath. I performed a pixel classification, but QuPath crashed 3 times this weekend because my memory was full. I received a message from Windows that the application was stopped to prevent data loss. But I set QuPath to 32 GB and I have 60 GB available.
My dataset = 22 TIFF files between 100 MB and 550 MB.

This is my script; it’s OK for 20 files but crashes after that. Should I clean the RAM between files, or do I have a file in the project that is too big?

I set the classifier to the maximum resolution (the smallest pixel size).

addPixelClassifierMeasurements("collagen", "collagen");

print "Done!";

@HenrikSP reports that clearing the cache works for him – I’d be interested to know if it resolves the problem for you too:

Running classifiers at the maximum resolution can require a lot of memory – although less memory is needed if the output is ‘classification’ rather than ‘probability’.

Do you know what is the uncompressed size for the first image that fails (or any typical image)? This is the value shown under the ‘Image’ tab.

Also any precise error messages/logs you can share would be helpful to understand what may be going wrong.


Dear Peter,

  1. The script didn’t crash on the same file, because I performed the analysis with different settings:
    the first time with all the images, then twice with part of the data. I saw that it crashed on the big files.
    I didn’t have any log from QuPath, just a message from Windows.

  2. I tried your method and it worked perfectly: 22 files in 7 hours. My configuration: Intel Xeon CPU E5-2620 v3 @ 2.40 GHz, 64 GB RAM (I allow QuPath to use 32 GB), Nvidia Quadro K2200, Windows 7 64-bit. Could I increase the speed with the GPU?

  3. Maybe it would be a good idea to add a function for that, or to clean the RAM automatically when the user uses “Run for project”.
    The size of the biggest files is 990 MB.

  4. Your script to export the classification map doesn’t work in batch: it exports a single prediction.tif, not one prediction.tif per image.
    How can I add the file name to the prediction.tif?

kind regards

I’m afraid not. There are multiple discussions on the forum about this, and it’s mentioned in the FAQs. When I’ve tried using OpenCL with the pixel classifier, it has ended up slower in most cases (and slightly more likely to crash unpredictably).

Did cleaning the RAM help in your cases, or what did you change to get it to work?

Something like this:

def name = GeneralTools.getNameWithoutExtension(getProjectEntry().getImageName())
def path = buildFilePath(PROJECT_BASE_DIR, name + '-prediction.tif')

Dear Peter,
As I said, the script is a bit slow, but it works perfectly with the cleaning. I was surprised that 32 GB of RAM were not sufficient for this analysis.

addPixelClassifierMeasurements("collagen", "collagen");

// Try to reclaim whatever memory we can, including emptying the tile cache
javafx.application.Platform.runLater {
    getCurrentViewer().getImageRegionStore().cache.clear()
    System.gc()
}

print "Done!";

If your image isn’t pyramidal, this could be causing some of the extra memory use. The same applies if your classifier outputs probabilities rather than classifications only.

In any case, View → Show memory monitor can be helpful to see when/where memory use is high.

My files aren’t pyramidal because I preprocessed them.
Maybe you have an idea for that :slightly_smiling_face:
The background of my .ndpi files was sometimes bluish, reddish or yellowish, so I converted the 10x .ndpi to TIFF via NDPITools and then used a macro in ImageJ to correct the white balance.
Then I load the TIFFs into QuPath for pixel classification.

  • Do you know if I can modify the white balance directly in QuPath, sample by sample?
    You can have a look at the data before and after the white-balance macro.

There are no QuPath commands to correct the white balance (or change pixels generally), but you can


Could you explain why, if I use the highest resolution for the pixel classifier, it will be faster with a pyramidal OME-TIFF?

Also, I think you didn’t see my message: do you know why the macro to export the prediction.tif doesn’t work on a server? Do you think it’s due to some protection or restricted permissions?

Sorry for my lack of knowledge.

With a non-pyramidal image, the entire image (+ automatically-generated pyramid) must be held in RAM at the same time. With a pyramidal image, the cache can be used more intelligently to store only what is essential.
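If you want to convert the preprocessed TIFFs to pyramidal OME-TIFFs before classifying, one possible sketch uses QuPath’s OMEPyramidWriter (method names as in QuPath v0.2/v0.3 – check against your version; the output folder and downsample levels are examples):

```groovy
// Sketch: write the current image as a pyramidal OME-TIFF
import qupath.lib.images.writers.ome.OMEPyramidWriter

def server = getCurrentServer()
def name = GeneralTools.getNameWithoutExtension(getProjectEntry().getImageName())
mkdirs(buildFilePath(PROJECT_BASE_DIR, 'pyramids'))
def path = buildFilePath(PROJECT_BASE_DIR, 'pyramids', name + '.ome.tif')

new OMEPyramidWriter.Builder(server)
    .downsamples(1, 4, 16)   // pyramid levels: full resolution, 1/4, 1/16
    .tileSize(512)
    .parallelize()
    .build()
    .writePyramid(path)
```

Running the classifier on the pyramidal copies should then let the cache work as described above.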

That would be my guess, but I don’t know. The ‘no compatible writer’ error can occur because of a lack of permissions.


I performed the TIFF export and everything works fine. Please find the report for the memory used.

  1. Do you need a report for the classifier?

  2. I created a classifier to classify the pixels into holes in the tissue, fibers and nuclei. When I export the prediction, all the pixels in a rectangle have a class, even if they are not inside the annotation (tissue).
    If I understand correctly, that’s just because the pixels are in the tiles during the analysis, but QuPath didn’t measure in that zone. Could you confirm?

It’s safer not to trust anyone else’s answer to this. I don’t know, because I don’t know exactly what steps you took (and I’m afraid I haven’t time to explore in enough depth to reproduce them).

Rather, I’d suggest performing some experiments to find out what has been measured. For example:

  • annotate a different area and see if the results change
  • compare the total annotated area with the area of the image vs. the area of the annotation
  • open the prediction in ImageJ and count the pixels of each color for comparison
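The last experiment can also be scripted with the ImageJ classes bundled with QuPath – a sketch, assuming an exported prediction.tif (the path is an example):

```groovy
// Sketch: count how many pixels have each value in an exported prediction
import ij.IJ

def imp = IJ.openImage('C:/MyProject/prediction.tif')   // example path
def ip = imp.getProcessor()
def counts = [:].withDefault { 0 }
for (int y = 0; y < ip.getHeight(); y++) {
    for (int x = 0; x < ip.getWidth(); x++)
        counts[ip.get(x, y)]++
}
println counts
```

Comparing the counts with and without pixels outside the annotation should show whether those regions were included in the measurements.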

I think this is almost always the best way to confirm things, to avoid potential misunderstandings (or bugs).