Dividing Annotations Based on Area

I’m wondering if there is script available to divide annotations in half or in quarters so that each new annotation has an approximately equal area? For example, after using the wand tool around a tissue sample, is it possible to divide that one annotation into more than one annotation of equal area?

The easiest way to divide up an annotation is the tiling tool, which won’t really do what you want. I don’t know of any scripts that do that, and the only ways to proceed would be a bit complicated.
1. Find the centroid of the annotation.
2. Draw lines out from the centroid at fixed angular increments to divide it into a pie.
3. Find the intersections of those lines with the outer edge of the annotation (this doesn’t work if there are holes in the annotation or the annotation is curved).
4. Build new annotations from the lines and the previous annotation.
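The first step above, finding the centroid, is straightforward with the shoelace formula. A minimal sketch in plain Python (not a QuPath script), assuming the annotation boundary has been exported as a list of (x, y) vertices:

```python
def polygon_centroid(pts):
    """Centroid of a simple (non-self-intersecting) polygon via the shoelace formula."""
    a = 0.0          # twice the signed area, accumulated below
    cx = cy = 0.0    # centroid numerators
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]        # wrap around to close the polygon
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return (cx / (6 * a), cy / (6 * a))

# Unit square: centroid should be (0.5, 0.5)
print(polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)]))
```

The later steps (ray–boundary intersection, building new ROIs) would still need to be done with QuPath's own geometry tools.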

Not sure how most of that would actually work in practice (coding), but it might be possible. Unfortunately, what you want is pretty arbitrary, not based on data within the image. There aren’t a lot of tools to handle something like that in image analysis software, but maybe Pete will have some ideas.

Though, technically, based on how you worded your question, you could divide your annotation up into thousands of tiles, classify them as 1/2/3, and merge all of the 1s, 2s, and 3s together. That would get you pretty close, but there would be no guarantee that the resulting annotations would all actually be contiguous. The area sums would likely be pretty close.
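That merge idea can be sketched outside QuPath. Assuming you have each tile's measured area, greedily adding the largest remaining tile to whichever group currently has the smallest total keeps the group areas close (a hypothetical standalone Python sketch, not a QuPath script; like the idea above, it ignores the contiguity problem):

```python
def balance_tiles(areas, k=3):
    """Greedily assign tile areas to k groups, always adding the next-largest
    tile to the group with the smallest running total.
    Returns (list of k lists of tile indices, list of k group totals)."""
    groups = [[] for _ in range(k)]
    totals = [0.0] * k
    # Largest-first greedy keeps the group totals close together
    for idx in sorted(range(len(areas)), key=lambda i: -areas[i]):
        g = totals.index(min(totals))
        groups[g].append(idx)
        totals[g] += areas[idx]
    return groups, totals

groups, totals = balance_tiles([5, 4, 3, 3, 2, 2, 1], k=3)
print(totals)
```

With the sample areas above, the three group totals come out within one unit of each other, which is the kind of "approximately equal area" split asked about.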

I’ve no idea how to do that conceptually, irrespective of how it would be implemented in QuPath… @pgbrodeur can you explain why exactly you want to do this?
Tiling is the closest QuPath currently has.

Thanks for your response, both. I am evaluating tissue for the count of apoptotic cells and want to then put the data into GraphPad Prism for analysis, but this software requires entering at least 3 different values per tissue sample to show error bars etc. So I’m unable to use one annotation, as that will give me only one value to input.

It wouldn’t be necessary to have the annotation divided equally by area, but I thought that might be easier from a code perspective. Basically, I would just need, at minimum, three contiguous annotations per tissue sample, and it takes loads of time to annotate with the wand right up to the border of a previous annotation.

I’m unfamiliar with the tiling technique and can’t find it in any of the commands. If this is the best way for me to move forward, would one of you be able to point me in the direction as to how to begin using the tiling tool? Thanks!

Random thought: find Xmin and Xmax of the annotation bounding box, divide the range into N equal distances, and create a set of full-image-height rectangles of the right width. Intersect those with the drawn annotation, all scripted. You’d end up with the annotation segmented into vertical strips. Can’t do it right now, at dinner, but it seems scriptable.
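The geometry of that idea can be sketched in a few lines (plain Python, not QuPath's Groovy API; the actual intersection with the drawn annotation would still be done inside QuPath):

```python
def strip_rectangles(xmin, xmax, height, n):
    """Split the x-range [xmin, xmax] into n equal-width, full-height
    rectangles, returned as (x, y, width, height) tuples."""
    w = (xmax - xmin) / n
    return [(xmin + i * w, 0, w, height) for i in range(n)]

# Bounding box from x=100 to x=400, image height 1000, 3 strips
print(strip_rectangles(100, 400, 1000, 3))
```

Each returned rectangle would then be intersected with the annotation ROI to produce one strip-shaped annotation, so each strip is guaranteed contiguous (though the strip areas are only equal if the tissue is evenly distributed along x).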


If you’re using a recent v0.2.0 milestone, you can use Ctrl + Shift: if you continue drawing with the brush/wand, it should remove the overlap from existing annotations.

That said, @Research_Associate’s suggestions might be closer to what you want. Although it’s not clear to me what the error bars would mean in the final plots (should this somehow relate to heterogeneity?).

Thanks both, I have made my analysis work with the tiling feature so I appreciate the help.

I have a follow-up unrelated question. I’m working with another co-investigator and we have put our tissue sample slides onto an external hard drive. I create projects on the external hard drive (where all the data for the images sit), make annotations, and save the project. When I hand the hard drive off to the co-investigator, they are unable to open images that I have annotated. Any thoughts on how to solve this problem? Is it because I am using a Mac and the other person is using a PC?


It would be helpful to know what version you are using as there have been changes made in this regard. See for example here.


Thanks, I believe I’m using v0.1.2 as that was the latest stable release I could get from the online source.

And you are sure that your collaborator is also using 0.1.2? If they have 0.2.0M8, the files will not work at all, as the project structure changed around the middle of the M releases.
*Also, are they opening the project file itself? I have had some new users try to open the data files instead.


For v0.1.2 see https://github.com/qupath/qupath/wiki/Projects and for more information there’s a video tutorial here.

Basically, you’ll need to either store your images inside the same directory as your project, or else edit the .qpproj file in a text editor to update the file paths wherever they differ between users.
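The text-editing route can even be scripted. As a hedged sketch, assuming the .qpproj file is JSON with each image entry storing its location under a `path` key (check your own file's structure first, since the project format changed between versions), a small script could remap a path prefix for a different machine:

```python
import json

def remap_paths(qpproj_text, old_prefix, new_prefix):
    """Rewrite image path prefixes in the text of a .qpproj (JSON) project file.
    Assumes each entry under 'images' stores its file location in a 'path' key."""
    project = json.loads(qpproj_text)
    for entry in project.get("images", []):
        if entry.get("path", "").startswith(old_prefix):
            entry["path"] = new_prefix + entry["path"][len(old_prefix):]
    return json.dumps(project, indent=2)

# Hypothetical example: Mac mount point -> Windows drive letter
sample = json.dumps({"images": [{"path": "/Volumes/Data/slide1.svs"}]})
print(remap_paths(sample, "/Volumes/Data", "E:"))
```

Keep a backup of the original .qpproj before rewriting it, and verify one image opens before handing the drive over.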


Another possibility, for the IT-minded: you could change the drive letter of the external drive to match the letter it had on your computer 🙂

Now that I’m playing with a USB drive of my own.


Hi both, thanks for the continued support! Having the images stored in the project folder works well.



Hi Both,

When using the cell counting feature (the icon with the three circles), if I make a mistake and make a point where I don’t want it to be, is there a way to delete that one point? I see the delete feature will delete all the points I have made for that set.


On PC, Alt+click, but only when you have that points set selected.

Thank you, it works! Out of curiosity, have you achieved good results using the positive cell detection? I’ve tried a few times now after watching Pete’s tutorial videos, changing the vectors, etc., and I’ve never gotten great results. Just wondering, since counting positive cells one by one takes a while.

It depends on the staining. Nuclear DAB staining should be easy and very accurate. Cytoplasmic staining can run into other problems and might require changing the cell expansion or adding extra steps for accuracy. Other stains or combinations of stains could be more challenging, but it always depends on the experiment, image quality, staining quality, etc.
