Adjusting probability threshold for pretrained model in Python


I just discovered StarDist for segmentation and I am pretty amazed. I have just been using Fiji and the pretrained “versatile fluo 2D” model and have been very happy with the results once I adjust the probability threshold to around 0.6. Now I want to implement this in Python using Jupyter notebooks, as I want to try to use it together with btrack. I can get the pretrained model to work as described in the example notebook, but I don’t know how to adjust the threshold to my desired 0.6. My Python knowledge is limited, but I couldn’t find the code for the “from_pretrained” function that I use and which variables I could adjust there. It appears StarDist downloads the config files from GitHub (as a zip file). If I change the value in the “thresholds.json” file, StarDist will complain about this zip folder and re-download it. So my question is simply: how can I adjust the threshold for the pretrained model as in Fiji?

Thanks, great stuff…

Hi @Yannick_Blum,

If you want to use the pre-trained models, you can simply download the GitHub repository, change the threshold value in the JSON file corresponding to the chosen model, and instantiate your model by pointing directly to the folder containing the JSON file. For example:

from stardist.models import StarDist2D

model = StarDist2D(None, name='2D_dsb2018', basedir='stardist/models/paper/')

Here, stardist is the folder downloaded from GitHub. That way it won’t “re-download” the files.
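If you’d rather change the value programmatically than by hand, a small helper like the following can edit the file. This is just a sketch: it assumes the {"prob": ..., "nms": ...} layout found in the pretrained model folders, and the example path is hypothetical.

```python
import json

def set_prob_threshold(path, prob=0.6):
    """Update the 'prob' entry of a StarDist thresholds.json file.

    Assumes the file has the {"prob": ..., "nms": ...} layout of the
    pretrained model folders.
    """
    with open(path) as f:
        thresholds = json.load(f)
    thresholds['prob'] = prob  # new probability threshold
    with open(path, 'w') as f:
        json.dump(thresholds, f)

# e.g. (hypothetical path; adjust to where you unpacked the repository):
# set_prob_threshold('stardist/models/paper/2D_dsb2018/thresholds.json', 0.6)
```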


Hi @Yannick_Blum, welcome to the forum!

Glad you like it!

You can easily override the default thresholds like this (cf. predict_instances):

from stardist.models import StarDist2D 

model = StarDist2D.from_pretrained('2D_versatile_fluo')

labels, polys = model.predict_instances(normalized_img, prob_thresh=0.6)
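Here normalized_img is the percentile-normalized input image; the StarDist example notebooks use normalize from csbdeep.utils for this. If you want to see roughly what that does, here is a numpy sketch (assuming 1/99.8 percentiles as in the example notebooks; not the library’s exact implementation):

```python
import numpy as np

def normalize_percentile(img, pmin=1, pmax=99.8):
    """Percentile-based normalization, similar in spirit to
    csbdeep.utils.normalize as used in the StarDist example notebooks
    (a sketch, not the library's exact implementation)."""
    lo, hi = np.percentile(img, (pmin, pmax))
    # scale so that the pmin percentile maps to ~0 and pmax to ~1
    return (img.astype(np.float32) - lo) / (hi - lo + 1e-20)

# Usage with the pretrained model (assuming stardist is installed):
# from stardist.models import StarDist2D
# model = StarDist2D.from_pretrained('2D_versatile_fluo')
# labels, polys = model.predict_instances(normalize_percentile(img), prob_thresh=0.6)
```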



Thanks to both of you. I had actually seen the prob_thresh argument in the predict_instances method but got an error when I used it. I later found out that I had misspelled it as prop_tresh :see_no_evil:. Well, well…

Again great stuff!
