[NEUBIAS Academy@Home] Webinar “Tracking cells and organelles with TrackMate” + QUESTIONS & ANSWERS

Hi everyone!

In this thread we are adding all the Questions & Answers we collected during our NEUBIAS Academy webinar “Tracking cells and organelles with TrackMate” (available on YouTube).

We tried to group questions by category when we could.

There were about a hundred questions, answered by me (Jean-Yves Tinevez) and the moderators of the webinar: Elnaz Fazeli (@Elnaz), Daniel Sage (@daniel.sage), Robert Haase (@haesleinhuepf) and Jan Eglinger (@imagejan).

Enjoy and post any missing question here!

TrackMate numerical features.

Q1: Is it possible to get velocity and directionality of individual particles? I study intracellular vesicles and these little guys move like hell.
Yes. Velocity is computed as a built-in feature (see Q2 for its units), and directionality / angle measurements are available via the TrackMate-TrackAnalysis extension (see Q7 and Q8).

Q2: What is the unit of velocity here?
If your dataset is calibrated correctly, e.g. in microns (space) and seconds (frame delay), your results will be given in microns/second.
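To make the units concrete, here is a small plain-Python sketch (hypothetical coordinates, nothing TrackMate-specific): with positions calibrated in microns and a frame interval in seconds, each step speed comes out in microns/second.

```python
import math

def speeds(positions_um, frame_interval_s):
    """Instantaneous speeds (microns/second) from calibrated (x, y) positions,
    one position per frame."""
    out = []
    for (x0, y0), (x1, y1) in zip(positions_um, positions_um[1:]):
        out.append(math.hypot(x1 - x0, y1 - y0) / frame_interval_s)
    return out

# Hypothetical track: 5 microns travelled in the first 2-second interval, then still.
track = [(0.0, 0.0), (3.0, 4.0), (3.0, 4.0)]
print(speeds(track, frame_interval_s=2.0))  # [2.5, 0.0]
```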

Q3: Can you calculate the length of tracks and the residence time of particles using TrackMate?
The track length is already implemented as a track feature. The particle residence time is not: you will have to compute this value with another tool (e.g. MATLAB).

Q4: What is the difference between the two speeds we can observe in the results?
The measurements are explained here:

Q5: What is the difference between Track displacement and Total distance travelled?
It is explained here:
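For illustration, the two quantities can be computed from a list of positions like this (plain Python, hypothetical track; TrackMate computes both for you):

```python
import math

def track_displacement(track):
    """Straight-line distance between the first and last positions."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    return math.hypot(x1 - x0, y1 - y0)

def total_distance(track):
    """Sum of the step lengths along the whole path."""
    return sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(track, track[1:]))

# A particle walking along three sides of a unit square:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(track_displacement(square))  # 1.0  (net displacement, start to end)
print(total_distance(square))      # 3.0  (total distance travelled)
```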

Q6: In the results table from “Spots Measurements”, there is a parameter called “Estimated diameter”. How accurate is this measurement and how is it computed, since TrackMate is focused on tracking spots of the size set by the user?
Its accuracy and robustness are limited, therefore its usefulness is limited too.
It is documented here:

Q7: Jean-Yves said there are directionality and angle measurements, which are quite new (released yesterday?). The extensions page https://imagej.net/TrackMate#Extensions is quite old. Is there another source?
(Jan) The angle features Jean-Yves mentioned are now in https://github.com/tinevez/TrackMate-TrackAnalysis

Q8: How can one measure directionality of tracks moving in 3D tracked in TrackMate? Can it be done only in MATLAB?
I think Jan just updated an extension with this metric. It will be released in the TrackMate-TrackAnalysis extension.

What TrackMate can do.

Q9: Is it working only for lineage, or can you do basic tracking of cells (no link between cells, no divisions)?
You can do basic tracking too; various projects have been done where objects of any kind were tracked, with no divisions involved.

Q10: How large was the largest dataset (number of objects) you applied TrackMate on?
The image dataset must fit in the memory of the computer — that is the practical limit.

Q11: What is the minimum size of particle that TrackMate can reliably identify?
Short answer: a few pixels. Long answer: the particles are detected in every frame using filters, typically a LoG or DoG filter (Laplacian of Gaussian = LoG, Difference of Gaussians = DoG). You have to tune the size parameter of the LoG or DoG to the size of the particles.
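To give an idea of the tuning, a common scale-space rule of thumb is that the LoG response to a blob of radius r peaks around sigma ≈ r/√n in n dimensions. A small sketch of that rule (it is an approximation, not TrackMate's exact internal formula):

```python
import math

def log_sigma(diameter, ndim=2):
    """Gaussian sigma at which the LoG response to a blob of the given
    diameter peaks: sigma ~ radius / sqrt(ndim) (scale-space rule of thumb)."""
    return (diameter / 2.0) / math.sqrt(ndim)

# For a 5-pixel blob in 2D, the matching filter scale is about 1.77 px:
print(round(log_sigma(5.0, ndim=2), 3))  # 1.768
```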

Q12: Is it possible to identify each molecule as well, or just single cells?
Yes, what you detect depends on the detector you choose and its parameters.

Q13: I’m doing FUCCI cell tracking. Do you have some protocols for FUCCI already?
Yes, you can track blobs (cell objects) in multi-channel images. Here you will probably need some specific post-processing to link the red / green / yellow particles.
Also, I am aware of this set of tools, built by @pmascalchi:

Q14: Will this tracking work on both wide-field and confocal images? For transmission images the tracked spots come out quite jumbled in TrackMate.
The built-in detectors are made for (confocal or wide-field) fluorescence data where spots are bright. They won’t work on transmitted-light images. You can pre-process your images, however.

Q15: If the image was taken with wide-field microscopy (not confocal, no z-stack), what do you put in the voxel box?
TrackMate also supports tracking objects in 2D images. Entering the voxel size in Z is not necessary then.

Q16: Is it optimised only for circular blobs?
Yes, the provided detectors (LoG, DoG, …) are optimised for roundish objects. If your objects aren’t roundish, you may have to write a specific detector.

Q17: I guess it works in 3D as well? Does the visualization change?
Yes, all detectors work in 3D. The HyperstackDisplayer shows spots as little dots when they’re outside the current plane, and as little circles when they’re in focus.

Q18: Can you kindly show an example of 3D analysis?
Sorry, I missed this. I was stressed by having the presentation and demo running. However, you can go through one of the tutorials that does that:

The example image is linked in the tutorial.

Q19: Maybe I missed it , can TrackMate be used on fluorescent spots?
Oh yes, it is a perfect tool for bright (fluorescence) spots (Daniel wrote that).

Q20: So just to follow up on what you just said, if the 3D viewer does not work anymore how do you visualise the tracked spots after 3D tracking?
With the normal Hyperstack window of Fiji. It works fine.

TrackMate limitations.

Q21: Can we track rings (as opposed to dots)? I would like to track micro-droplets. They appear like rings. I was thinking of a Hough transform to detect circles…
No. But your idea is great Adrien. This could be implemented as a new detector module. The question now is who will do it :slight_smile:

Q22: What if i have spots of different size (as they merge the diameter increases up to 10x)?
The detectors we have now do not deal well with large changes in size. They expect the object size to stay roughly constant over time. Unless someone makes a new detector, TrackMate won’t be the right tool for this situation.

Q23: Are there any limitation in number of cells that are going to track?
Yes. TrackMate becomes sluggish and unresponsive if you have much more than 100k cells.

Q24: In case a dividing object crosses another one in the moment of division, is there a feature that can correct/account for this? Especially with high density of objects/organelles/particles.
This is a typical case that TrackMate does not handle well. A high density of fast-moving objects actually defeats nearly all trackers. For details check the ISBI Single-Particle Tracking Challenge: Chenouard et al., 2014 https://www.nature.com/articles/nmeth.2808
I also have re-documented TrackMate performance for the datasets of the challenge, this will tell you what you can expect: https://imagej.net/TrackMate_Accuracy

Q25: Do we have options for selecting the different estimated blob sizes at the same time?
No. See questions above.

Q26: Is there any limitation in size of the image. Let’s say several GB?
Image size is limited to available memory of the computer.

Q27: Does TrackMate output the size of the detected spots? Or are all the spots detected considered to be the same size?
The spots are detected by the same filter (usually LoG or DoG) with the same size parameter, so all the spots should have almost the same size.

Q28: Can Trackmate work on a binary image?
Actually, the detectors in TrackMate do not work that well with binary images. The results are somewhat disappointing given that the objects have been detected already. This is because the detectors expect object intensities defined over quasi-continuous grayscale values.
I would make ROIs out of the binary image, and use the Python script mentioned below.

Q29: Thanks for the presentation. Do you think that TrackMate can show colocalization of the vesicles in the future?
I really don’t know. It is really a tool that focuses on tracking single particles.
In the meantime, there is a tool in Icy that just does that by the way:

Q30: Is there a way of checking if tracks are correct one by one besides TrackScheme? Or maybe a methodology for high-dot density images?
No other way that I know of.

TrackMate algorithms.

Q31: Can you explain the mathematics behind? (very shortly).
Robert made a nice presentation about this, linked at the bottom of the 2nd slide. https://www.youtube.com/watch?v=q6-NsNvu81w

Q32: Does it support/offer supervised-tracking as well?
You mean with Deep-Learning techniques? No.

Q33: Why is the LoG detector the best?
The LoG is an optimal filter for roundish bright objects. Mathematical proof in the Fourier domain: the spectral power density of an image usually decreases as 1/\omega^2, and the LoG (a second derivative) acts exactly as a whitening filter. See Sage et al., 2005, Automatic Tracking of Individual Fluorescence Particles: Application to the Study of Chromosome Dynamics, IEEE Transactions on Image Processing.

Q34: Could you explain what exactly is computed behind the scenes for “Quality”? How would you define it?
For the LoG and DoG detectors, the image is filtered with a Laplacian of Gaussian or a Difference of Gaussians, respectively. Simply put, “Quality” is the intensity of this filtered image.
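To illustrate what “intensity of the filtered image” means, here is a toy 1D example (plain Python, synthetic data, not TrackMate code): a Gaussian spot is filtered with a difference of two Gaussian blurs, and the “Quality” of the detection is the peak value of that filtered signal.

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalised 1D Gaussian kernel of the given half-width."""
    k = [math.exp(-i * i / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """Direct 1D convolution with zero padding at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - r
            if 0 <= idx < len(signal):
                acc += w * signal[idx]
        out.append(acc)
    return out

# Synthetic 1D "image": one bright spot of sigma ~2 px centred at x = 20.
signal = [math.exp(-(x - 20) ** 2 / (2 * 2.0 ** 2)) for x in range(41)]

# DoG: difference of two Gaussian blurs at nearby scales.
sigma = 2.0
dog = [a - b for a, b in zip(convolve(signal, gaussian_kernel(sigma, 8)),
                             convolve(signal, gaussian_kernel(1.6 * sigma, 10)))]

quality = max(dog)         # "Quality" ~ intensity of the filtered image at the maximum
print(dog.index(quality))  # 20: the response peaks at the spot centre
```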
Also check https://imagej.net/TrackMate_FAQ#Signification_of_the_Quality_value_in_LoG_Detector.

Q35: Could it be that calculations are mainly CPU-based? Will Robert Haase adapt it to GPU-based processing? : -)
(Robert answering) TrackMate uses the CPU only and is quite fast. Currently no integration of CLIJ is in the making. If you see specific benefits for speeding up processing of specific datasets, feel free to share a dataset on https://forum.image.sc and we will see what we can do. Preprocessing your data on the GPU to make TrackMate’s life easier is possible, for example.

Q36: Can I include direction/Intensity in cost function for tracking?
Yes, the LAP tracker allows you to include spot features in the linking cost calculation.
But not ‘direction’, which is a feature that belongs to links, not spots. If you have objects that move with nearly constant velocity, you want to use the Linear motion tracker, which is made exactly for this.

Q37: Can one use direction and/or absolute value of the velocity (possibly separately) to set the penalty for reconnecting two pieces of track, when you have multiple candidates? If yes, does it average on the rest of the track or can it also do it comparing to some frames immediately before?
No you can only use spot features (numerical values that are defined for single spots).
As said above, if you have objects that move with nearly constant velocity, please use the Linear motion tracker, which is made exactly to handle this case.

Q38: Can we “decrease” penalties? If we know intensity IS likely to change for instance.
If you do not explicitly specify a penalty, there is no penalty.

Q39: What does sub-pixel localization do?
It estimates the “true” position of the intensity maximum (between the sampling grid of pixels). It does this via quadratic fitting around the maximum position.
The algorithm lives in imglib2. Here is its code:
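The idea of the quadratic fit can be shown in 1D: fit a parabola through the discrete maximum and its two neighbours, and take the parabola’s vertex as the sub-pixel position. A minimal sketch (this is the standard three-point formula, not the imglib2 code itself):

```python
def subpixel_offset(y_minus, y0, y_plus):
    """Parabolic interpolation around a discrete maximum: the vertex of the
    parabola through the three samples gives a sub-pixel offset in [-0.5, 0.5]."""
    denom = y_minus - 2 * y0 + y_plus
    if denom == 0:
        return 0.0
    return 0.5 * (y_minus - y_plus) / denom

# Samples of the parabola y = 1 - (x - 0.25)^2 at x = -1, 0, +1:
# the true maximum sits 0.25 px to the right of the grid maximum.
print(subpixel_offset(-0.5625, 0.9375, 0.4375))  # 0.25
```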

Q40: How much weight is given to distance and intensity, for example? Is it evenly split between features, or can it be adjusted, increasing or decreasing the penalty for an individual feature?
It is explained here:
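As a rough illustration of how a feature penalty can modulate a distance-based linking cost: the cost starts from the squared distance and is inflated when a penalised feature differs between the two spots. Note the inflation factor below is a hypothetical stand-in; the exact weighting TrackMate uses is the one given in the documentation linked above.

```python
def link_cost(pos_a, pos_b, feats_a, feats_b, penalties):
    """Hypothetical linking cost: squared distance, inflated by one term per
    penalised feature, proportional to the normalised feature difference."""
    d2 = (pos_a[0] - pos_b[0]) ** 2 + (pos_a[1] - pos_b[1]) ** 2
    factor = 1.0
    for name, p in penalties.items():
        fa, fb = feats_a[name], feats_b[name]
        if fa + fb != 0:
            factor += p * abs(fa - fb) / ((fa + fb) / 2.0)
    return d2 * factor * factor

# Same distance, but a 2x intensity change makes the second link more costly:
cost_same = link_cost((0, 0), (3, 4), {"intensity": 100}, {"intensity": 100},
                      {"intensity": 1.0})
cost_diff = link_cost((0, 0), (3, 4), {"intensity": 100}, {"intensity": 50},
                      {"intensity": 1.0})
print(cost_same)  # 25.0: identical features leave the distance cost untouched
print(cost_diff > cost_same)  # True
```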

Q41: If a cell or an object/particle has an inconstant/intermittent signal that occasionally goes below the threshold, will the plugin generate a new track every-time that happens? Or can we merge them using extra parameters?
What you describe is a ‘gap’, a ‘hole’ in a track caused by a missed detection. All of the TrackMate trackers have mechanisms that can be configured to deal with them. For instance the gap closing parameters of the Simple LAP tracker. Check this tutorial for information:
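The principle of gap closing can be sketched in a few lines (plain Python; TrackMate’s LAP tracker solves this globally over all candidate pairs, so this shows only the local acceptance criterion, with hypothetical parameter values):

```python
import math

def can_close_gap(end, start, max_frame_gap=2, max_distance=5.0):
    """Decide whether a track ending at `end` (frame, x, y) may be reconnected
    to a track starting at `start`: the frame gap and the spatial distance
    must both stay below user-set limits."""
    (f0, x0, y0), (f1, x1, y1) = end, start
    frame_gap = f1 - f0
    dist = math.hypot(x1 - x0, y1 - y0)
    return 0 < frame_gap <= max_frame_gap and dist <= max_distance

print(can_close_gap((10, 5.0, 5.0), (12, 7.0, 6.0)))  # True: 2-frame gap, ~2.2 px apart
print(can_close_gap((10, 5.0, 5.0), (14, 7.0, 6.0)))  # False: the gap is too long
```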

Choosing the right algorithms.

Q42: What are the differences between the several detection methods and the several tracking methods?
It is difficult to answer this question in a few words. My advice is to read the extensive documentation on the Fiji/TrackMate website; you will find a full explanation there. In short: the best trade-off is usually the LoG/DoG detector for detection and the LAP tracker for tracking.
Also check this documentation: https://imagej.net/TrackMate_Algorithms#Spot_tracker

Q43: How can we know which is the correct tracker for us?
It’s a great and long discussion. With Sébastien we wrote something about it in a book edited by Kota. Check this, in particular section 4.5.4:


Q44: Would it make sense to do some processing before (median filter etc)?
You can apply some preprocessing before running TrackMate. However, internally it is doing Laplacian-of-Gaussian or Difference-of-Gaussian filtering. Furthermore, there is a “median” checkbox enabling you to pre-filter your data before detecting spots.

Q45: Does the software offer automated ‘illumination correction’ for those series in which the background intensity or pattern varies from one frame to another?
Fiji supports such operations and you could do it before processing your data in TrackMate. However, the spot detectors used in TrackMate should not rely much on background intensity. They search for local maxima (spot detection) and should work independently of uneven background / illumination.

Q46: Can you have selected multiple ROIs?
Not several ROIs at once, but you can create a single composite ROI using the ROI Manager, and then track within this one.

Manual editing.

Q47: Is it possible to select specific tracks and discard them?

Q48: And is it manually possible to join two tracks or break the track into two individual tracks?
Yes absolutely. Check this tutorial: https://imagej.net/Manual_editing_of_tracks_using_TrackMate

Q49: For the manual / semi automatic tracking - would it be possible to go one frame further after one added a spot? We are tracking cells in collagen and often need to add multiple spots per track - going through the frames manually then takes a lot of time as well.
Yes! You can use the F and G shortcut to do so. You can configure how many frames are skipped in the TrackMate tools window (double-click on the TrackMate icon in the Fiji toolbar).
After that you can use Robert’s action to fill in the gaps: Close gaps by introducing...

Q50: Is it possible to manually correct individual tracks?
Yes, TrackMate has a lot of functionality for manual track curation. It is explained in detail on the website.

Q51: How do you link or add missing tracked point between split tracks using the TrackMate tool and on the displayed image instead of the graph panel?
You can add spots and link them with the A and L key on your keyboard.
Check https://imagej.net/Manual_editing_of_tracks_using_TrackMate

Q52: I didn’t get if individuals tracks can be selected and discarded on the interactive window, so “manual curating” of selected tracks.
They can. Check https://imagej.net/Manual_editing_of_tracks_using_TrackMate

Scripting TrackMate.

Q53: Once tracking algorithm well defined, is it possible to script Trackmate?
Yes, you can check the instructions on the tutorial page.

Q54: Can we run it using CLIJ or any other way to run in GPU (or cluster of GPUs)?
TrackMate and CLIJ are both scriptable, e.g. using Jython in Fiji. You could do some preprocessing with CLIJ and visualisation / analysis afterwards.

Q55: Let assume we have already segmented cells. Can we use TrackMate to track the segmented cells directly? Or do we have to create a “spot image” using the result of segmentation?
We made a Python script that does this. Check here:

Q56: If cells have various sizes and are not so round, you can use another tool to make ROIs (StarDist for example) and run the TrackMate tracking part on these ROIs. Am I right?
Yes. See the question just above.

Q57: Can TrackMate be scripted with imageJ macro language?
For scripting TrackMate you need an object-oriented programming language such as Jython, Groovy, JavaScript or Java, all of which are available in Fiji’s script editor.

Specific applications.

Q58: Can we track every single cell in a complex sample such as a spheroid?
I have seen a couple of spheroid datasets in which we wanted to track cells. In all my attempts (JY speaking) the image quality was not good enough: even by eye we could not individualize single cells. They looked like a blurry continuum instead of sharp, individualized objects. It should be possible if all cells are well imaged and appear well separated instead of looking fused to their neighbours. But such quality is very hard to obtain deep inside spheroids.

Q59: How good is TrackMate at tracking swimming bacteria?
Here it depends on your objects (bacteria) and their motility. If your bacteria are roundish objects and not packed, the detector should be able to localize them. If your bacteria don’t swim too fast, the linking step should be able to track them!
We have had some good success with S. flexneri bacteria (rod of 2 µm x 1 µm roughly), but the images were of high quality. Here is an example (fixed images however):
Arena et al, 2015, Bioimage analysis of Shigella infection reveals targeting of colonic crypts, PNAS.

Q60: Detect cells on clusters?
Here, it is really the role of the detector to identify the cells, even in a cluster. If your cells are roundish, bright and not too dense, the LoG or DoG can detect all the cells one by one.

Q61: Is TrackMate a good tool to track swimming bacteria in 2D or on a 3D surface?
See Q59 above.

Q62: Can we use TrackMate to detect endoplasmic reticulum tubules?
No. From what I know about endoplasmic reticulum tubules, they deviate too much from the round blobs TrackMate is good at detecting.

Q63: Is track splitting into more than 2 objects allowed?
Yes. For instance, see what we could do in this paper that studies abnormal cell divisions:
Pospíšilová et al., 2019, The frequency and consequences of multipolar mitoses in undifferentiated embryonic stem cells, JAB
We could use TrackMate to document events as complex as this one:

Q64: Is it possible to draw Kymographs using TrackMate, for distinguishing different tracks on the same path?
Not directly with the tools provided by TrackMate. But it is easy to do as a post-processing step using TrackMate’s outputs.

Q65: Can we combine generated tracks with kymographs? For instance, use KymoToolBox to analyze and draw kymographs from the generated tracks.
I don’t know how to do that.

Q66: Can you recommend some tips you may know about tracking qDot-labeled single proteins? Thanks in advance.
Just try it! TrackMate should work with qDots.

Q67: Sorry I haven’t understood. Can’t we track cells that split into smaller/larger cells? Or for instance, could I use TrackMate to track a cell undergoing apoptosis that splits into fragments (and detect these last ones as well)?
If the fragments do not change too much in size it should be ok-ish. But what about trying? :slight_smile:

Q68: How to deal with the debris and artifacts? In my case, some of them have been detected as cells.

  1. If you are lucky, the debris are not the same size as your particles of interest: in this case the detector won’t detect them.
  2. If you are lucky, the debris don’t move: you will get trajectories with no displacement that you can easily filter out.
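Point 2 is easy to automate on exported tracks: compute the net displacement of each track and discard the (nearly) immobile ones. A plain-Python sketch with hypothetical data:

```python
import math

def net_displacement(track):
    """Straight-line distance between the first and last positions of a track."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    return math.hypot(x1 - x0, y1 - y0)

# Hypothetical tracks: one moving cell and one piece of stationary debris.
tracks = {
    "cell":   [(0, 0), (2, 1), (4, 3), (6, 5)],
    "debris": [(5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1)],
}

# Keep only tracks whose net displacement exceeds a small threshold:
moving = {name for name, t in tracks.items() if net_displacement(t) > 1.0}
print(moving)  # {'cell'}
```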

Q69: Can you define the maximum intensity spot that you want to follow inside a Z-stack image?
You can add a filter on spots that will reject spots with an intensity higher than a certain value, yes. This is done in the Filter panel that appears after the spot detection step.
Here is an example:


Exporting TrackMate results.
Q70: How do you export trajectories?
At the very end of the TrackMate dialog, you can export tracks, spots and statistics in XLS, CSV and XML formats. Furthermore, formats compatible with MATLAB and Icy are available.

Q71: Do you have any recommendation to learn MATLAB ? Tutorials or Youtube Channel?
Nothing special. In my case, I have learned MATLAB with its help section.
Actually @simonfn wrote a great introduction to MATLAB in Kota’s book mentioned in this thread:

Q72: We have used MTrackJ to manually track cells in bright-field. Can we import these results into TrackScheme? Or would you recommend tracking bright-field images manually in TrackMate?
Eugene Katrukha wrote a tool to export FROM TrackMate TO MTrackJ. Unfortunately the other way does not exist yet.
As for tracking cells in bright-field images manually: yes, of course. I have done that a lot actually. Sometimes the semi-automatic tracking tool will work and accelerate the task.

Q73: Is it possible to run the tracking in Python? Perhaps as a FIJI script?
Yes but it must be the Python version that you can find WITHIN Fiji. See https://imagej.net/Scripting_TrackMate
Otherwise (in ‘native’ Python) no.

Q74: How can we export/keep an image of the color code displayed for the tracks or the spots?
You can use Fiji’s menu Image > Overlay > Flatten to retrieve a single image with coloured annotations. At the end of the workflow, on the last page of the TrackMate window, there is an action called Capture Overlay that generates a movie of the current view, which you can save as AVI.

Q75: If you have previously segmented cells, can you import the csv file without the gui, in headless mode?
Yes. You need to have the TrackMate CSV importer. Check here, in particular the last paragraph:

Q76: Can one do the detection with some other code and import it in a simple way, then using reconnection, lineages, etc. of TrackMate?
Yes. Probably the best way is to export the tracks in a CSV file and use the TrackMate CSV importer.

Q77: Can we contribute in other languages than in Java? (Matlab, Python, C…)
The extension mechanism only works for Java. But otherwise, look at what we did in MATLAB and Python and KNIME in the interoperability chapter.

Q78: How can you use the AVI file to overlay it with a different channel?
This I don’t know. Could you describe a bit more what you want to achieve?

Q79: I was thinking about how to export the jet colorcode or any other implemented in the visualisation…
The only way I can think of is to make a screen capture of the colorbar in TrackMate GUI.

MaMuT and Mastodon.

Q80: Will Mastodon be compatible with smart scan techniques?
Probably not, because I don’t know what it is.

Q81: Can you talk about MaMuT and Mastodon, please?
I hope it was enough.

Q82: Can you go over MaMuT and Mastodon?
Hello Ellen. A pleasure to meet you there. Is what I did enough?

Particle motility analysis.

Q83: Is it possible to generate MSD graphs and extract average diffusion coefficients directly with TrackMate?
Not directly within TrackMate. You can use scripts that calculate MSD from the script editor in Fiji, or you can use available tools in Matlab or KNIME that do this calculation.
I also made some MATLAB tools for MSD analysis. It is documented here:
With @sebherbert we wrote a full tutorial for this process in a book edited by @Kota https://link.springer.com/chapter/10.1007/978-3-030-22386-1_4
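For reference, the MSD itself is straightforward to compute from exported track coordinates. A plain-Python sketch (time-averaged over all pairs at each lag; the MATLAB tools mentioned above do this and much more):

```python
def msd(track, frame_interval=1.0):
    """Mean squared displacement vs time lag, averaged over all pairs of
    time points separated by that lag. `track` is a list of (x, y) positions,
    one per frame."""
    n = len(track)
    out = []
    for lag in range(1, n):
        sq = [(track[i + lag][0] - track[i][0]) ** 2 +
              (track[i + lag][1] - track[i][1]) ** 2
              for i in range(n - lag)]
        out.append((lag * frame_interval, sum(sq) / len(sq)))
    return out

# A particle moving 1 unit per frame along x: MSD grows as lag^2 (directed motion).
track = [(t, 0.0) for t in range(5)]
print(msd(track))  # [(1.0, 1.0), (2.0, 4.0), (3.0, 9.0), (4.0, 16.0)]
```

For pure diffusion the MSD would instead grow linearly with the lag, which is how the diffusion coefficient is extracted from the slope.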

Q84: Is there an easy way when tracking proteins in the cell membrane to calculate/display for example the diffusion coefficients or MSD’s?
Yep, see the question above.


Miscellaneous.
Q85: Is there a link to sample image for File —> Open image. I do not see sample image over there.
Use the menu File > Open Samples > Tracks for TrackMate (almost at the bottom).

Q86: I could not get version v5.2 although I updated my Fiji.
It’s probably something with the default update sites. Best way to fix this is to re-download a fresh Fiji.

Q87: Can one save the settings chosen during analysis in a file?
All settings entered in TrackMate are saved when you save the tracking as an XML file (bottom of the TrackMate dialog).

Q88: What version is currently running ?

Q89: Using TrackMate on the same .tif file and always with same settings sometimes the number of detected spots varies (usually leaning towards less detected spots). Do you also encounter the same problem? Do you know why would that happen?
No and no. I would need to know more to help you.

Q90: Where is the directionality in the table that he is demoing now?
It’s in a separate add-on you have to copy into your Fiji installation: https://imagej.net/TrackMate#Extensions

Q91: Why there are more than one track for one cell, which are shown in the table of tracks?
Zut, I cannot answer this question offline without you showing me the table in question.

Q92: Is there anything relying on a GPU? Should we invest in a good GPU? If so, Nvidia (CUDA)?
See Q35: TrackMate currently runs on the CPU only, so a dedicated GPU is not required for TrackMate itself.

Q93: Can you recommend a good resource/repository of images to try out TrackMate with except for the one the one that is on the ImageJ website?
Some datasets for tracking: http://celltrackingchallenge.net