Multi animal identity switch

Hello everyone!

I’m using DLC to track the position of mice in a 3-animal behavior test, where I know the identity of each animal and label accordingly.

So far I’ve mostly been using the default parameters, planning to optimize later.

I have trained the network with 200 frames, and when I get to tracklet refinement, the identities very often get switched after the animals interact with each other. This is of course a problem, and fixing it in the tracklet refinement GUI has taken me a long time.

I would like to know which parameters (if any) matter most for this issue, so I know where to start.

Thanks a lot!

Hi @amcapaz, currently multi-animal does not run this way; it is ONLY for animals you cannot tell apart. The network is trained not to use identity, i.e. it treats each animal you label as a type of animal, so labeling them by identity when you can tell them apart is not good for the currently released version. If you can tell them apart, normal DLC is the best. Of course, I don’t know how different your animals look, but right now we have not documented the identity-tracking capacity of DLC; when we get out of beta and make our full release, we will have this feature.

Thank you for the reply!
So it is beneficial for the network that we label the animals randomly?
And can that be the reason why the identities are so easily swapped when the animals get close to each other?
When you say “normal DLC”, do you mean the single-animal version? With that it is not possible to assess social interaction. My goal is to quantify social interaction in a kind of three-chamber test.

With normal DLC, I routinely track multiple animals. If you can tell them apart, you can simply label the following bodyparts: “animal1_nose, animal2_nose, animal3_nose” … “animal1_tail”, …
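As a rough sketch, the `bodyparts` section of a single-animal project’s `config.yaml` might then look like this (the exact bodypart names here are just illustrative, not prescribed):

```yaml
# config.yaml (single-animal DLC project) -- illustrative fragment
# Prefixing each bodypart with the animal's identity keeps the
# three mice as distinct keypoints in a standard (non-maDLC) project.
bodyparts:
- animal1_nose
- animal1_tailbase
- animal2_nose
- animal2_tailbase
- animal3_nose
- animal3_tailbase
```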

We hope to get the latest version out in Nov; so I would keep your other project as you labeled it, but the “best way” to use it is not public yet… if that makes sense.

Ok, thank you very much for the help!

Hi Mathis,

Thank you for your answer. We are doing an animal behavior experiment on monkeys and have a similar issue. I have been using maDLC since the beginning, and it works well in general, but I also notice that identity swaps happen quite frequently during the refine_labels step.

We put collars of different colors around their necks, so we can actually tell them apart in the training data. In this situation, do you suggest we keep the collars and switch to normal DLC (in which case we would need to re-label our 200+ frames), or remove the collars in the video and stick with our current maDLC model? Our ultimate goal is to compare the behaviors of the two monkeys (for example by plotting the trajectories of their body parts). Can we do that with normal DLC?

Thank you in advance.

Hi @tzeriver, our next version will include ID tracking more explicitly, so it can use that info (the collars), as long as you labeled ID1, ID2, … consistently.

Hi Mathis, Thank you for your response. Glad to know that!


Hi Mathis,

We have recently been testing out the new version of DeepLabCut. The overall tracklet quality is much better than in the old version, but we still see identity switches happen sometimes. We are very consistent when labeling the training data according to their collar colors, but it seems that DLC is not using this color information when determining identities?

My understanding is that DLC is doing ID tracking solely based on the affinity score with the last frame, so if there are jumps between frames in the video, then it will lose track of the identities. Is this the case?

Thank you

Glad to hear it’s working well. Of course, not all swaps can be avoided; are you training with identity = true? Then it uses visual features to ID the animal, not just time/tracking.

But also, no:
“tracking solely based on the affinity score with the last frame, so if there are jumps between frames in the video, then it will lose track of the identities. Is this the case?”

We have both a local and a global tracker; see Figure 3 of the paper:
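For anyone reading this later: the identity flag Mathis mentions is a field in the maDLC project’s `config.yaml`, which should be set before creating the training dataset. A minimal sketch:

```yaml
# config.yaml (multi-animal DLC project) -- illustrative fragment
# With identity: true, the network is also trained to predict which
# animal each detection belongs to from visual features (e.g. collar
# color), rather than relying on temporal tracking alone.
identity: true
```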

Thanks for the response. After setting identity = true, identity swaps are significantly reduced. :grinning:

Hi Mathis,

Another issue we are facing: when two animals overlap with each other (in our videos this happens quite often), there is a large chance that the tracklets will be mixed together, i.e. one tracklet spreads across two animals. Do you think this can be solved by including more frames of this kind in our training data, or is it still an open challenge?

Thank you