Triangulate for maDLC?

Background

I’m trying to convert my 2D maDLC project to 3D by adding a second camera. I followed all the steps in the documentation (DeepLabCut/Overviewof3D.md at 736bf425e11ba8858e75e9962002a9a28718e367 · DeepLabCut/DeepLabCut · GitHub).

Problems

Everything works fine until the last step, deeplabcut.triangulate(). When I set filterpredictions to True, it gives me an "h5 does not exist" error; when I set it to False, it gives me a "list index out of range" error. I guess this is because the maDLC analyze_videos function does not generate an h5 file?
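For reference, this is roughly the call that fails (a minimal sketch; the paths and the VideoType variable are placeholders for my own setup):

import deeplabcut

config_path3d = "/path/to/3d-project/config.yaml"  # 3D project config
video_folder = "/path/to/videos"                   # folder with both camera videos
VideoType = ".mp4"

# "h5 does not exist" with filterpredictions=True,
# "list index out of range" with filterpredictions=False:
deeplabcut.triangulate(config_path3d, video_folder, videotype=VideoType,
                       filterpredictions=True)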

Does 3D DeepLabCut support maDLC, or should I try the single-animal version instead?

Any suggestion is appreciated.

3D works with single-animal DeepLabCut.

deeplabcut.triangulate was actually patched some months ago to process multi-animal data: https://github.com/DeepLabCut/DeepLabCut/commit/daac6d38978961f02a182e55e0e41588a3162af9. If you complete the final maDLC steps (cross-validating, converting your detections to tracklets, and refining them; at that point you’ll get the h5 files), I’d expect triangulate to work fine 🙂
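A rough outline of those steps in code (a sketch only: function names follow the 2.2 beta docs, and exact keyword arguments vary between beta releases, so check the docstrings of your installed version):

import deeplabcut

config_path = "/path/to/maDLC-project/config.yaml"   # one 2D project per camera
videos = ["/path/to/videos/camera-1.mp4"]

# Analyze the videos; for maDLC models this yields detections, not yet h5 tracks.
deeplabcut.analyze_videos(config_path, videos, videotype=".mp4")

# Cross-validate the animal-assembly/tracking hyperparameters.
deeplabcut.evaluate_multianimal_crossvalidate(config_path)

# Convert detections to tracklets (the box tracker produces the "_bx" suffix).
deeplabcut.convert_detections2tracklets(config_path, videos, videotype=".mp4",
                                        track_method="box")

# Refine the tracklets; saving in the GUI writes the final h5 files.
deeplabcut.refine_tracklets(config_path, "/path/to/tracklets_pickle_file.pickle",
                            videos[0])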


Thanks for your response. Glad to know that! I performed all the maDLC steps and then put the h5 files from the two 2D projects in the same directory. But triangulate still gives me a file-not-found error. I’m attaching the error message and a screenshot of the directory where my h5 files are.

I guess it’s because it doesn’t recognize the track type ("bx") in the h5 file name, which is the only difference between the single- and multi-animal versions? How can I pass this information to triangulate? Thank you.

triangulate_error_message.txt (4.8 KB)

Also, when viewing the source code, I noticed that my version of triangulate is not the same as the patched version. How can I update to the patched one? I’m already on 2.2b8.

Answering my own question: switching back to version 2.1.10.2 gets you the patched version of triangulate.

I manually changed my h5 file names so I could get past the "h5 file not found" error. Now I’m getting a ValueError at the undistortion step. I guess this is because the function doesn’t recognize that there are two individuals in the h5 file, so the dimensions don’t match?
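For anyone else stuck there, a minimal sketch of the renaming workaround (assuming the box tracker’s "_bx" suffix; adjust the pattern to your filenames):

from pathlib import Path

video_folder = Path("/path/to/videos")

# Strip the tracker suffix so triangulate finds the h5 files under the
# single-animal naming pattern it expects.
for h5 in video_folder.glob("*_bx.h5"):
    h5.rename(h5.with_name(h5.name.replace("_bx", "")))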

Below is the error message:

ValueError                                Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 deeplabcut.triangulate(config_path3d, video_folder, videotype=VideoType)

3 frames
/usr/local/lib/python3.7/dist-packages/deeplabcut/pose_estimation_3d/triangulation.py in triangulate(config, video_path, videotype, filterpredictions, filtertype, gputouse, destfolder, save_as_csv)
    300                     path_stereo_file,
    301                 ) = undistort_points(
--> 302                     config, dataname, str(cam_names[0] + "-" + cam_names[1])
    303                 )
    304                 if len(dataFrame_camera1_undistort) != len(dataFrame_camera2_undistort):

/usr/local/lib/python3.7/dist-packages/deeplabcut/pose_estimation_3d/triangulation.py in undistort_points(config, dataframe, camera_pair)
    498                     scorer_cam1, bp, "likelihood"
    499                 ] = dataframe_cam1.xs(
--> 500                     [bp, "likelihood"], level=["bodyparts", "coords"], axis=1
    501                 ).values
    502

/usr/local/lib/python3.7/dist-packages/pandas/core/indexing.py in __setitem__(self, key, value)
    668
    669         iloc = self if self.name == "iloc" else self.obj.iloc
--> 670         iloc._setitem_with_indexer(indexer, value)
    671
    672     def _validate_key(self, key, axis: int):

/usr/local/lib/python3.7/dist-packages/pandas/core/indexing.py in _setitem_with_indexer(self, indexer, value)
   1726                 if len(ilocs) != value.shape[1]:
   1727                     raise ValueError(
-> 1728                         "Must have equal len keys and value "
   1729                         "when setting with an ndarray"
   1730                     )

ValueError: Must have equal len keys and value when setting with an ndarray

Answering my own question: yes, the individuals level is the problem. I’ve edited the triangulate function so it processes multi-animal h5 files correctly, and it now runs without error.
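To illustrate the root cause with a toy example (a sketch of the idea, not my exact patch): multi-animal h5 files carry an extra "individuals" column level, so the .xs() call in undistort_points returns one column per individual instead of a single column, and the single-column assignment raises the ValueError above. Handling each individual separately restores the single-animal shape:

import numpy as np
import pandas as pd

# Toy multi-animal frame: columns are (scorer, individuals, bodyparts, coords);
# single-animal frames have only (scorer, bodyparts, coords).
cols = pd.MultiIndex.from_product(
    [["scorer"], ["mouse1", "mouse2"], ["nose"], ["x", "y", "likelihood"]],
    names=["scorer", "individuals", "bodyparts", "coords"],
)
df = pd.DataFrame(np.random.rand(5, 6), columns=cols)

# What undistort_points does: with two individuals this returns two
# columns, which cannot be assigned to a single target column.
print(df.xs(("nose", "likelihood"), level=["bodyparts", "coords"], axis=1).shape)  # (5, 2)

# One fix: loop over individuals and treat each sub-frame as single-animal.
for ind in df.columns.get_level_values("individuals").unique():
    sub = df.xs(ind, level="individuals", axis=1)
    print(ind, sub.xs(("nose", "likelihood"),
                      level=["bodyparts", "coords"], axis=1).shape)  # (5, 1)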


Hi @tzeriver @jeylau – this should work without any hacks, but if it does not, @tzeriver do you want to open a pull request on the repo for your patch?

Sure, I’ll do that. I’ll need to refine it a little bit first. Thanks.