Output DLC model as ONNX model?

Once training is complete in DeepLabCut, would it be possible to export the model in ONNX format and then run inference elsewhere? Perhaps some additional code is needed at inference time besides simply evaluating the network, but perhaps it's not much? Possible?

I haven't used this code myself, but there are definitely options for converting from TensorFlow (the format our model weights are stored in) to ONNX, e.g. https://github.com/onnx/tensorflow-onnx
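As a rough, untested sketch of what that conversion might look like with tf2onnx: the snapshot filename and the input/output node names below are placeholders (the actual node names depend on the graph of the DLC model you trained, so you'd need to inspect it first):

```shell
# Install the converter (assumption: an environment compatible with your TF version)
pip install tf2onnx

# Convert a frozen TensorFlow graph to ONNX.
# --inputs/--outputs must match the real node names in your graph;
# "Placeholder:0" and "concat_1:0" are illustrative, not guaranteed DLC names.
python -m tf2onnx.convert \
    --graphdef snapshot-1030000.pb \
    --inputs Placeholder:0 \
    --outputs concat_1:0 \
    --output model.onnx
```

The resulting model.onnx could then be loaded with a runtime such as ONNX Runtime, though note that DLC's post-processing (extracting peak coordinates from the score maps) would still need to be reimplemented on the inference side.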

Hope that helps!