Inference - export to ONNX problem

I'm trying to export DepthAnything v2 to ONNX. I know it's possible, and I just want to learn how it works.
I can get something that works at 518x518, but everything I've tried to allow variable input resolutions fails:
the export runs and produces an output, but the output is wrong.
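
For context, here's a minimal sketch of the export I'm attempting. It assumes the `DepthAnythingV2` class and the 'vits' config from the official repo; the checkpoint and output file names are just placeholders.

```python
import torch

from depth_anything_v2.dpt import DepthAnythingV2  # class/path as in the official repo

# 'vits' encoder config; other encoders use different features/out_channels
model = DepthAnythingV2(encoder='vits', features=64, out_channels=[48, 96, 192, 384])
model.load_state_dict(torch.load('depth_anything_v2_vits.pth', map_location='cpu'))
model.eval()

# 518 = 37 * 14; inputs must stay multiples of the ViT patch size (14)
dummy = torch.randn(1, 3, 518, 518)

torch.onnx.export(
    model,
    dummy,
    'depth_anything_v2_vits.onnx',
    input_names=['image'],
    output_names=['depth'],
    opset_version=17,
    # without dynamic_axes the exported graph is fixed at 518x518;
    # output assumed (N, H, W) here - adjust if your fork returns (N, 1, H, W)
    dynamic_axes={
        'image': {2: 'height', 3: 'width'},
        'depth': {1: 'height', 2: 'width'},
    },
)
```

My guess is that tracing bakes the positional-embedding interpolation in at 518x518, which would explain wrong outputs at other sizes, but I haven't verified that.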

Does anyone know who exported the ONNX model that does work in Flame?



Can you clarify what you mean by exporting to ONNX?

I assume you're writing Python code to convert an existing model checkpoint into an ONNX model?


Yes, a PyTorch export, going from .pth to ONNX.
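
To check whether an export actually generalizes, one quick test is to compare ONNX Runtime output against the PyTorch model at a resolution other than 518x518. A sketch, reusing `model` and the file name from the snippet above:

```python
import numpy as np
import onnxruntime as ort
import torch

sess = ort.InferenceSession('depth_anything_v2_vits.onnx',
                            providers=['CPUExecutionProvider'])

# pick a resolution other than 518x518; keep dims multiples of 14
x = torch.randn(1, 3, 392, 644)

with torch.no_grad():
    ref = model(x).numpy()  # PyTorch reference output

out = sess.run(None, {'image': x.numpy()})[0]
# a large difference means the exported graph didn't generalize to this size
print('max abs diff:', np.abs(out - ref).max())
```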