[discussion / planning] How to move forward with full ONNX support #981
Replies: 3 comments 20 replies
-
Hi Felix,
-
@felixdittrich92 I see the docs are updated for exporting to ONNX, but is inference/loading working as of now?
-
I think we can close the discussion now with https://github.com/felixdittrich92/OnnxTR :)
-
Hi @frgfm @charlesmindee and all other 👋,
we will soon be at a state where the library can export its models to ONNX format (full support planned for 0.6).
So I think it would be good to discuss further work on this topic.

As a user I want:
- to use `ocr_predictor` without installing PyTorch or TensorFlow as a dependency; `onnxruntime` should be enough

There are some open questions:
- `ocr_predictor`: maybe introducing an `onnx_ocr_predictor` makes sense?
- What about `push_to_hub` and `from_hub`?

Feel free to add to this list if I missed anything 🤗
Any feedback is very very welcome ❤️
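To make the "onnxruntime should be enough" idea concrete, here is a minimal sketch of how a predictor factory could pick its backend at import time, preferring the lightweight ONNX path when neither deep-learning framework is installed. Everything here is hypothetical (the function name `select_backend` and the backend labels are not docTR API), and the availability probing uses only the standard library:

```python
# Hypothetical sketch: backend dispatch for an ONNX-aware ocr_predictor.
# select_backend and the "onnx"/"torch"/"tf" labels are illustrative only,
# not part of docTR's actual API.
from importlib.util import find_spec


def select_backend(installed=None):
    """Return the backend an ONNX-aware predictor factory could dispatch to.

    `installed` is an iterable of available package names; when omitted, it
    is probed with importlib so the sketch stays dependency-free.
    """
    if installed is None:
        installed = {
            name
            for name in ("torch", "tensorflow", "onnxruntime")
            if find_spec(name) is not None
        }
    else:
        installed = set(installed)

    # Lightweight path: only onnxruntime is present, no DL framework needed.
    if "onnxruntime" in installed and not ({"torch", "tensorflow"} & installed):
        return "onnx"
    if "torch" in installed:
        return "torch"
    if "tensorflow" in installed:
        return "tf"
    raise RuntimeError("no supported inference backend installed")
```

With a dispatch like this, `ocr_predictor` could transparently fall back to exported ONNX models in an onnxruntime-only environment, which might make a separate `onnx_ocr_predictor` unnecessary; that trade-off is exactly the open question above.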