How to quantize ViT model with quantization aware training #374
Labels: bug
peterjc123 added the bug and question labels and removed the bug label (Oct 30, 2024)
Just noticed that you are not using the quantized graph rewrite of TinyNN, as I can see the following option in your code.
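For context, the graph rewrite that QAT relies on wraps floating-point ops in fake-quantize nodes during training. A minimal sketch of what a per-tensor asymmetric fake-quantize step computes, in plain Python (the function name and defaults are illustrative, not TinyNN's API):

```python
def fake_quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Simulate uint8 quantization during training: quantize, clamp, dequantize.

    This mirrors the round trip that fake-quantize nodes perform, so the
    model learns weights that survive real integer quantization.
    """
    q = round(x / scale) + zero_point
    q = max(qmin, min(qmax, q))       # clamp to the representable 8-bit range
    return (q - zero_point) * scale   # dequantize back to float
```

Values outside the representable range are clamped, which is one source of accuracy loss that QAT lets the model adapt to.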
I modified the QAT settings, but another error appears.
This is my whole code:
peterjc123 added the bug label and removed the question label (Oct 31, 2024)
We will take a look.
Training the ViT model from Hugging Face Transformers works,
but when converting to a TFLite model an error message appears that I can't resolve.
The following are the TinyNN settings and the error message.
Transformers version is 4.26.0
The error message:
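For background on the QAT settings involved: quantization observers derive a scale and zero point from the observed value range. A minimal sketch of asymmetric per-tensor parameter computation in plain Python (the function name is illustrative, not part of TinyNN):

```python
def qparams_from_range(rmin, rmax, qmin=0, qmax=255):
    """Compute (scale, zero_point) for asymmetric uint8 quantization.

    The range is widened to include 0.0 so that zero is exactly
    representable, as most integer backends require.
    """
    rmin, rmax = min(rmin, 0.0), max(rmax, 0.0)
    scale = (rmax - rmin) / (qmax - qmin)
    zero_point = int(round(qmin - rmin / scale))
    return scale, zero_point
```

Symmetric quantization instead fixes the zero point at the middle of the integer range; which mode applies is typically a quantizer configuration choice.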