TRAINED QNN MODEL WITH BREVITAS CANNOT BE EXPORTED AS ONNX MODEL WITH FINN MANAGER #609
Unanswered
mllearner98 asked this question in Q&A
Replies: 1 comment
-
Hi, I also encountered this problem today. It can be solved by using torch 1.8.0 instead. Sincerely
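A minimal way to apply the fix suggested above is to pin the PyTorch version in the environment, e.g. in a requirements file (including brevitas here is an assumption about the rest of the setup):

```text
torch==1.8.0
brevitas
```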
-
Hello,
I have trained my model as a QNN with Brevitas. Basically, my input shape is:
I have exported the .pt file. When I test the model and generate a confusion matrix, I can observe everything I expect, so I believe there is no problem with the model itself.
On the other hand, when I try to export the .onnx file to implement this Brevitas-trained model on FINN, I wrote the code given below:
But when I do that, I get the following error:
I do not think this is related to the version, but if you want me to be sure about the versions, I can check them too.
I would really appreciate any help.
Sincerely;