When I try to convert my QAT model to ONNX, I get the following error:
RuntimeError:
temporary: the only valid use of a module is looking up an attribute but found = prim::SetAttr[name="num_batches_tracked"](%128, %155)
Hi,
Are you using the PyTorch framework?
This is a known issue in PyTorch, and it is still open:
https://github.com/pytorch/pytorch/issues/34002
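As a side note, a workaround that often avoids this particular error is to switch the model to eval mode before export, so BatchNorm layers stop writing their num_batches_tracked buffers. Below is a minimal sketch with a toy Conv-BN network standing in for your QAT model; the layers, input shape, and file name are placeholders, not taken from your script.

```python
import torch
import torch.nn as nn

# Toy Conv-BN network standing in for the real QAT model.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),   # owns the num_batches_tracked buffer
    nn.ReLU(),
)

# Key step: eval() freezes BatchNorm running-stat updates, so the exported
# graph no longer contains the prim::SetAttr on num_batches_tracked.
model.eval()

dummy = torch.randn(1, 3, 32, 32)   # placeholder input shape
torch.onnx.export(model, dummy, "model.onnx", opset_version=13,
                  input_names=["input"], output_names=["output"])
```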
Thanks.
Thank you. Maybe the error was in my QAT model. Now I have a correct QAT model, but I get the following error:
Unknown type int encountered in graph lowering. This type is not supported in ONNX export.
Does this mean an INT8 model cannot be converted to ONNX?
convert_to_onnx.py (681 Bytes)
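(For reference: ONNX itself can represent INT8 quantization through QuantizeLinear/DequantizeLinear nodes. The usual PyTorch-to-TensorRT QAT path exports the fake-quantized model, before it is converted to real int8 ops, and lets TensorRT do the int8 lowering. The sketch below illustrates that export with the eager-mode QAT API; the toy network, qconfig choices, and file names are placeholders and are not taken from the attached convert_to_onnx.py, and whether it exports cleanly depends on the PyTorch and opset versions.)

```python
import torch
import torch.nn as nn
from torch.ao.quantization import (QConfig, FakeQuantize, MovingAverageMinMaxObserver,
                                   QuantStub, DeQuantStub, prepare_qat)

# Toy stand-in for the real network; the actual model lives in convert_to_onnx.py.
class TinyQATNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = QuantStub()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        self.dequant = DeQuantStub()

    def forward(self, x):
        return self.dequant(self.relu(self.conv(self.quant(x))))

# Plain per-tensor fake-quant settings (placeholder choices, not from the script).
qconfig = QConfig(
    activation=FakeQuantize.with_args(observer=MovingAverageMinMaxObserver,
                                      quant_min=0, quant_max=255),
    weight=FakeQuantize.with_args(observer=MovingAverageMinMaxObserver,
                                  quant_min=-128, quant_max=127,
                                  dtype=torch.qint8,
                                  qscheme=torch.per_tensor_symmetric),
)

model = TinyQATNet()
model.qconfig = qconfig
model = prepare_qat(model.train())

# ... the QAT fine-tuning loop would run here ...

# Export the fake-quantized model (i.e. before torch.ao.quantization.convert):
# the ONNX graph then carries QuantizeLinear/DequantizeLinear nodes that
# TensorRT can consume, instead of eager-mode int8 ops the exporter rejects.
model.eval()
dummy = torch.randn(1, 3, 32, 32)   # placeholder input shape
torch.onnx.export(model, dummy, "qat_qdq.onnx", opset_version=13,
                  input_names=["input"], output_names=["output"])
```

If the script instead uses NVIDIA's pytorch-quantization toolkit, the analogous step is to enable that toolkit's fake-quant ONNX export mode before calling torch.onnx.export, so the Q/DQ nodes are emitted in the graph.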
Hi,
This error is caused by a conflict between the input formats expected by ONNX and TensorRT.
Please check the comment below for details:
Thanks.