I’m trying to run INT8 quantization of my model using trtexec, but I’m encountering errors during the calibration step. Below are the steps I followed and the corresponding error logs.
Hi @vic01 ,
I tried reproducing this issue with a similar workflow and it completed successfully without errors. However, my environment is different from yours.
Key observations:
TensorRT version mismatch - You’re using TensorRT 8.6.2.3 while the latest is 10.x. This is likely the main issue. The calibration cache format and INT8 calibration algorithms have evolved significantly between these versions.
Calibration table mismatch error - The error “Calibration table does not match calibrator algorithm type” suggests one of the following:
Polygraphy (which likely uses a newer calibration algorithm) generated a cache that trtexec 8.6.2.3 can’t read properly, or
There might be some other algorithm mismatch between how the cache was generated and how it is consumed.
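A quick way to check for this is to look at the first line of the cache file, since TensorRT’s built-in calibrators record the algorithm there (a rough illustration; the file name and exact header text will differ on your setup):
head -n 1 calibration.cache
# prints something like: TRT-8602-EntropyCalibration2
# if the engine build uses a different calibrator than the one named here, the mismatch error appears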
Suggested solutions:
Upgrade TensorRT - If possible, upgrade to TensorRT 8.6.10+ or ideally 10.x on your Jetson Orin. JetPack 6.x should support newer TensorRT versions.
Use consistent tools - Try using only trtexec for both calibration and engine building instead of mixing Polygraphy and trtexec (see the example command after this list).
Check Polygraphy version compatibility - Ensure your Polygraphy version matches your TensorRT version.
Memory access error - The illegal memory access error might indicate insufficient memory or a bug in TensorRT 8.6.2.3 that was fixed in later versions.
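For the “consistent tools” suggestion above, a minimal trtexec invocation that builds the INT8 engine directly from an existing calibration cache could look like this (a sketch only; the file names are placeholders):
trtexec --onnx=model.onnx \
        --int8 \
        --calib=calibration.cache \
        --saveEngine=model_int8.engine \
        --verbose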
Since this appears to be Jetson/ARM-specific and related to an older TensorRT version, I’d recommend opening a case on the Jetson Forums with:
Your complete error logs
The ONNX model (if shareable) or at least model architecture details
Your calibration data loader script
The generated calibration cache file
The Jetson team will have more insight into platform-specific issues and can verify if this is a known bug in TensorRT 8.6.2.3 for Jetson.
When I install Polygraphy and try to run it, it asks for TensorRT to be installed. For that reason, I created a virtual environment that includes TensorRT 8.6.2.3:
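Roughly, the setup looks like this (the exact commands and names below are illustrative rather than my literal ones):
# venv with access to the system-installed TensorRT Python bindings
python3 -m venv --system-site-packages trt_env
source trt_env/bin/activate
pip install polygraphy
python3 -c "import tensorrt; print(tensorrt.__version__)"   # 8.6.2.3 in my case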
So, in theory, there shouldn’t be any compatibility issues if Polygraphy uses the TensorRT installation available in the same environment.
Or is that assumption incorrect?
Regarding the following suggestion:
How should the calibration script calibrator_script.py be structured or implemented?
So, in theory, there shouldn’t be any compatibility issues if Polygraphy uses the TensorRT installation available in the same environment.
Or is that assumption incorrect?
Even though Polygraphy uses your installed TensorRT 8.6 at runtime, when you run pip install polygraphy, you’re getting the latest version of Polygraphy (likely 0.49.x), which is designed for newer TensorRT versions.
# Check your current Polygraphy version
pip show polygraphy
# Downgrade to match your TensorRT version (or upgrade both)
pip install polygraphy==0.47.1
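After reinstalling, it’s worth confirming that Polygraphy and TensorRT resolve from the same environment (a quick sanity check; the version strings will differ on your setup):
python3 -c "import polygraphy; print(polygraphy.__version__)"
python3 -c "import tensorrt; print(tensorrt.__version__)"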
How should the calibration script calibrator_script.py be structured or implemented?
The issue is most likely a version mismatch. You can continue using your current Polygraphy + trtexec approach; just make sure the versions are compatible and let me know how it goes. If the problem persists, please post on the Jetson Forum with your complete logs, as this appears to be Jetson-specific rather than a general TensorRT issue.
The issue was indeed a version mismatch; however, the version you mentioned does not exist:
pip install polygraphy==0.47.1
I tried instead:
pip install polygraphy==0.48.1
and it worked.
Thanks a lot for the help!
Moreover, could you please provide some guidance or a reference on how to implement calibrator_script.py? I’d like to use trtexec exclusively, without mixing multiple tools.