TensorRT for aarch64.whl


Where can I get the TensorRT Python installer (.whl) for aarch64, or could someone build the wheel for me, please? The version I need is TensorRT for CUDA 10.2.

I have a smart cam with TensorRT installed for converting ONNX models to TRT engines. When I try to run the model in Python, I get an error like this:

[09/25/2023-02:38:52] [TRT] [E] 1: [stdArchiveReader.cpp::StdArchiveReader::30] Error Code 1: Serialization (Serialization assertion magicTagRead == magicTag failed.Magic tag does not match)
[09/25/2023-02:38:52] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
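The "magic tag does not match" assertion means the runtime rejected the serialized engine's header before it even tried to deserialize the network: TensorRT engines embed a format/version tag, and an engine built with one TensorRT version (or on a different platform) cannot be loaded by another. A toy sketch of the general header-check pattern, with illustrative constants that are not TensorRT's actual on-disk format:

```python
import struct

MAGIC = 0x74727431  # hypothetical magic tag, NOT TensorRT's real value

def serialize(payload: bytes, version: int) -> bytes:
    # Prefix the payload with a magic tag and the builder's version.
    return struct.pack("<II", MAGIC, version) + payload

def deserialize(blob: bytes, expected_version: int) -> bytes:
    # Reject blobs whose header doesn't match this runtime,
    # analogous to TensorRT's magicTagRead == magicTag assertion.
    magic, version = struct.unpack_from("<II", blob)
    if magic != MAGIC:
        raise ValueError("magic tag does not match")
    if version != expected_version:
        raise ValueError(
            f"engine built with version {version}, "
            f"runtime expects {expected_version}"
        )
    return blob[8:]
```

The practical consequence is that an engine file is not portable: it must be rebuilt from the ONNX model on the target device, with the same TensorRT version that will run it.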


TensorRT Version: tensorrt arm64


We recommend that you try the latest TensorRT version, 8.6.1.
You can download it from here:

Installation document:

If you still face the same issue, please share a minimal repro with us: the ONNX model, scripts, and complete verbose logs for better debugging.

Thank you.

I can’t upgrade my TRT because of my system’s architecture: the available TRT downloads are for Linux amd64, but my device uses ARM64. Where can I get the ARM64 installation?
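One way to confirm which TensorRT build a device needs is to check the machine architecture Python reports, since the wheel's platform tag (e.g. aarch64 vs x86_64) must match it. A minimal check:

```python
import platform

# The wheel's architecture tag must match this value:
# typically 'aarch64' on Jetson-class ARM64 devices, 'x86_64' on amd64.
arch = platform.machine()
print(arch)
```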

We recommend that you check the shared link above; you can download TensorRT for ARM from there.