TensorRT for aarch64.whl

Description

Where can I get the TensorRT Python wheel (.whl) for aarch64, or could someone build the wheel file for me? The version I need is tensorrt-8.2.1.8 for cp38 and CUDA 10.2.

I have a smart camera with TRT 8.2.1.8 installed that I use to convert an ONNX model to TensorRT. When I try to run the model in Python, I get an error like this:

[09/25/2023-02:38:52] [TRT] [E] 1: [stdArchiveReader.cpp::StdArchiveReader::30] Error Code 1: Serialization (Serialization assertion magicTagRead == magicTag failed.Magic tag does not match)
[09/25/2023-02:38:52] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
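
This magic-tag assertion typically appears when the serialized engine was built with a different TensorRT version than the runtime that tries to load it. As a minimal sketch (assuming the standard tensorrt Python bindings are installed, and with model.engine as a placeholder name for the serialized engine file), I check the runtime version and reproduce the deserialization step like this:

import tensorrt as trt

# Version of the TensorRT Python runtime actually in use;
# the engine must have been built with this same version.
print("TensorRT runtime version:", trt.__version__)

logger = trt.Logger(trt.Logger.VERBOSE)
runtime = trt.Runtime(logger)

# "model.engine" is a placeholder path for the engine produced
# on the device by the ONNX -> TensorRT conversion.
with open("model.engine", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

# deserialize_cuda_engine returns None (and logs errors like the
# ones above) when the engine and runtime versions do not match.
print("Deserialization succeeded:", engine is not None)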

Environment

TensorRT Version: tensorrt 8.2.1.8-1+cuda10.2 arm64

Hi,

We recommend you try the latest TensorRT version 8.6.1.
You can download it from here:

Installation document:

If you still face the same issue, please share with us a minimal ONNX model that reproduces the issue, the scripts, and complete verbose logs for better debugging.

Thank you.

I can’t upgrade my TRT because of the different system architecture: the available TRT downloads are for Linux amd64, while my device uses ARM64. Where can I get the ARM64 installation?
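
For reference, a quick standard-library check (no TensorRT needed) to confirm which Python tag and CPU architecture the wheel has to match on my device:

import platform
import sys

# A matching wheel must carry this Python tag (e.g. cp38)
# and this platform architecture (aarch64 on my device).
print("Python tag  : cp%d%d" % (sys.version_info[0], sys.version_info[1]))
print("Architecture:", platform.machine())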

We recommend that you check the shared link.
You can download TensorRT for ARM using the link above.