Has anyone tried using Triton on Jetson TX2 NX?

The JetPack version I am using is 4.5. Can I start the Triton server with Docker, or use it directly on the Jetson? When I use this repo (tensorrtx/yolov5 at yolov5-v5.0 · wang-xinyu/tensorrtx · GitHub) to convert to a TensorRT engine file, I always get this error: [W] [TRT] TensorRT was linked against cuBLAS/cuBLAS LT 10.2.3 but loaded cuBLAS/cuBLAS LT 10.2.2.
How can I fix it, or is there another way to use the Triton server?

Hi,

cuBLAS 10.2.3 is available on JetPack 4.6.x.
Would you mind upgrading your device and giving it a try?

Thanks.

Can I start a Triton server with Docker?

Hi,

For JetPack 4, there are dependencies between the Docker container and the Jetson's native OS.
So the container and the device are expected to run the same JetPack version.
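One way to check that the versions match is to read the L4T release string on the device (from /etc/nv_tegra_release) and compare it against the container tag (e.g. l4t-base:r32.6.1). Below is a hypothetical sketch that parses a sample release line; the exact file format and the helper name are assumptions, not part of any official tooling:

```shell
# Sample first line of /etc/nv_tegra_release on a JetPack 4.6 device (format assumed):
release_line='# R32 (release), REVISION: 6.1, GCID: 27863751, BOARD: t186ref'

# Extract "32.6.1" so it can be matched against a container tag such as l4t-base:r32.6.1
l4t_version() {
  echo "$1" | sed -E 's/^# R([0-9]+) \(release\), REVISION: ([0-9.]+),.*/\1.\2/'
}

l4t_version "$release_line"   # -> 32.6.1
```

On the device itself you would feed the real first line of /etc/nv_tegra_release into the helper instead of the sample string.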

Thanks.

On this page (use docker pull triton on jetson · Issue #3753 · triton-inference-server/server · GitHub), @CoderHam says "Triton does not have a docker image for jetson." I wonder if this is a dead end on Jetson?

Hi,

Triton is available on Jetson.

For JetPack 4, our container mounts the libraries directly from the Jetson.
It looks like you need the cuBLAS 10.2.3 libraries in the container.
So please set up the device with JetPack 4.6, which contains cuBLAS 10.2.3; the container will then pick up the correct version when mounting the libraries.
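As a sanity check after upgrading, you can compare the cuBLAS version on the device against the one TensorRT reports in the warning. A minimal sketch, assuming the usual JetPack 4.x library path on aarch64 and a hypothetical helper that pulls the version out of a cuBLAS soname:

```shell
# On the Jetson, list the installed cuBLAS libraries (path assumed for JetPack 4.x):
# ls /usr/lib/aarch64-linux-gnu/libcublas.so.*

# Helper: extract major.minor.patch from a cuBLAS soname,
# e.g. libcublas.so.10.2.3.300 -> 10.2.3
cublas_version() {
  echo "$1" | sed -E 's/^libcublas\.so\.([0-9]+\.[0-9]+\.[0-9]+).*/\1/'
}

cublas_version "libcublas.so.10.2.3.300"   # -> 10.2.3
```

If the extracted version matches the "linked against" version in the TensorRT warning (10.2.3 here), the mismatch warning should go away.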

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.