Jetson Nano - Triton Inference Server

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Nano
• DeepStream Version
DeepStream 5.0
• JetPack Version (valid for Jetson only)
JetPack 4.4
• TensorRT Version
7.1.0-1+cuda10.2
• NVIDIA GPU Driver Version (valid for GPU only)

Hi,

I have searched the forums previously but did not find the exact answer to my question. I am sorry if this has already been answered; I am totally new to Jetson devices and DeepStream.

From what I understand, the Triton Inference Server is supported with DeepStream 5.0.

However, I cannot find the exact documentation for using the Triton Inference Server on Jetson devices. I do see that it is not supported on Jetson Nano/TX1/TX2 (extract from the release notes below):

“Triton Inference Server is not supported on Jetson Nano, TX1 and TX2 in this release”

However, for other Jetson devices, where can I find instructions on using the Triton inference server?

Thank you

Svetlana

Here are some reference docs:
DeepStream Triton Inference Server Usage Guidelines

Gst-nvinferserver

Some reference commands for running the Triton SSD sample on Jetson NX:

1. Install JetPack 4.4 DP with DeepStream 5.0 on Jetson NX
2. Install DeepStream Python
2.1 download deepstream_python_v0.9.tbz2 from https://developer.nvidia.com/deepstream-download
2.2 install the DeepStream Python bindings
   $ tar xpf deepstream_python_v0.9.tbz2
   $ cd deepstream_python_v0.9/
   $ tar xpf ds_pybind_v0.9.tbz2 -C /opt/nvidia/deepstream/deepstream-5.0/sources/
2.3 prepare the Triton model repository
   $ cd /opt/nvidia/deepstream/deepstream-5.0/samples/
   $ ./prepare_ds_trtis_model_repo.sh
2.4 run the sample
  $ cd /opt/nvidia/deepstream/deepstream/sources/python/apps/deepstream-ssd-parser
  $ LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1 python3 deepstream_ssd_parser.py ../../../../samples/streams/sample_720p.h264
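For context on what step 2.3 sets up: the deepstream-ssd-parser app drives the Gst-nvinferserver plugin with a protobuf-text config file that points at the Triton model repository created by prepare_ds_trtis_model_repo.sh. Below is a minimal sketch of such a config; the model name, repository path, and field values are assumptions based on the SSD sample layout, not verified values, so check them against the config file shipped with the sample:

```
# Hypothetical sketch of a gst-nvinferserver config (protobuf text format).
# Model name and repo path are assumptions; verify against the shipped sample config.
infer_config {
  unique_id: 5
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    trt_is {
      model_name: "ssd_inception_v2_coco_2018_01_28"  # assumed model name
      version: -1                                     # use the latest model version
      model_repo {
        root: "../../../../samples/trtis_model_repo"  # created by prepare_ds_trtis_model_repo.sh
        log_level: 2
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_NONE
    normalize {
      scale_factor: 1.0
    }
  }
  postprocess {
    other {}  # raw tensor output; parsing is done in the Python app
  }
}
input_control {
  process_mode: PROCESS_MODE_FULL_FRAME
  interval: 0
}
output_control {
  output_tensor_meta: true  # attach raw tensors so the custom Python parser can read them
}
```

The key design point is that bounding-box parsing happens in Python via the attached tensor meta (output_tensor_meta: true) rather than in a C++ postprocessing library, which is why the postprocess block is left as a raw-output placeholder.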

Hi!

Thank you so much for your answer. Is this supported on Jetson Nano?

Best regards

Svetlana

No, as you saw in https://docs.nvidia.com/metropolis/deepstream/DeepStream_5.0_Release_Notes.pdf:

“Triton Inference Server is not supported on Jetson Nano, TX1 and TX2 in this release.”

Hi,

Thank you, I wanted to confirm… are there plans to have it work on Jetson Nano or TX2?

Thanks again

Svetlana

DS 5.0 GA will support all these.

AFAIK, DS 5.0 DP should support TX2 already.

Thanks!

Thanks so much for your reply

Svetlana