Triton Inference Server on Jetson Xavier

How do I run Triton Inference Server on a Jetson Xavier NX?

Please refer to the forum thread Deploying Models from TensorFlow Model Zoo Using NVIDIA DeepStream and NVIDIA Triton Inference Server (Intelligent Video Analytics / DeepStream SDK, NVIDIA Developer Forums).

Hello Abhijith,
You can refer to the second-to-last comment in the following thread for building your Triton (TRTIS) container on L4T/Jetson:
Run on Jetson · Issue #1468 · triton-inference-server/server (github.com)
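Once you have the server built and running on the Jetson (the GitHub issue above covers the container build, and the server is typically launched with tritonserver --model-repository=<path>), you can sanity-check it from Python with the tritonclient package. The sketch below is a minimal example under a few assumptions: the server is listening on its default HTTP port 8000 on localhost, and "resnet50" is just a placeholder model name for whatever is actually in your model repository.

```python
# Minimal sketch: verify a Triton server running on the Jetson is reachable.
# Assumptions: server started with `tritonserver --model-repository=<path>`,
# default HTTP port 8000 on localhost, placeholder model name "resnet50".
# Install the client with: pip install tritonclient[http]
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Liveness/readiness checks exposed by Triton's HTTP/REST API
print("server live: ", client.is_server_live())
print("server ready:", client.is_server_ready())

# Replace "resnet50" with a model actually present in your model repository
print("model ready: ", client.is_model_ready("resnet50"))
```

If these checks pass, you can go on to build inference requests with httpclient.InferInput and httpclient.InferRequestedOutput against your own model.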