TensorRT Docker to run PyTorch/TensorFlow/ONNX models on an NVIDIA Quadro M2200

Dear Sir,

I am trying to find a way to run models built with PyTorch/TensorFlow/Caffe/ONNX using TensorRT on my Linux PC, which has a Quadro M2200 GPU.

Query 1: I came across the approach of running models inside a Docker container. Can you please confirm whether this is the correct way to run models directly on a Linux PC with the NVIDIA drivers installed?

Query 2: Will it be possible to run the same model using a Docker image on the Jetson AGX Xavier developer kit?

Please let me know where I should start, and point me to the relevant guide.

Thanks and Regards,
Vyom Mishra