Custom TensorRT Container

I have converted an ONNX model to a TensorRT engine. I would like to create a custom docker image for the TensorRT engine with cuda and all its python dependencies. Could you please help me with how to do this?

Hi,
Please refer to the link below for a custom plugin implementation and sample:
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/sampleOnnxMnistCoordConvAC

Thanks!

Hi @jcdulo,

We suggest you use the TensorRT NGC container, which comes with all the required system dependencies.
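If you want to layer your engine and Python dependencies on top of that container, a minimal Dockerfile sketch could look like the following. The image tag (`23.08-py3`) and the file names (`model.engine`, `requirements.txt`, `infer.py`) are illustrative placeholders; substitute the tag matching your TensorRT/CUDA version, since a serialized engine is only valid for the TensorRT version it was built with.

```dockerfile
# Sketch: extend the TensorRT NGC base image (tag is an example; pick one
# matching the TensorRT version the engine was serialized with).
FROM nvcr.io/nvidia/tensorrt:23.08-py3

WORKDIR /workspace

# Copy the serialized engine, inference script, and Python requirements
# (all names here are placeholders for your own files).
COPY model.engine requirements.txt infer.py /workspace/

# Install extra Python dependencies on top of the container's
# existing CUDA + TensorRT Python stack.
RUN pip install --no-cache-dir -r requirements.txt

CMD ["python3", "infer.py"]
```

Build and run with GPU access (requires the NVIDIA Container Toolkit on the host):

```shell
docker build -t my-trt-app .
docker run --rm --gpus all my-trt-app
```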

Please refer to the TensorRT container page in the NGC catalog.

Thank you.
