Which container should be used for inference of TensorRT engine files on Jetson devices?

I want to run FPEnet on a Jetson Xavier NX.

Which Docker container should I download? I have JetPack 4.6.1.


You can use the l4t-tensorrt container for TensorRT inference.

Please note that this is a runtime container, so the binary is expected to be pre-compiled outside of the container.
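A minimal sketch of pulling and running the container on the device. The tag `r8.2.1-runtime` is an assumption chosen to match the TensorRT version that ships with JetPack 4.6.1; check the NGC catalog for the tag that matches your installed JetPack, and note that `/path/to/models` is a placeholder for wherever your pre-built `.engine` file lives.

```shell
# Pull the L4T TensorRT runtime image from NGC.
# Tag is an assumption for JetPack 4.6.1 (TensorRT 8.2.1) -- verify on NGC.
docker pull nvcr.io/nvidia/l4t-tensorrt:r8.2.1-runtime

# Run it with the NVIDIA runtime so the container can access the GPU.
# Mount the directory containing your pre-compiled engine file
# (built outside the container, e.g. with trtexec on the host).
docker run -it --rm --runtime nvidia \
    -v /path/to/models:/models \
    nvcr.io/nvidia/l4t-tensorrt:r8.2.1-runtime
```

Inside the container you can then point your inference application at `/models/your_model.engine`. Because TensorRT engines are specific to the GPU architecture and TensorRT version, build the engine on the same Jetson (and JetPack release) you deploy on.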

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.