How to build Caffe2 on TX2 with TensorRT?

I followed the guide https://caffe2.ai/docs/getting-started.html?platform=tegra&configuration=compile but always got an error telling me that the onnx-tensorrt build failed.

What should I do to build from source successfully?

https://github.com/pytorch/pytorch/issues/8167 says "Caffe2 supports integration with TensorRT 3.0." What does this mean? Can we use TensorRT to accelerate my detection framework in Caffe2?
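For reference, here is a rough sketch of the build steps I am attempting, assuming the pytorch source tree and its `USE_TENSORRT` CMake option; the exact flags and job count are assumptions on my side, not taken from the guide:

```shell
# Sketch: build Caffe2 (from the pytorch tree) with TensorRT support on a Jetson TX2.
# Assumes JetPack already provides CUDA, cuDNN, and TensorRT system-wide.
git clone --recursive https://github.com/pytorch/pytorch.git
cd pytorch

# onnx-tensorrt is a git submodule; a stale or missing checkout is a
# common cause of the onnx-tensorrt build failure, so refresh it first.
git submodule update --init --recursive

mkdir build && cd build
# USE_TENSORRT=ON enables the Caffe2/TensorRT integration path.
cmake .. \
  -DUSE_CUDA=ON \
  -DUSE_TENSORRT=ON \
  -DCMAKE_BUILD_TYPE=Release
# TX2 is memory-constrained; a low parallel job count avoids OOM kills.
make -j4
```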

Hi,

The script is not up-to-date.

Could you check if this topic helps?
https://devtalk.nvidia.com/default/topic/1042821/pytorch-install-with-python3-broken/

Thanks.