How to build Caffe2 on TX2 with TensorRT?

I followed the guide Install | Caffe2 but always got an error telling me that onnx-tensorrt failed to build.

What should I do to build from source successfully?

Caffe2 fails to build with TensorRT on the Jetson TX2 · Issue #8167 · pytorch/pytorch · GitHub says 'Caffe2 supports integration with TensorRT 3.0.' What does this mean? Can we use TensorRT to accelerate my detection framework in Caffe2?

Hi,

The script is not up-to-date.

Could you check whether this topic helps?
[url]https://devtalk.nvidia.com/default/topic/1042821/pytorch-install-with-python3-broken/[/url]

Thanks.
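For reference, a rough sketch of a source build with the TensorRT bridge enabled, assuming a recent pytorch/pytorch checkout where Caffe2 lives in-tree and the `USE_TENSORRT` switch is honored by `setup.py`; exact flags and paths on your JetPack install may differ, so treat this as a starting point rather than a verified recipe:

```shell
# Hedged sketch: build Caffe2 (in the PyTorch tree) with TensorRT support on a TX2.
# The --recursive clone pulls in third-party submodules, including onnx-tensorrt,
# which is the component reported as failing in this thread.
git clone --recursive https://github.com/pytorch/pytorch.git
cd pytorch

# Enable the Caffe2<->TensorRT integration. This assumes TensorRT headers and
# libraries from JetPack are already installed in their default locations.
export USE_TENSORRT=1

# Build and install for the current user.
python3 setup.py install --user
```

If onnx-tensorrt still fails to compile, it is worth confirming that the submodule revision matches the TensorRT version shipped with your JetPack release before retrying.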