04_video_dec_trt: How to use a PyTorch model (.pt) in the sample?

Hello,

Only a Caffe model is provided as an example in 04_video_dec_trt.

How do I use a PyTorch model (.pt)?
Can you show me an example?

Thank you.

Hi,

The backend inference engine is TensorRT.
TensorRT doesn’t support .pt directly; please convert the model into ONNX format first.

After that, just replace the Caffe parser with the ONNX parser and it should work.

Thanks.

Hello,

I thought a PyTorch model could be used directly with TensorRT, as follows.

The Caffe model comes as two files: .prototxt and .caffemodel.

For a PyTorch model, what are the extensions for the deploy file and model file?

Can you show me an example?

./video_dec_trt 2 ../../data/Video/sample_outdoor_car_1080p_10fps.h264
../../data/Video/sample_outdoor_car_1080p_10fps.h264 H264
--trt-deployfile ../../data/Model/resnet10/resnet10.prototxt
--trt-modelfile ../../data/Model/resnet10/resnet10.caffemodel
--trt-mode 0

  1. I saw something called TRTorch at GTC 2020. How do I use it? Do you have a guide?

Thank you.

Hello,
A21864 TRTorch A PyTorch TorchScript Compiler Targeting NVIDIA GPUs Using TensorRT_1601770370484001NQu2 (1).zip (998.6 KB)
Do you have any comments?

  1. I know there are torch2trt and TRTorch (introduced at GTC 2020; see the attached file).

  2. I know that TRTorch has not been released yet. When will it be released?

  3. Are there any plans to provide a way to use a PyTorch model directly in the 04_video_dec_trt sample?

torch2trt URL:

Thank you.

Hello, @AastaLLL

I changed nvcaffeparser1 to nvonnxparser and compiled it, but the compiler doesn’t seem to recognize the nvonnxparser namespace.

What am I supposed to do?

Thank you.

Hi,

Please check the file below as a sample:

/usr/src/tensorrt/samples/common/sampleEngines.cpp

 case ModelFormat::kONNX:
 {
    using namespace nvonnxparser;
    parser.onnxParser.reset(createParser(network, sample::gLogger.getTRTLogger()));
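A fuller sketch of the same idea, for reference. This is a minimal, hedged example assuming TensorRT 8.x headers and libraries are installed (link with -lnvinfer -lnvonnxparser); the file name model.onnx is a placeholder. If the nvonnxparser namespace is not recognized, the usual cause is a missing NvOnnxParser.h include or a missing -lnvonnxparser at link time:

```cpp
// Minimal sketch: parse an ONNX model into a TensorRT network.
// Assumes TensorRT 8.x; compile with: g++ ... -lnvinfer -lnvonnxparser
#include <NvInfer.h>
#include <NvOnnxParser.h>  // without this include, the nvonnxparser namespace is undefined
#include <cstdint>
#include <iostream>
#include <memory>

// Simple logger required by the TensorRT builder and parser.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
            std::cerr << msg << std::endl;
    }
};

int main()
{
    Logger logger;
    auto builder = std::unique_ptr<nvinfer1::IBuilder>(
        nvinfer1::createInferBuilder(logger));
    auto network = std::unique_ptr<nvinfer1::INetworkDefinition>(
        builder->createNetworkV2(
            1U << static_cast<uint32_t>(
                nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH)));
    auto parser = std::unique_ptr<nvonnxparser::IParser>(
        nvonnxparser::createParser(*network, logger));

    // "model.onnx" is a placeholder path for the exported PyTorch model.
    if (!parser->parseFromFile(
            "model.onnx",
            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
    {
        std::cerr << "Failed to parse the ONNX file" << std::endl;
        return 1;
    }
    // ... continue with builder config and engine build as in the sample.
    return 0;
}
```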

Thanks.