04_video_dec_trt: How to use a PyTorch model (.pt) in the sample?

Hello,

Only a Caffe model is provided as the example model in 04_video_dec_trt.

How do I use a PyTorch model (.pt)?
Can you show me an example?

Thank you.

Hi,

The backend inference engine is TensorRT.
TensorRT doesn’t support .pt files; please convert the model to ONNX format first.
https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html

After that, replace the Caffe parser with the ONNX parser and it should work.
https://docs.nvidia.com/deeplearning/tensorrt/archives/tensorrt-713/api/c_api/namespacenvonnxparser.html
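If it helps, here is a minimal sketch of that swap in C++, assuming the TensorRT 7.x API. Everything outside the documented TensorRT/ONNX-parser calls (the function name, `logger`, `onnx_path`) is illustrative, not taken from the sample:

```cpp
// Hedged sketch: replacing the Caffe parser with the ONNX parser (TensorRT 7.x).
// Build with -lnvinfer -lnvonnxparser (instead of -lnvcaffe_parser).
#include <NvInfer.h>
#include <NvOnnxParser.h>  // declares the nvonnxparser namespace

nvinfer1::INetworkDefinition* buildFromOnnx(nvinfer1::IBuilder* builder,
                                            nvinfer1::ILogger& logger,
                                            const char* onnx_path)
{
    // ONNX models require an explicit-batch network in TensorRT 7.
    const auto flags = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    nvinfer1::INetworkDefinition* network = builder->createNetworkV2(flags);

    // The ONNX parser takes the place of nvcaffeparser1::ICaffeParser here.
    nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, logger);
    if (!parser->parseFromFile(onnx_path,
            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
    {
        // Parsing failed; inspect parser->getError(i) for details.
        return nullptr;
    }
    return network;
}
```

Note that with ONNX there is only a single model file, so there is no separate deploy/prototxt file as with Caffe.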

Thanks.

1 Like

Hello,

I thought the PyTorch model could be used directly with TensorRT, as follows.

For the Caffe model, I have a .prototxt and a .caffemodel file.

In the case of PyTorch, what are the extensions for the deploy file and the model file?

Can you show me an example?

./video_dec_trt 2 ../../data/Video/sample_outdoor_car_1080p_10fps.h264
../../data/Video/sample_outdoor_car_1080p_10fps.h264 H264
--trt-deployfile ../../data/Model/resnet10/resnet10.prototxt
--trt-modelfile ../../data/Model/resnet10/resnet10.caffemodel
--trt-mode 0

  1. I saw something called TRTorch at GTC 2020. How do I use it? Is there a guide?

Thank you.

Hello,
A21864 TRTorch A PyTorch TorchScript Compiler Targeting NVIDIA GPUs Using TensorRT_1601770370484001NQu2 (1).zip (998.6 KB)
Do you have any comments?

  1. I know there are torch2trt and TRTorch (introduced at GTC 2020; file attached above).

  2. I know that TRTorch has not been released yet; when will it be released?

  3. Are there any plans to support using a PyTorch model directly in the 04_video_dec_trt sample?

torch2trt URL:
https://github.com/NVIDIA-AI-IOT/torch2trt/blob/master/notebooks/image_classification/conversion.ipynb

Thank you.

Hello, @AastaLLL

I changed nvcaffeparser1 to nvonnxparser and recompiled, but the compiler doesn’t seem to recognize the nvonnxparser namespace.


What am I supposed to do?

Thank you.

Hi,

Please check the file below as a sample:

/usr/src/tensorrt/samples/common/sampleEngines.cpp

 case ModelFormat::kONNX:
 {
    using namespace nvonnxparser;
    parser.onnxParser.reset(createParser(network, sample::gLogger.getTRTLogger()));
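For context, a hedged continuation of that excerpt, showing how the parsed network is typically turned into an engine with the TensorRT 7.x API. The file path and workspace size are illustrative, and this is a sketch rather than verbatim sample code:

```cpp
// Sketch continuing the excerpt above (assumes parser, builder, network exist).
if (!parser.onnxParser->parseFromFile("model.onnx",
        static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
{
    // parseFromFile reports parse errors through the logger.
    return nullptr;
}

// Build the engine from the populated network.
nvinfer1::IBuilderConfig* config = builder->createBuilderConfig();
config->setMaxWorkspaceSize(1 << 28);  // 256 MiB scratch space, illustrative value
nvinfer1::ICudaEngine* engine = builder->buildEngineWithConfig(*network, *config);
```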

Thanks.

1 Like

Hello, @AastaLLL

Can I use a TensorFlow model (.pb) directly in the 04_video_dec_trt sample?

Thank you.

Hello, @AastaLLL

Does the 04_video_dec_trt sample only accept an ONNX file as input?

In the sample at /usr/src/tensorrt/samples/python/yolov3_onnx, we know that a serialized engine file ending in .trt is used.

Does 04_video_dec_trt not use TensorRT engine files ending in .trt?

Thank you.