Feeding a '.trt' model built in C++ into the Python TRT API

Description

Hi,

Currently I have a C++ TensorRT pipeline that converts a PyTorch model into a TRT engine. The saved model has the ‘.trt’ extension and can be used to run inference in a C++ environment.

Now I am wondering whether a ‘.trt’ model built in C++ can be fed into the Python TRT API, and if so, how?

Thanks in advance,

Michiel

Environment

TensorRT Version: 7.0.0
GPU Type: 2070 Super
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 7.6
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): /
PyTorch Version (if applicable): 1.6.0
Baremetal or Container (if container which image + tag): /

Hi @michiel,
I think it should work. Just save the serialized engine generated in C++ and deserialize it using the Python API.
Note that the engine will only work on the same GPU on which it was serialized.
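
For reference, a minimal sketch of the deserialization step in Python (the file name `model.trt` is just a placeholder for your own engine path):

```python
import tensorrt as trt

# A logger is required to create the TensorRT runtime
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# "model.trt" is a placeholder for the engine file serialized from C++
with open("model.trt", "rb") as f:
    engine_data = f.read()

# Deserialize the engine and create an execution context for inference
runtime = trt.Runtime(TRT_LOGGER)
engine = runtime.deserialize_cuda_engine(engine_data)
context = engine.create_execution_context()
```

From there, inference works the same as with an engine built in Python: allocate input/output device buffers (e.g. with PyCUDA) and run `context.execute_v2` with the bindings.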

Thanks!

Hi @AakankshaS,

This worked, thanks for the prompt reply!

Cheers,

Michiel