Is there any way to convert a TensorRT file to ONNX?

I am currently working in C++ and want to convert a TensorRT file to ONNX format so I can use it without DeepStream. Is there any possible way to convert a TensorRT model to another format, such as ONNX, that would help me out in C++?

Actually, you can load the TensorRT engine file and run inference without DeepStream. The request to “convert a TensorRT file to ONNX” is unusual. Do you mean “convert ONNX to a TensorRT file”?

Actually, I want to load the TensorRT engine in C++ without DeepStream, but I could not find a way to do it, so I thought maybe there was a way to convert the TensorRT file to ONNX and use the ONNX file in C++. I think that was the wrong idea, though.

Thanks for the reply

As mentioned above, you can load the TensorRT engine in C++ directly without DeepStream.
This is really a TensorRT topic. You can find some useful examples in the TensorRT developer guide and samples.
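
Here is a minimal sketch of the idea, assuming the TensorRT 8.x C++ API. The engine path (`model.engine`), binding names (`input`, `output`), and tensor sizes are placeholders you would replace with your model's actual values:

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <vector>

// Simple logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // Read the serialized engine file (path is a placeholder).
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    // Deserialize the engine and create an execution context.
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // Look up bindings by name; names and sizes below are assumptions,
    // query engine->getBindingDimensions() for the real shapes.
    const int inputIndex  = engine->getBindingIndex("input");
    const int outputIndex = engine->getBindingIndex("output");
    const size_t inputBytes  = 1 * 3 * 224 * 224 * sizeof(float);
    const size_t outputBytes = 1 * 1000 * sizeof(float);

    // Allocate device buffers for each binding.
    void* buffers[2];
    cudaMalloc(&buffers[inputIndex], inputBytes);
    cudaMalloc(&buffers[outputIndex], outputBytes);

    std::vector<float> input(1 * 3 * 224 * 224, 0.f);
    std::vector<float> output(1000, 0.f);

    // Copy input to the GPU, run inference, copy the result back.
    cudaMemcpy(buffers[inputIndex], input.data(), inputBytes, cudaMemcpyHostToDevice);
    context->executeV2(buffers);
    cudaMemcpy(output.data(), buffers[outputIndex], outputBytes, cudaMemcpyDeviceToHost);

    std::cout << "first output value: " << output[0] << std::endl;

    // Clean up (TensorRT 8.x objects can be deleted directly).
    cudaFree(buffers[inputIndex]);
    cudaFree(buffers[outputIndex]);
    delete context;
    delete engine;
    delete runtime;
    return 0;
}
```

The engine file itself is typically built ahead of time, for example with `trtexec --onnx=model.onnx --saveEngine=model.engine`.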

