In TensorRT, is it possible to parse a Caffe model using a binarized deploy prototxt file instead of the plain-text prototxt?

Description

I came across a function “parseBinaryProto” in the documentation, which is used to parse and extract data stored in a binaryproto file. But how can the plain-text deploy prototxt file be converted into a binaryproto file? And once the data in the prototxt has been extracted, which API can be used to parse the model? (The default parse API takes the plain-text prototxt file path as an argument, not the extracted data.) I am not able to find any usage example online. This would be helpful for hiding the model architecture info from the end user. Thank you…
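
For reference, the default parse flow I am referring to looks roughly like this (a minimal sketch using the TensorRT 5.x C++ Caffe parser; file names are placeholders):

```cpp
#include <cstdio>
#include "NvInfer.h"
#include "NvCaffeParser.h"

using namespace nvinfer1;
using namespace nvcaffeparser1;

// Simple logger required by the builder.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::printf("%s\n", msg);
    }
} gLogger;

int main()
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();
    ICaffeParser* parser = createCaffeParser();

    // The default parse() takes the *paths* of the plain-text deploy
    // prototxt and the binary caffemodel, not already-extracted data.
    const IBlobNameToTensor* blobs = parser->parse(
        "deploy.prototxt",   // plain-text network description
        "model.caffemodel",  // binary weights
        *network,
        DataType::kFLOAT);

    // ... mark output tensors, build the engine, etc.

    parser->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}
```

If I read the API reference correctly, there is also an ICaffeParser::parseBuffers() method that takes in-memory buffers instead of file paths, but I could not find a usage example for that either.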

Environment

TensorRT Version: 5.1.5
GPU Type: GeForce GTX
Nvidia Driver Version: 450.119.03
CUDA Version: 10.1
CUDNN Version: 8.0.5.39
Operating System + Version: Ubuntu 18.04

Hi,

Looks like you’re using a very old TensorRT version. We recommend you please use the latest version.
The UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, hence we request you to try the ONNX parser.
Please check the below link for the same.
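
As a rough illustration (not the official sample), the ONNX parser path in recent TensorRT releases looks something like the sketch below, assuming the C++ API; "model.onnx" is a placeholder path:

```cpp
#include <cstdio>
#include <cstdint>
#include "NvInfer.h"
#include "NvOnnxParser.h"

using namespace nvinfer1;

// Simple logger required by the builder and parser.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
            std::printf("%s\n", msg);
    }
} gLogger;

int main()
{
    IBuilder* builder = createInferBuilder(gLogger);
    const std::uint32_t flags =
        1U << static_cast<std::uint32_t>(NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    INetworkDefinition* network = builder->createNetworkV2(flags);
    nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger);

    // parseFromFile reads and parses the ONNX model directly from disk.
    if (!parser->parseFromFile("model.onnx",
            static_cast<int>(ILogger::Severity::kWARNING)))
    {
        std::printf("Failed to parse the ONNX model\n");
        return 1;
    }

    // ... create an IBuilderConfig and build the engine as usual.
    return 0;
}
```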

Thanks!
