TensorRT 3: Can I use `giexec` for engines generated by the TensorFlow UFF converter?


I created a TensorRT engine file "tf_mnist.engine" by following the official example (https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/topics/topics/workflows/tf_to_tensorrt.html).

I am wondering whether I can run the giexec command on this engine (i.e., on engines created by the Python TensorFlow-to-UFF conversion). From experience, I know that giexec works for engines converted from Caffe models.

When I run

giexec --engine=tf_mnist.engine --output=fc2/Relu --batch=1

I get this error:

engine: tf_mnist.engine
output: fc2/Relu
batch: 1
name=data, bindingIndex=-1, buffers.size()=2
giexec: giexec.cpp:201: void createMemory(const nvinfer1::ICudaEngine&, std::vector<void*>&, const string&): Assertion `bindingIndex < buffers.size()' failed.
[1]    705 abort (core dumped)  ~/TensorRT-3.0.2/bin/giexec --engine=tf_mnist.engine --output=fc2/Relu
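If I read the assertion right, giexec defaulted to an input binding named "data" (the Caffe convention, visible in the `name=data` line), and `getBindingIndex("data")` returned -1 because the TF-derived engine has no binding with that name. A plain-Python sketch of that lookup (not the TensorRT API; the binding names below are assumptions for illustration):

```python
# Hypothetical binding names in a TF-derived engine: the input keeps its
# TensorFlow graph name instead of the Caffe-style "data".
bindings = ["Input", "fc2/Relu"]

def get_binding_index(name):
    # Mirrors nvinfer1::ICudaEngine::getBindingIndex(): returns -1 when
    # no binding with that name exists in the engine.
    return bindings.index(name) if name in bindings else -1

print(get_binding_index("data"))      # -1: no Caffe-style "data" input
print(get_binding_index("fc2/Relu"))  # 1: the requested output exists
```

That -1 is then used as an index into `buffers`, which is what trips the `bindingIndex < buffers.size()` assertion in giexec.cpp.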

How can I make giexec look up the correct bindingIndex for this engine's input?

Hey there,

I'm trying to use the giexec tool to generate a GIE engine. If the --engine parameter is used to generate an engine, why does giexec complain with "could not open plan output file Engine could not be created" when that file does not exist? You say that you generated the engine beforehand; is that a prerequisite, then?