Convert a TensorFlow .pb with the Slim interface in it to UFF.

Can I convert a TensorFlow .pb that uses the Slim interface to UFF using convert_to_uff.py?

Hi,

Please load the TensorFlow model with the TF module first and then convert it into the UFF format.

Check here for our supported operations:
https://developer.nvidia.com/compute/machine-learning/tensorrt/secure/3.0/rc1TensorRT3-Release-Notes-RC-pdf

TensorFlow Model Conversion

Thanks.

Hi AastaLLL:
Why not freeze the TensorFlow graph to a .pb and convert it to .uff using the convert-to-uff utility?
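(For context, the freezing step I mean is typically done with TensorFlow's freeze_graph tool. This is only a sketch: the file paths and the output node name below are placeholders for your own model, and flag names may vary slightly across TF 1.x versions.)

```shell
# Hypothetical paths and output node name -- adjust for your model.
python -m tensorflow.python.tools.freeze_graph \
    --input_graph=model.pbtxt \
    --input_checkpoint=model.ckpt \
    --output_node_names=output_node \
    --output_graph=frozen_model.pb
```

This folds the checkpoint weights into the graph definition, producing a single frozen_model.pb that contains both the model structure and the weight values.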

Hi,

Sorry, we are not familiar with the .pb file format.

Does this format also contain the weight values, or only the model definition?
To export a UFF model, we need to load all of the model weights into the workspace.

Thanks.

Hi AastaLLL:
The .pb file contains the weight values.
I have seen this method in your TensorRT 3 User Guide, section 2.3.2.2, "Exporting TensorFlow to a UFF File".

Hi,

Sorry for missing that.
For the protobuf-to-UFF use case, please check our convert-to-uff.py sample.
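A minimal invocation might look like the following sketch. The file names and the output node name are placeholders for your own model, and the exact flag spellings can differ between TensorRT releases:

```shell
# Hypothetical file names and output node -- adjust for your model;
# flag spellings can differ between TensorRT releases.
convert-to-uff tensorflow \
    --input-file frozen_model.pb \
    -O output_node \
    -o model.uff
```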

Thanks.

Where can I find the convert-to-uff.py example? It is not available in the TensorRT3.0 tarball.

Hi,

You can find the .py example in our Python wheel.
Please note that the TensorRT Python API is only available on x86-based machines.

Thanks.

Thanks for the response. Where can I find your python wheel? A link would be great!

Edit: Nevermind. I found it. Thanks.

I would like to use it too. Where did you find it?

When you install the .whl file on an x86 machine, that file is copied into the /usr/local/ folder.

@siddartho9eji did you manage to successfully convert and produce good results from the .pb file?

Thanks!

Hi,

The Python API and UFF parser can be installed via:

$ sudo apt-get install python-libnvinfer-doc
$ sudo apt-get install uff-converter-tf

For more details, please check our TensorRT installation guide.

Thanks.