Although we converted our TensorFlow H5 model to a TensorRT UFF model, we do not have sufficient documentation to run inference with the model on our own. There is an existing tutorial that does this with the MNIST model, as explained in the documentation, but it isn't sufficient. How do I run a custom model with TensorRT in my Python code? Please help me out with this.
Hi,
You can create a TensorRT engine from a UFF model with this API.
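As a minimal sketch, building the engine from a UFF file with the Python UFF parser looks roughly like this. The model path, input/output node names, and input shape below are placeholders; please replace them with the values from your own model:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(uff_path, input_name, input_shape, output_name):
    # Parse the UFF graph and build a CUDA engine from it
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.UffParser() as parser:
        builder.max_workspace_size = 1 << 28   # scratch memory for the builder
        builder.max_batch_size = 1
        parser.register_input(input_name, input_shape)  # e.g. "Input", (3, 300, 300)
        parser.register_output(output_name)             # e.g. "MarkOutput_0"
        parser.parse(uff_path, network)
        return builder.build_cuda_engine(network)

# Placeholder names/shape; use the ones from your converted model
engine = build_engine("model.uff", "Input", (3, 300, 300), "MarkOutput_0")
```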
For more detail, you can check the uff_ssd sample.
Although the model may be different, the inference procedure is roughly the same.
/usr/src/tensorrt/samples/python/uff_ssd/
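The inference step is the same for any engine: allocate host and device buffers, copy the input to the GPU, execute, and copy the result back. A rough sketch with pycuda, assuming a single input and a single output binding (`your_preprocessed_image` is just a placeholder for your own input data):

```python
import numpy as np
import pycuda.driver as cuda
import pycuda.autoinit  # creates the CUDA context

with engine.create_execution_context() as context:
    # Pinned host buffers and device buffers for the two bindings
    h_input = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(0)), dtype=np.float32)
    h_output = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(1)), dtype=np.float32)
    d_input = cuda.mem_alloc(h_input.nbytes)
    d_output = cuda.mem_alloc(h_output.nbytes)
    stream = cuda.Stream()

    np.copyto(h_input, your_preprocessed_image.ravel())  # fill with your input

    cuda.memcpy_htod_async(d_input, h_input, stream)
    context.execute_async(bindings=[int(d_input), int(d_output)],
                          stream_handle=stream.handle)
    cuda.memcpy_dtoh_async(h_output, d_output, stream)
    stream.synchronize()
    # h_output now holds the flattened network output
```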
Thanks.