Consider that I have a model whose layers are all supported by TensorRT. The workflow for deploying this model on the DrivePX2 is:
- Tensorflow model -> UFF model on Host
- Copy UFF file to PX2
- UFF model -> tensorRT engine on DPX2
- Load tensorRT engine using C++ API on DPX2 for inferencing
This is according to
and the sampleUffMNIST TensorRT sample.
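The last two steps above can be sketched roughly as follows with the TensorRT C++ API (a sketch only, not runnable as-is: it assumes TensorRT headers are available on the PX2 and that a file named `mnist.engine` was previously serialized with `engine->serialize()`; device buffer allocation is elided):

```cpp
// Sketch: deserialize a prebuilt TensorRT engine on the PX2 and prepare it
// for inference. "mnist.engine" is an assumed filename.
#include "NvInfer.h"
#include <fstream>
#include <iostream>
#include <vector>

class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cerr << msg << std::endl;
    }
} gLogger;

int main()
{
    // Read the serialized engine that was built on the PX2 from the UFF model.
    std::ifstream file("mnist.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // Inference would then bind device buffers (cudaMalloc'd) and call:
    //   void* bindings[2] = {dInput, dOutput};
    //   context->execute(batchSize, bindings);

    context->destroy();
    engine->destroy();
    runtime->destroy();
    return 0;
}
```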
I now want to introduce dropout layers into this model. The dropout layers will be active during inference, and dropout is currently unsupported by TensorRT.
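For reference, the computation such a custom dropout layer would have to perform at inference time is simple; a minimal CPU sketch is below (the plugin's `enqueue()` would do the same on the GPU). The keep probability, seed, and function name are illustrative assumptions, not from the thread:

```cpp
// Inference-time (inverted) dropout: each activation is kept with probability
// keepProb and scaled by 1/keepProb, so the expected output equals the input.
#include <random>
#include <vector>

std::vector<float> dropout(const std::vector<float>& in, float keepProb,
                           unsigned seed)
{
    std::vector<float> out(in.size());
    std::mt19937 rng(seed);                       // fixed seed for reproducibility
    std::bernoulli_distribution keep(keepProb);   // per-element keep/drop decision
    for (std::size_t i = 0; i < in.size(); ++i)
        out[i] = keep(rng) ? in[i] / keepProb : 0.0f;
    return out;
}
```

With `keepProb == 1.0` the layer is an identity, which is a handy sanity check when first wiring up the plugin.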
What would the workflow be in that case? Can you check the steps below?
- Implement the custom layer plugin in C++ as shown in samplePlugin
- Use createUffParser to parse the UFF file as shown in sampleUffMNIST
- Use parser->setPluginFactory() for custom layers as shown in samplePlugin
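Putting those steps together would look roughly like the sketch below. Note this is an assumption based on the TRT 4-era API (where the UFF parser gained `setPluginFactory()`); the exact signatures may differ between releases, and the plugin method bodies are omitted:

```cpp
// Sketch (signatures approximate): a plugin factory the UFF parser consults
// for layers it cannot parse, e.g. a node whose name contains "dropout".
#include "NvInfer.h"
#include "NvUffParser.h"
#include <cstring>

class DropoutPlugin : public nvinfer1::IPlugin
{
    // getNbOutputs / getOutputDimensions / configure / initialize /
    // terminate / getWorkspaceSize / enqueue / getSerializationSize /
    // serialize ... omitted for brevity
};

class PluginFactory : public nvuffparser::IPluginFactory
{
public:
    bool isPlugin(const char* layerName) override
    {
        return std::strstr(layerName, "dropout") != nullptr;
    }

    nvinfer1::IPlugin* createPlugin(const char* layerName,
                                    const nvinfer1::Weights* weights, int nbWeights,
                                    const nvuffparser::FieldCollection fc) override
    {
        return new DropoutPlugin(/* ... */);
    }
};

// Usage when parsing the UFF file:
//   nvuffparser::IUffParser* parser = nvuffparser::createUffParser();
//   PluginFactory factory;
//   parser->setPluginFactory(&factory);
//   parser->registerInput("input", nvinfer1::DimsCHW(1, 28, 28));
//   parser->parse("model.uff", *network, nvinfer1::DataType::kFLOAT);
```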
There isn't a function to use plugins with the C++ API in TRT 3.0.2 (on the PX2). Has there been an update since then that allows this?
I did not get your question. If you are asking about using a plugin layer in TensorRT, can you please check the samplePlugin sample?
I am looking for a “samplePlugin” equivalent for UFF-based models. In one of the posts, it is mentioned that there would be an SSD sample with UFF in TRT 4 GA; I am asking if that is available yet?
Yes, the SSD sample with UFF is available in TRT 4 GA. Please check https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#uffssd_sample
Okay. Let’s track that in the original topic.
@AastaLLL any progress on either of the bugs?