Hello,
Suppose I have a model in which every layer is supported by TensorRT. The workflow for deploying this model on the DRIVE PX2 is:
1. TensorFlow model -> UFF model on the host
2. Copy the UFF file to the PX2
3. UFF model -> TensorRT engine on the PX2
4. Load the TensorRT engine with the C++ API on the PX2 for inference
This is according to
https://devtalk.nvidia.com/default/topic/1030068/driveworks/-solved-tensorrt3-tensorflow-implementation-in-px2/
http://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#convert_model_tensorflow
and the sampleUffMNIST TensorRT sample.
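For context, the last step of the workflow above (loading a serialized engine on the PX2 and running it) looks roughly like this with the TensorRT 3.x C++ API. This is a sketch, not a definitive implementation: the file name "engine.trt" and the buffer sizes are placeholders, and exact method signatures should be checked against the NvInfer.h shipped with your PDK, since the API changed across TensorRT versions.

```cpp
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>
#include <NvInfer.h>
#include <cuda_runtime_api.h>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO) fprintf(stderr, "%s\n", msg);
    }
};

int main()
{
    Logger logger;

    // Read the engine that was serialized on the PX2 ("engine.trt" is a placeholder).
    std::ifstream file("engine.trt", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // One device buffer per binding; these sizes assume an MNIST-like network
    // (28x28 input, 10 outputs) purely for illustration.
    size_t inputBytes  = 1 * 28 * 28 * sizeof(float);
    size_t outputBytes = 10 * sizeof(float);
    void* bindings[2];
    cudaMalloc(&bindings[0], inputBytes);
    cudaMalloc(&bindings[1], outputBytes);

    // Copy the input to bindings[0] with cudaMemcpy, then run inference:
    context->execute(/*batchSize=*/1, bindings);
    // Copy the result back from bindings[1], then clean up.

    context->destroy();
    engine->destroy();
    runtime->destroy();
    return 0;
}
```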
I now want to introduce dropout layers into this model. The dropout layers will remain active during inference, and dropout is currently unsupported by TensorRT.
What would the workflow be in that case?
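Concretely, the arithmetic such a layer needs at inference time can be sketched in plain C++ (no TensorRT): each element is zeroed with probability (1 - keepProb), and the survivors are scaled by 1/keepProb so the expected value matches the input. The keepProb value and this scaling convention are assumptions about the model; this is the computation a custom plugin would have to implement on the GPU.

```cpp
#include <random>
#include <vector>

// Inference-time dropout: zero each element with probability (1 - keepProb),
// scale survivors by 1/keepProb to preserve the expected activation.
std::vector<float> dropout(const std::vector<float>& in, float keepProb,
                           unsigned seed)
{
    std::mt19937 rng(seed);
    std::bernoulli_distribution keep(keepProb);
    std::vector<float> out(in.size());
    for (size_t i = 0; i < in.size(); ++i)
        out[i] = keep(rng) ? in[i] / keepProb : 0.0f;
    return out;
}
```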
Thanks,
serinvarghese
Dear serinvarghese,
Could you check the steps below:
1. Implement the custom layer plugin in C++, as shown in samplePlugin
2. Use createUffParser to parse the UFF file, as shown in sampleUffMNIST
3. Use parser->setPluginFactory() for the custom layers, as shown in samplePlugin
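Put together, the steps above look roughly like the following sketch, modeled on the samplePlugin / sampleUffMNIST pattern. The method signatures follow the TensorRT 3.x/4.x IPlugin interface as commonly shown in those samples; they changed between TensorRT releases, so verify them against the NvInfer.h / NvUffParser.h on your PDK. The layer-name match ("dropout"), the keepProb value, and the elided CUDA kernel are illustrative assumptions.

```cpp
#include <cstring>
#include <NvInfer.h>
#include <NvUffParser.h>

using namespace nvinfer1;

// Custom layer implementing dropout at inference time.
class DropoutPlugin : public IPlugin
{
public:
    explicit DropoutPlugin(float keepProb) : mKeepProb(keepProb) {}

    int getNbOutputs() const override { return 1; }

    Dims getOutputDimensions(int index, const Dims* inputs, int nbInputDims) override
    {
        return inputs[0];  // dropout preserves the input shape
    }

    void configure(const Dims* inputDims, int nbInputs, const Dims* outputDims,
                   int nbOutputs, int maxBatchSize) override {}

    int initialize() override { return 0; }
    void terminate() override {}
    size_t getWorkspaceSize(int) const override { return 0; }

    int enqueue(int batchSize, const void* const* inputs, void** outputs,
                void* workspace, cudaStream_t stream) override
    {
        // Launch a CUDA kernel here that samples a Bernoulli(keepProb) mask
        // and computes out = in * mask / keepProb (kernel elided).
        return 0;
    }

    size_t getSerializationSize() override { return sizeof(mKeepProb); }
    void serialize(void* buffer) override
    {
        std::memcpy(buffer, &mKeepProb, sizeof(mKeepProb));
    }

private:
    float mKeepProb;
};

// Factory the UFF parser consults for layers it cannot handle itself.
class PluginFactory : public nvuffparser::IPluginFactory
{
public:
    bool isPlugin(const char* name) override
    {
        return std::strstr(name, "dropout") != nullptr;  // assumption: node naming
    }

    IPlugin* createPlugin(const char* layerName, const Weights* weights,
                          int nbWeights) override
    {
        return new DropoutPlugin(0.9f);  // keepProb: illustrative value
    }
};
```

The factory is then registered before parsing, e.g. `parser->setPluginFactory(&factory);` after `createUffParser()`, following the samplePlugin pattern.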
There isn't a function to use plugins with the C++ API in TRT 3.0.2 (on the PX2). Has there been an update since then that adds this?
Dear Dhingratul,
I did not understand your question. If you are asking about using a plugin layer in TensorRT, can you please check the samplePlugin sample?
I am looking for a “samplePlugin” equivalent for UFF-based models. In one of the posts it is mentioned that there would be an SSD sample using UFF in TRT 4 GA. I am asking whether that is available yet.
Hi, Dhingratul
Okay. Let’s track that in the original topic.
Thanks.
@AastaLLL, any progress on either of the bugs?
AastaLLL
September 10, 2018, 11:15am
#10