Writing TensorRT plugins for UffParser

Hi, while learning the TensorRT C++ API I didn’t find any guides for writing custom plugins and CUDA device functions (.cu files) other than the plugin sources in the TensorRT samples and this section of the developer guide: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending.
In particular, I’m interested in how closely user-defined functions should replicate the TensorFlow functions (for example, do I have to completely copy the tensorflow::Scope class to write my own version of

tensorflow::ops::TensorArrayRead(const ::tensorflow::Scope & scope, ::tensorflow::Input handle, ::tensorflow::Input index, ::tensorflow::Input flow_in, DataType dtype)

and which arguments should my function take?).
Perhaps someone can provide other information that could help?

Hi,

Please refer to the samples below for more details:
https://github.com/NVIDIA/TensorRT/tree/release/6.0/samples/opensource/samplePlugin
https://github.com/NVIDIA/TensorRT/tree/release/6.0/samples/opensource/sampleUffPluginV2Ext
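
To give a bit more context: as far as I understand, a UFF plugin never touches TensorFlow’s C++ API, so you don’t need to reproduce tensorflow::Scope or the original op signature. You map the TensorFlow node to a plugin node (e.g. with graphsurgeon) whose op name matches the name your plugin creator registers, and implement the computation inside the plugin itself. Below is a minimal, untested sketch of that plumbing against the TensorRT 6/7 IPluginV2Ext / IPluginCreator API (TensorRT 8 adds noexcept to these signatures). The class and the op name "TensorArrayRead_TRT" are made up for illustration, and enqueue() just copies its input; a real replacement for TensorArrayRead would launch your own CUDA kernel from a .cu file with whatever parameters it actually needs.

```cpp
// Minimal sketch of a UFF-compatible plugin: IPluginV2Ext + IPluginCreator + registration.
// Assumes TensorRT 6/7 headers; names ("TensorArrayRead_TRT") are hypothetical.
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <cstring>
#include <string>

using namespace nvinfer1;

class TensorArrayReadPlugin : public IPluginV2Ext
{
public:
    TensorArrayReadPlugin() {}
    // Deserialization constructor: restore whatever serialize() wrote.
    TensorArrayReadPlugin(const void* data, size_t) { std::memcpy(&mVolume, data, sizeof(mVolume)); }

    // This name/version pair must match what the creator below reports.
    const char* getPluginType() const override { return "TensorArrayRead_TRT"; }
    const char* getPluginVersion() const override { return "1"; }

    int getNbOutputs() const override { return 1; }
    Dims getOutputDimensions(int, const Dims* inputs, int) override { return inputs[0]; }
    DataType getOutputDataType(int, const DataType* inputTypes, int) const override { return inputTypes[0]; }

    // kLINEAR is the plain NCHW layout (called kNCHW in TensorRT 5).
    bool supportsFormat(DataType type, PluginFormat format) const override
    { return type == DataType::kFLOAT && format == PluginFormat::kLINEAR; }

    void configurePlugin(const Dims* inputDims, int, const Dims*, int,
        const DataType*, const DataType*, const bool*, const bool*,
        PluginFormat, int) override
    {
        // Cache the per-batch element count for enqueue().
        mVolume = 1;
        for (int i = 0; i < inputDims[0].nbDims; ++i) mVolume *= inputDims[0].d[i];
    }

    int initialize() override { return 0; }
    void terminate() override {}
    size_t getWorkspaceSize(int) const override { return 0; }

    // The actual device work goes here; a real plugin would launch a kernel from a .cu file.
    int enqueue(int batchSize, const void* const* inputs, void** outputs, void*, cudaStream_t stream) override
    {
        size_t bytes = static_cast<size_t>(batchSize) * mVolume * sizeof(float);
        return cudaMemcpyAsync(outputs[0], inputs[0], bytes, cudaMemcpyDeviceToDevice, stream) == cudaSuccess ? 0 : -1;
    }

    size_t getSerializationSize() const override { return sizeof(mVolume); }
    void serialize(void* buffer) const override { std::memcpy(buffer, &mVolume, sizeof(mVolume)); }

    bool isOutputBroadcastAcrossBatch(int, const bool*, int) const override { return false; }
    bool canBroadcastInputAcrossBatch(int) const override { return false; }

    void destroy() override { delete this; }
    IPluginV2Ext* clone() const override { return new TensorArrayReadPlugin(*this); }

    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    size_t mVolume{0};
    std::string mNamespace;
};

// The creator is what the UFF parser looks up in the plugin registry, keyed by the op name.
class TensorArrayReadPluginCreator : public IPluginCreator
{
public:
    const char* getPluginName() const override { return "TensorArrayRead_TRT"; }
    const char* getPluginVersion() const override { return "1"; }
    const PluginFieldCollection* getFieldNames() override { return &mFC; }
    IPluginV2* createPlugin(const char*, const PluginFieldCollection*) override { return new TensorArrayReadPlugin(); }
    IPluginV2* deserializePlugin(const char*, const void* data, size_t length) override
    { return new TensorArrayReadPlugin(data, length); }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    PluginFieldCollection mFC{};
    std::string mNamespace;
};

// Registers the creator with the global plugin registry at library load time.
REGISTER_TENSORRT_PLUGIN(TensorArrayReadPluginCreator);
```

On the TensorFlow side, the usual approach (shown in the UFF samples) is to replace the original node with a graphsurgeon plugin node whose op field equals the plugin name above before converting the frozen graph to UFF; the parser then finds the registered creator by that name.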

Thanks