Plugin creation for TensorRT

I am creating a plugin for TensorFlow's SparseToDense operation.

The TensorFlow API has the following signature:

void SparseToDense(const std::vector<std::vector<int>>& indices, const float* values, float default_value, bool value_is_scalar, const RuntimeShape& unextended_output_shape, float* output_data)
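For reference, the semantics of that signature can be sketched in plain C++. This is a CPU-only illustration of what SparseToDense computes, not the actual TensorFlow kernel: the dense output is filled with default_value, then each row of indices writes the corresponding value (or the single scalar value when value_is_scalar is true). The row-major flattening is an assumption for the sketch; output_shape stands in for RuntimeShape.

```cpp
#include <cstddef>
#include <vector>

// CPU-only sketch of SparseToDense semantics (illustration, not the
// real TensorFlow implementation). Each row of `indices` addresses one
// element of the dense output; every other element is `default_value`.
void SparseToDenseSketch(const std::vector<std::vector<int>>& indices,
                         const float* values, float default_value,
                         bool value_is_scalar,
                         const std::vector<int>& output_shape,
                         float* output_data) {
    // Total number of dense elements.
    std::size_t total = 1;
    for (int d : output_shape) total *= static_cast<std::size_t>(d);
    for (std::size_t i = 0; i < total; ++i) output_data[i] = default_value;

    for (std::size_t i = 0; i < indices.size(); ++i) {
        // Row-major flattening of the multi-dimensional index.
        std::size_t flat = 0;
        for (std::size_t d = 0; d < indices[i].size(); ++d)
            flat = flat * static_cast<std::size_t>(output_shape[d])
                   + static_cast<std::size_t>(indices[i][d]);
        // A scalar `values` broadcasts to every sparse position.
        output_data[flat] = value_is_scalar ? values[0] : values[i];
    }
}
```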

When viewed on the ONNX graph, SparseToDense has three inputs.

In the plugin's enqueue method, the arguments are as follows:

int enqueue(int batchSize, const void* const* inputs, void** outputs, void* workspace, cudaStream_t stream) override

How do we know which tensors correspond to inputs[0], inputs[1], and inputs[2]? Can I just assume that inputs 0, 1, and 2 are ordered left to right as shown in the graph?
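To make the question concrete, here is a hypothetical sketch of how the three pointers would be interpreted if the inputs do follow the node's order in the graph, i.e. (indices, values, default_value). Everything here is an assumption for illustration: the real enqueue also takes a workspace and a cudaStream_t and would launch a CUDA kernel, while this sketch uses a simplified signature with explicit sizes and runs on the host; the element types (int32 flat indices, float values) are also assumed.

```cpp
#include <cstdint>

// Hypothetical sketch only: assumes inputs[] follows the node's input
// order in the graph, with flat int32 indices and float values. The
// real enqueue would also take a workspace and a cudaStream_t and do
// this work in a CUDA kernel; sizes are passed explicitly here so the
// sketch is self-contained.
int EnqueueSketch(int numSparse, int numDense,
                  const void* const* inputs, void** outputs) {
    const int32_t* indices    = static_cast<const int32_t*>(inputs[0]); // 1st graph input
    const float*   values     = static_cast<const float*>(inputs[1]);   // 2nd graph input
    const float*   defaultVal = static_cast<const float*>(inputs[2]);   // 3rd graph input
    float* out = static_cast<float*>(outputs[0]);

    for (int i = 0; i < numDense; ++i) out[i] = defaultVal[0];
    for (int i = 0; i < numSparse; ++i) out[indices[i]] = values[i];
    return 0;  // success
}
```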

Hi @edit_or,
Your query has been noted, please allow me some time to check on this.

Thanks!

I looked at the input order in the node specs in the viewer; the order is shown in the image. I just follow that order in my implementation.

Any update on this?

Hi @edit_or,
I am checking on this.
Will revert soon.
Thank you for your patience.

Hi @edit_or,
You can always query the name of a tensor.
You can refer to the link below.

Thanks!

Yes, thanks for the reply. I have been trying to find out how to query the tensor name in TensorRT,
but I only found this in the developer guide:

Inputs and output tensors must also be given names (using ITensor::setName()). At inference time, you will supply the engine with an array of pointers to input and output buffers. In order to determine in which order the engine expects these pointers, you can query using the tensor names.

I would be grateful if you could show me how to perform this query in TensorRT.

Hi @edit_or,
You can refer to the link below:
https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#create_network_c


Thanks!