Does trtexec support multiple groups of inputs?

Description

Hi everyone,
About trtexec, I found that the param '--exportOutput' can export an output array:

[
  { "name" : "582"
  , "dimensions" : "32x1000"
  , "values" : [xxxxx]
  }
]
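
For reference, here is a small Python sketch of how such an exported array could be read back; the file name output.json and the flat "values" list are assumptions based on the snippet above:

import json
import numpy as np

# Parse the array written by --exportOutput (file name is an assumption)
with open("output.json") as f:
    outputs = json.load(f)

for out in outputs:
    # "dimensions" is a string like "32x1000"; "values" is assumed to be flat
    dims = [int(d) for d in out["dimensions"].split("x")]
    values = np.array(out["values"], dtype=np.float32).reshape(dims)
    print(out["name"], values.shape)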

So I want to know whether trtexec supports multiple groups of inputs or not. If it does, can anyone show a sample usage?

Thank you.

Hi,
Please refer to the below link for the sample guide.

Refer to the installation steps in the link in case you are missing anything.

However, the suggested approach is to use the TRT NGC containers to avoid any system-dependency-related issues.

To run the Python samples, make sure the TRT Python packages are installed while using the NGC container:
/opt/tensorrt/python/python_setup.sh
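
For reference, a typical way to start the TensorRT NGC container and run that setup script might look like this; the release tag 23.08-py3 is an assumption and should be replaced with the version you need:

# Start the TensorRT NGC container (the tag is just an example)
docker run --gpus all -it --rm nvcr.io/nvidia/tensorrt:23.08-py3
# Inside the container, install the TRT Python packages
/opt/tensorrt/python/python_setup.sh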

If you are trying to run a custom model, please share your model and script with us so that we can assist you better.
Thanks!

Thanks for the reply.
I have read many pages about this problem, but I could not even find the flag in these guides:

The most detailed usage I found is "How can I use trtexec loadInputs" · Issue #850 · NVIDIA/TensorRT · GitHub.

So if trtexec really supports this, can you show me a sample directly?

Thanks.

Hi,

Hope the following may help you.
For example, if the input is an image, you could use a Python script like this:

import PIL.Image
import numpy as np

# Load the image and resize it to the spatial size the network expects
im = PIL.Image.open("input_image.jpg").resize((512, 512))
# Cast to fp32 and write the raw buffer to disk
data = np.asarray(im, dtype=np.float32)
data.tofile("input_tensor.dat")

This will "convert" an image to that .dat file, which is basically just a raw binary buffer of datatype fp32.

Or, if it is not an image, whatever other data source you use, just load it with numpy, cast it to the datatype and shape that TensorRT expects as input (usually, but not always, float32), and write it out with numpy's .tofile() function as above; see the sketch below.
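
A minimal sketch for the non-image case, assuming a hypothetical input shape of 1x3x224x224 and the file name used below; match whatever your network actually expects:

import numpy as np

# Build or load your data, then cast it to the expected dtype and shape.
# The shape here is an assumption; match your network's input binding.
data = np.random.rand(1, 3, 224, 224).astype(np.float32)
data.tofile("input_tensor_1.dat")

Then in trtexec, you can load it like this: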

trtexec … --loadInputs='input_tensor_1:input_tensor_1.dat,input_tensor_2:input_tensor_2.dat'
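
Each name:file pair in --loadInputs corresponds to one input tensor of the network, so multiple inputs are just multiple comma-separated pairs. Putting it together, a complete invocation could look like the following sketch; model.onnx, the tensor names, and output.json are placeholders for your own files:

trtexec --onnx=model.onnx \
        --loadInputs='input_tensor_1:input_tensor_1.dat,input_tensor_2:input_tensor_2.dat' \
        --exportOutput=output.json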

If this does not answer your query, could you please give more details?

Thank you.