trtexec: exporting input tensors to JSON

I am struggling with the TensorRT C++ API: I am trying to pass a multi-dimensional vector into my deserialized inference engine.
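
For reference, here is a rough sketch of the pattern I am using (names, shapes, and the binding indices 0/1 are placeholders; they assume a single-input, single-output engine):

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <vector>

// A std::vector<std::vector<float>> is NOT contiguous in memory, so it
// cannot be copied to the GPU with a single cudaMemcpy; flatten it first.
std::vector<float> flatten(const std::vector<std::vector<float>>& input)
{
    std::vector<float> flat;
    for (const auto& row : input)
        flat.insert(flat.end(), row.begin(), row.end());
    return flat;
}

// Copy host data in, run synchronous inference, copy the result back.
// `context` is an IExecutionContext created from the deserialized engine.
bool infer(nvinfer1::IExecutionContext& context,
           const std::vector<std::vector<float>>& hostInput,
           std::vector<float>& hostOutput)
{
    const std::vector<float> flat = flatten(hostInput);

    void* deviceInput = nullptr;
    void* deviceOutput = nullptr;
    cudaMalloc(&deviceInput, flat.size() * sizeof(float));
    cudaMalloc(&deviceOutput, hostOutput.size() * sizeof(float));

    cudaMemcpy(deviceInput, flat.data(), flat.size() * sizeof(float),
               cudaMemcpyHostToDevice);

    // Binding order follows the engine's binding indices (input = 0,
    // output = 1 here, which is an assumption for this sketch).
    void* bindings[] = {deviceInput, deviceOutput};
    const bool ok = context.executeV2(bindings);

    cudaMemcpy(hostOutput.data(), deviceOutput,
               hostOutput.size() * sizeof(float), cudaMemcpyDeviceToHost);

    cudaFree(deviceInput);
    cudaFree(deviceOutput);
    return ok;
}
```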

The data seems to pass in; however, no inference occurs. I generated the engine from an ONNX model using trtexec, and I want to see what the INPUT tensors used during inference are so I can test against them, since I can only export the OUTPUT tensor to a file.

Is there a way to dump the input tensors that trtexec uses for its benchmarking tests? Or can someone point me to the place in the trtexec source where those inputs are generated?
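
My trtexec invocations look roughly like this (file names are placeholders). `--exportOutput` gives me the output tensors as JSON, but I have not found an equivalent option for the inputs:

```
trtexec --onnx=model.onnx --saveEngine=model.engine
trtexec --loadEngine=model.engine --exportOutput=output.json
```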

Any advice is appreciated,

Charles