Unable to run inference on multi-input models using TRT


While doing inference on a multi-input model using the snippet below:

saved_model_dir_trt =
root = tf.saved_model.load(saved_model_dir_trt)

where load_test_data is the function that returns the list of preprocessed images to be fed into the trained model.
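As a minimal sketch of how a multi-input TF-TRT SavedModel can be called: the converted model is loaded with tf.saved_model.load and its "serving_default" signature is invoked with one keyword argument per named input. The input names ("input_1", "input_2"), the image shape, and the helper names below are assumptions for illustration — the real names can be read from the signature's structured_input_signature.

```python
import numpy as np

def make_feed_dict(input_names, arrays):
    """Pair each signature input name with its tensor, in order.

    Hypothetical helper; one array is required per named model input.
    """
    if len(input_names) != len(arrays):
        raise ValueError("one array per model input is required")
    return dict(zip(input_names, arrays))

def run_trt_inference(saved_model_dir, feed):
    # Deferred import so the helper above works without TensorFlow installed.
    import tensorflow as tf
    root = tf.saved_model.load(saved_model_dir)   # TF-TRT converted SavedModel
    infer = root.signatures["serving_default"]    # default serving signature
    # Multi-input models must be called with keyword arguments,
    # one per named input of the signature:
    return infer(**{k: tf.constant(v) for k, v in feed.items()})

# Usage sketch (input names and shapes are assumptions -- confirm them via
# root.signatures["serving_default"].structured_input_signature):
feed = make_feed_dict(
    ["input_1", "input_2"],
    [np.zeros((1, 224, 224, 3), np.float32),
     np.zeros((1, 224, 224, 3), np.float32)],
)
# outputs = run_trt_inference(saved_model_dir_trt, feed)
```

Passing the inputs positionally is a common source of errors here; keying them by signature input name avoids the ambiguity.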


TensorRT Version: 7.1.3
GPU Type:
Nvidia Driver Version: 460
CUDA Version: 11.2
CUDNN Version: 8.0.2
Operating System + Version: Ubuntu 18
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): 2.4.0

The link below might be useful for you.
For multi-threading/streaming, we suggest using DeepStream or Triton.
For more details, we recommend raising the query on the DeepStream or Triton forum.