DeepStream - Dual Model Support

Hello,

I have an AI pipeline that uses two ML models, where the output of the first model acts as input to the second model.

I would like to know if this can be supported in DeepStream. Can I put both models into DeepStream on a Jetson device and have the system automatically take care of both models?

Please help.

Amit

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi,
We would need your help to provide the information requested above. Thanks.

Hello,

Details below.

• Hardware Platform (Jetson / GPU)
Jetson Nano

• DeepStream Version
Version 5 and above

• JetPack Version (valid for Jetson only)
Version 4.4

• TensorRT Version

• NVIDIA GPU Driver Version (valid for GPU only)
Jetson Nano

• Issue Type( questions, new requirements, bugs)
Question

• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
Use two ML models where the output of the first model is the input to the second model.

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
Need to know how to run two ML models sequentially on an edge device (Jetson Nano), where the output of the first model is the input to the second model.

Hi,
We have a demonstration of doing detection with the first model and sending the result to a second model for recognition. Please refer to the architecture of deepstream-app:
DeepStream Reference Application - deepstream-app — DeepStream 6.1.1 Release documentation

Hello,

Thank you for the information. You have provided me with the DeepStream documentation.
You mentioned in your reply that there is a demonstration of “doing detection in the first model and sending the result to the second model”. Can you please share the details of that demonstration as well?

Hi,
There is a sample config file, but it is tuned for Xavier platforms / desktop GPUs:

/opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app/source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt

For Jetson Nano, these models may be too heavy. You may replace them with lighter models and give it a try.
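For orientation, the cascading in a deepstream-app config generally looks like the sketch below. The `[primary-gie]`/`[secondary-gie0]` group names and the `gie-unique-id`/`operate-on-gie-id` keys are standard deepstream-app config keys; the config file names here are placeholders for your own model configs, not the files from the sample above:

```ini
# Sketch only: first model runs as the primary inference engine (e.g. a detector).
[primary-gie]
enable=1
# Unique id so secondary models can refer to this engine's output
gie-unique-id=1
# Placeholder path to the nvinfer config for your first model
config-file=pgie_detector_config.txt

# Second model runs on the objects produced by the first model
# (e.g. recognition/classification on each detected object).
[secondary-gie0]
enable=1
gie-unique-id=2
# Operate on the output of the engine with gie-unique-id=1
operate-on-gie-id=1
# Placeholder path to the nvinfer config for your second model
config-file=sgie_classifier_config.txt
```

In the per-model nvinfer config, the second model is typically set to secondary mode (`process-mode=2`) so it runs on the detected objects rather than on full frames.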