Multi-model inference: no stream is pushed until these logs are printed

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 6.3
• JetPack Version (valid for Jetson only): 5.1.2
• TensorRT Version: 8.4.*

When I run the multi-model inference example code, the following content is printed after the two models are loaded. Until it is printed, the stream is not pushed. Why is this, and how can I solve it in code? I am pushing an RTMP stream, and the elements after the OSD are linked as nvvideoconvert -> nvv4l2h264enc -> h264parse -> flvmux -> RTMP sink.

Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 4
===== NvVideo: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
NVMEDIA: Need to set EMC bandwidth : 188000
NVMEDIA: Need to set EMC bandwidth : 188000
NvVideo: bBlitMode is set to TRUE
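
For reference, the output branch described above corresponds roughly to this gst-launch-style sketch (the upstream elements are elided, the RTMP URL is a placeholder, and rtmpsink is assumed as the final element):

```
gst-launch-1.0 ... ! nvdsosd ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! \
    flvmux ! rtmpsink location="rtmp://<server>/<app>/<stream-key>"
```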

These are the initialization logs of the hardware modules. It is normal for the stream to be pushed only after they are printed.

Can I explicitly trigger this initialization in my program? When running multi-threaded, the initialization sometimes fails and nothing is printed, which causes my model inference code to malfunction.

No, and that’s not the cause of your problem. Please attach your whole pipeline and logs first; we can start by trying to locate the root cause of your problem.


How can I get the whole logs? Also, I added

GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pipeline->pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "pipeline");

in my code. Why don't I see the generated .dot file?
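
(As a side note on the .dot file: GStreamer only writes the dump when the GST_DEBUG_DUMP_DOT_DIR environment variable points to an existing directory before the application starts. A minimal sketch, assuming /tmp/pipeline-dots as an arbitrary dump directory:)

```shell
# GStreamer writes .dot dumps only if this variable is set before the app
# starts, and the directory must already exist. /tmp/pipeline-dots is an
# arbitrary choice for this example.
export GST_DEBUG_DUMP_DOT_DIR=/tmp/pipeline-dots
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"
# Run your app, then render the dump with Graphviz (if installed), e.g.:
#   dot -Tpng "$GST_DEBUG_DUMP_DOT_DIR"/pipeline.dot -o pipeline.png
```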

You can add GST_DEBUG=4 in front of your command.

Please refer to our FAQ.

You can add 2>&1 | tee log.txt after your command; the log will be recorded in the log.txt file.

<your_command> 2>&1 | tee log.txt
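
Combining this with the GST_DEBUG suggestion above, a single invocation might look like this (the command is still a placeholder):

```
GST_DEBUG=4 <your_command> 2>&1 | tee log.txt
```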

Is there any error in the pipeline?

No. But one branch seems superfluous. There’s a branch where you linked nvstreammux directly to metamux. This is unnecessary.


Thank you for your reply; it helps me a lot. One last question: when writing pipeline-related code, should we use bin linking or element linking for a shorter linking time?

There is not much difference between them. If you use a bin, it just encapsulates the elements' links.

OK, no more questions. Thank you so much!