Nvstreammux freezes pipeline when using (batch-size > 1)

Hi,

I’m trying to run a pipeline that uses “nvstreammux” with the following config:
g_object_set(G_OBJECT(data.nvstreammux), "live-source", 1, "width", 945, "height", 540, "batch-size", 1, "gpu-id", 0, "nvbuf-memory-type", 0, "num-surfaces-per-frame", 1, NULL);
The pipeline works fine with (batch-size = 1). However, when switching to (batch-size > 1), the pipeline freezes immediately after going into the “playing” state.

The pipeline was implemented programmatically in C++ (and not by using gst-launch-1.0). These are the pipeline components and order:
rtsp_source → rtp_to_h264 → h264_parse → h264_decode → videoflip → videoconvert → videoscale → queue → nvvideoconvert → capsfilter → nvstreammux → nvvideoconvert → queue → yoloplugin → fakesink
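For context, the capsfilter output is linked to the mux roughly like this (a simplified sketch, not the exact code; the variable and pad names are illustrative):

/* Simplified sketch: request a sink pad on nvstreammux and link the
 * capsfilter's src pad to it (error handling trimmed). */
GstPad *mux_sink_pad = gst_element_get_request_pad(data.nvstreammux, "sink_0");
GstPad *caps_src_pad = gst_element_get_static_pad(data.capsfilter, "src");
if (gst_pad_link(caps_src_pad, mux_sink_pad) != GST_PAD_LINK_OK) {
    g_printerr("Failed to link capsfilter to nvstreammux sink_0\n");
}
gst_object_unref(caps_src_pad);
gst_object_unref(mux_sink_pad);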

I’ve tried removing all components after the nvstreammux (except for a final appsink block), but the pipeline still seems to freeze.

Any ideas how to solve this?

Thanks,
Yoad

Moving to the DeepStream SDK forum for resolution.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details needed to reproduce the issue.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

I’ve tried the following pipeline, and it works well.

gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=rtsp://10.19.225.59/media/video1 ! rtph264depay ! h264parse ! queue ! nvv4l2decoder ! nvvideoconvert ! queue ! mux.sink_0 nvstreammux name=mux batch-size=2 width=800 height=640 ! nvmultistreamtiler ! queue ! nvvideoconvert ! queue ! fakesink
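For comparison with your C++ code, the nvstreammux part of that command line would look roughly like this (a minimal sketch that only sets the mux properties used above; the variable name is illustrative):

/* Sketch of the nvstreammux configuration from the gst-launch line above:
 * batch-size=2, width=800, height=640; other properties left at defaults. */
GstElement *mux = gst_element_factory_make("nvstreammux", "mux");
g_object_set(G_OBJECT(mux), "batch-size", 2, "width", 800, "height", 640, NULL);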

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.