Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 7.0
• TensorRT Version: 8.6.1.6
• NVIDIA GPU Driver Version (valid for GPU only): 555.97
• Issue Type (questions, new requirements, bugs): bugs
• How to reproduce the issue? (This is for bugs. Including which sample app is used, the configuration file contents, the command line used, and other details for reproducing)
• Requirement details (This is for new requirements. Including the module name — for which plugin or for which sample application — and the function description)
When I run deepstream-test5-app with NEW_NVSTREAMMUX=yes and RTSP as the source, I cannot play the saved video.
It seems that filesink cannot finalize the video file properly when it does not receive an EOS event.
Checking the source of gst_nvstreamdemux_sink_event in /opt/nvidia/deepstream/deepstream-7.0/sources/gst-plugins/gst-nvmultistream2/gstnvstreamdemux.cpp,
there is a handler for GST_NVEVENT_STREAM_EOS,
but no handler for GST_EVENT_EOS.
Could you provide a patch to fix this bug?
For reference, when I add a GST_EVENT_EOS handler to nvstreamdemux
that sends an EOS to each source pad (one per source, e.g., 2 EOS events for 2 sources),
I can play the saved video.
Thank you.
With DS 7.1, the video file is saved correctly.
The change to destroy_pipeline in /opt/nvidia/deepstream/deepstream-7.1/sources/apps/sample_apps/deepstream-app/deepstream_app.c appears to be what makes the difference.
If the same modification is made in the DS 7.0 source code, the video is saved successfully.
So it seems that the following change in DS 7.1 solved this problem. Right?
--- deepstream-app/deepstream_app.c 2024-04-29 08:25:37.000000000 +0900
+++ deepstream_app.c 2025-06-09 20:29:08.000000000 +0900
@@ -1894,36 +1894,8 @@
if (!appCtx)
return;
- if (appCtx->pipeline.demuxer) {
- GstPad *gstpad =
- gst_element_get_static_pad (appCtx->pipeline.demuxer, "sink");
- gst_pad_send_event (gstpad, gst_event_new_eos ());
- gst_object_unref (gstpad);
- } else if (appCtx->pipeline.multi_src_bin.streammux) {
- gchar pad_name[16];
- for (i = 0; i < config->num_source_sub_bins; i++) {
- GstPad *gstpad = NULL;
- g_snprintf (pad_name, 16, "sink_%d", i);
- gstpad =
- gst_element_get_static_pad (appCtx->pipeline.multi_src_bin.streammux,
- pad_name);
- if(gstpad) {
- /** When using nvmultiurisrcbin, gstpad will be NULL
- * EOS for the pad on pipeline teardown
- * is auto handled within nvmultiurisrcbin */
- gst_pad_send_event (gstpad, gst_event_new_eos ());
- gst_object_unref (gstpad);
- }
- }
- } else if (appCtx->pipeline.instance_bins[0].sink_bin.bin) {
- GstPad *gstpad =
- gst_element_get_static_pad (appCtx->pipeline.instance_bins[0].sink_bin.
- bin, "sink");
- gst_pad_send_event (gstpad, gst_event_new_eos ());
- gst_object_unref (gstpad);
- }
-
- g_usleep (100000);
+ gst_element_send_event(appCtx->pipeline.pipeline, gst_event_new_eos());
+ sleep (1);
g_mutex_lock (&appCtx->app_lock);
if (appCtx->pipeline.pipeline) {
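The DS 7.1 approach in the diff above can be sketched as a standalone program. This is an illustrative stand-in, not deepstream-app code: the pipeline here is a dummy (in deepstream-app it would be appCtx->pipeline.pipeline), and instead of the fixed sleep(1) in the patch, it waits for the EOS message on the bus, which is a common way to know all sinks have drained:

```c
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;
  GstMessageType mtype = GST_MESSAGE_UNKNOWN;

  gst_init (&argc, &argv);

  /* Stand-in pipeline; in deepstream-app this is appCtx->pipeline.pipeline. */
  pipeline = gst_parse_launch ("videotestsrc is-live=true ! fakesink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run briefly, then trigger teardown by sending one EOS to the whole
   * pipeline; the bin forwards it to the source elements, which drain it
   * downstream through every branch. */
  g_usleep (G_USEC_PER_SEC / 2);
  gst_element_send_event (pipeline, gst_event_new_eos ());

  /* Instead of a fixed sleep, block until the EOS message reaches the
   * bus, i.e. all sinks (such as filesink) have finished their output. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, 5 * GST_SECOND,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg) {
    mtype = GST_MESSAGE_TYPE (msg);
    gst_message_unref (msg);
  }
  g_print ("teardown message: %s\n", gst_message_type_get_name (mtype));

  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}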
You can port this patch from DS 7.1 to DS 7.0. nvstreamdemux is open source, so if nvstreamdemux affects your pipeline, you can migrate that patch to your target version as well.