nvvideoconvert crashes on RTSP input + src-crop=x:y:w:h pipeline

No, it still crashes the same way with your command on my GTX 1050 Ti.

gdb output

Thread 6 "nvv4l2decoder0:" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7fffe0c4b700 (LWP 11161)]
0x00007ffff47c93ca in gst_nvvideoconvert_transform () from /usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libgstnvvideoconvert.so

I use the same GStreamer version

$ gst-inspect-1.0 --version
gst-inspect-1.0 version 1.14.5
GStreamer 1.14.5

Hi,
Please upgrade to DS4.0.1 and try again.
We have verified it on Jetson Nano.

Ok, I’ll try 4.0.1 and get back with results.

Thank you very much for your support.

The Docker image 4.0.1-19.09-devel changes nothing in my situation - it still crashes in the same way, on both the GTX 1050 Ti and the Tesla T4.

Hi shawn,
Please check if you have installed the components:
https://docs.nvidia.com/metropolis/deepstream/dev-guide/index.html#page/DeepStream_Development_Guide%2Fdeepstream_quick_start.html%23wwpID0E0JD0HA
The components must be the listed versions or later, or some functions may not work properly.

We have verified the link from pavel.shvetsov on Tesla P4:

rtsp://freja.hiof.no:1935/rtplive/definst/hessdalen03.stream

I’m confused - why would DeepStream not be installed in the DeepStream container I downloaded from you? My application works fine IF the following things are NOT both true:

I am trying to resize/crop an image on the card
AND
I force the rtsp stream to use TCP instead of UDP.

  • If I use software decoding, everything works as expected;
  • If I can use UDP for the return stream, everything works as expected.

I have no idea what resizing and TCP could possibly have to do with each other, but I didn’t write the code. Missing/unexpected metadata maybe?

Hi,
We run the default test-mp4 example to launch an RTSP server.
https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-mp4.c

Could you share a patch to modify it to run over TCP? Or any other method to launch an RTSP server over TCP?
Please assist us to reproduce the failure. Thanks.

Hello,

I can confirm the same crash with DeepStream 4.0.1 on Jetson Nano. Nothing changed.

The pipeline crashes both with my own cameras and with the public stream (rtsp://freja.hiof.no:1935/rtplive/definst/hessdalen03.stream), in both cases over TCP connections (I didn’t try UDP yet).

I’ll also try to assist with test-mp4.c program.

Hi Pavel,
Did you upgrade the system to r32.2.1 and then install DS4.0.1?

Yes, I did exactly that.

UPDATE

To be precise

  1. I've flashed new image to SD card from https://developer.nvidia.com/jetson-nano-sd-card-image-r3221
  2. I've installed DS4.0.1 from https://developer.nvidia.com/deepstream-401-jetson-deb

You do not need to build anything; you simply have to force it to use TCP:

This segfaults (force tcp, resize)
GST_DEBUG=3 gst-launch-1.0 rtspsrc user-id="user" user-pw="passwd" protocols=tcp location="rtsp://a.b.c.d/axis-media/media.amp" ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! 'video/x-raw(memory:NVMM),width=320,height=200' ! nvvideoconvert ! fakesink

This does not segfault (force udp, resize)
GST_DEBUG=3 gst-launch-1.0 rtspsrc user-id="user" user-pw="passwd" protocols=udp location="rtsp://a.b.c.d/axis-media/media.amp" ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! 'video/x-raw(memory:NVMM),width=320,height=200' ! nvvideoconvert ! fakesink

This does not segfault (force tcp, no resize)
GST_DEBUG=3 gst-launch-1.0 rtspsrc user-id="user" user-pw="passwd" protocols=tcp location="rtsp://a.b.c.d/axis-media/media.amp" ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! 'video/x-raw(memory:NVMM)' ! nvvideoconvert ! fakesink

The problem is weird and intermittent because the default protocol ordering tries TCP only as a last resort, so the fallback never happens if you are “network close” to the camera. If you have two NAT hops, though (for example), UDP will time out after 5 seconds and the pipeline will then segfault when it falls back to TCP.

Hi,

Here are two ways for you to reproduce the TCP-transport issue with test-mp4.c program from your link:

1. Actually, this program supports both UDP and TCP transports out of the box. As shawn.mitchell said, you have to force GStreamer to use the TCP transport; here is how:

Run test-mp4 server (only MP4 containers are supported)

$ ./test-mp4 input.mp4
stream ready at rtsp://127.0.0.1:8554/test

Try to connect to RTSP stream with GStreamer pipeline

// UDP
// Works with or without "src-crop=0:0:320:200"
gst-launch-1.0 uridecodebin uri=rtsp://localhost:8554/test ! nvvideoconvert src-crop=0:0:320:200 ! nveglglessink

// TCP - notice the "t" in uri, it's a GStreamer feature to force TCP transport usage (rtsp vs. rtspt)
// Works without "src-crop=0:0:320:200"
// Crashes with "src-crop=0:0:320:200" 
gst-launch-1.0 uridecodebin uri=rtspt://localhost:8554/test ! nvvideoconvert src-crop=0:0:320:200 ! nveglglessink
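The rtsp-to-rtspt scheme switch is easy to apply programmatically as well. Here is a minimal sketch in C using only the standard library (the helper name force_tcp_uri is hypothetical, not part of any GStreamer API):

```c
#include <stdio.h>
#include <string.h>

/* Rewrite an rtsp:// URI as rtspt:// so that uridecodebin/rtspsrc
 * uses TCP-interleaved transport. Returns 0 on success, -1 if the
 * URI does not start with "rtsp://" or the buffer is too small. */
static int
force_tcp_uri (const char *uri, char *out, size_t out_len)
{
  const char *prefix = "rtsp://";
  size_t plen = strlen (prefix);

  if (strncmp (uri, prefix, plen) != 0)
    return -1;                  /* not plain rtsp://, leave it alone */
  if (snprintf (out, out_len, "rtspt://%s", uri + plen) >= (int) out_len)
    return -1;                  /* output buffer too small */
  return 0;
}
```

For example, force_tcp_uri ("rtsp://localhost:8554/test", buf, sizeof buf) produces "rtspt://localhost:8554/test", which can then be passed to uridecodebin’s uri property.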

2. If you want to alter the test-mp4 program’s behavior to use only the TCP transport for streaming, add one line to the source code and recompile the program.
Insert line

gst_rtsp_media_factory_set_protocols(factory, GST_RTSP_LOWER_TRANS_TCP);

immediately after

factory = gst_rtsp_media_factory_new (); // line 157

https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-mp4.c#L157

Like that

factory = gst_rtsp_media_factory_new ();
/* Set only TCP transport for stream ========================================== */
gst_rtsp_media_factory_set_protocols(factory, GST_RTSP_LOWER_TRANS_TCP);
/* ============================================================================ */
gst_rtsp_media_factory_set_launch (factory, str);

and recompile.

Next, run the new test-mp4 as always

$ ./test-mp4 input.mp4
stream ready at rtsp://127.0.0.1:8554/test

Now the GStreamer pipeline will crash every time you try to crop or resize

// Crash
gst-launch-1.0 uridecodebin uri=rtsp://localhost:8554/test ! nvvideoconvert src-crop=0:0:320:200 ! nveglglessink

// Crash
gst-launch-1.0 uridecodebin uri=rtsp://localhost:8554/test ! nvvideoconvert ! 'video/x-raw(memory:NVMM),width=320,height=200' ! nveglglessink

// Crash
gst-launch-1.0 uridecodebin uri=rtsp://localhost:8554/test ! nvvideoconvert src-crop=0:0:320:200 ! 'video/x-raw(memory:NVMM),width=320,height=200' ! nveglglessink

// No crash
gst-launch-1.0 uridecodebin uri=rtsp://localhost:8554/test ! nvvideoconvert ! nveglglessink
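For reference, src-crop takes a single "x:y:w:h" string (left offset, top offset, width, height), as in the title of this thread. A small sketch of building that string from integers, again stdlib-only (make_src_crop is a hypothetical helper, not a DeepStream API):

```c
#include <stdio.h>
#include <string.h>

/* Format a crop rectangle as the "x:y:w:h" string expected by the
 * nvvideoconvert src-crop property. Returns 0 on success, -1 if the
 * rectangle is degenerate or the buffer is too small. */
static int
make_src_crop (int x, int y, int w, int h, char *out, size_t out_len)
{
  if (x < 0 || y < 0 || w <= 0 || h <= 0)
    return -1;                  /* offsets must be >= 0, sizes > 0 */
  if (snprintf (out, out_len, "%d:%d:%d:%d", x, y, w, h) >= (int) out_len)
    return -1;                  /* output buffer too small */
  return 0;
}
```

make_src_crop (0, 0, 320, 200, buf, sizeof buf) yields the "0:0:320:200" string used in the commands above.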

Just in case, here’s the full source of the altered test-mp4.c (I’ve also commented out the audio payloader on line 151 of the original, because my .mp4 contains no audio stream).

/* GStreamer
 * Copyright (C) 2008 Wim Taymans <wim.taymans at gmail.com>
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <gst/gst.h>

#include <gst/rtsp-server/rtsp-server.h>

#define DEFAULT_RTSP_PORT "8554"

static char *port = (char *) DEFAULT_RTSP_PORT;

static GOptionEntry entries[] = {
  {"port", 'p', 0, G_OPTION_ARG_STRING, &port,
      "Port to listen on (default: " DEFAULT_RTSP_PORT ")", "PORT"},
  {NULL}
};

/* called when a stream has received an RTCP packet from the client */
static void
on_ssrc_active (GObject * session, GObject * source, GstRTSPMedia * media)
{
  GstStructure *stats;

  GST_INFO ("source %p in session %p is active", source, session);

  g_object_get (source, "stats", &stats, NULL);
  if (stats) {
    gchar *sstr;

    sstr = gst_structure_to_string (stats);
    g_print ("structure: %s\n", sstr);
    g_free (sstr);

    gst_structure_free (stats);
  }
}

static void
on_sender_ssrc_active (GObject * session, GObject * source,
    GstRTSPMedia * media)
{
  GstStructure *stats;

  GST_INFO ("source %p in session %p is active", source, session);

  g_object_get (source, "stats", &stats, NULL);
  if (stats) {
    gchar *sstr;

    sstr = gst_structure_to_string (stats);
    g_print ("Sender stats:\nstructure: %s\n", sstr);
    g_free (sstr);

    gst_structure_free (stats);
  }
}

/* signal callback when the media is prepared for streaming. We can get the
 * session manager for each of the streams and connect to some signals. */
static void
media_prepared_cb (GstRTSPMedia * media)
{
  guint i, n_streams;

  n_streams = gst_rtsp_media_n_streams (media);

  GST_INFO ("media %p is prepared and has %u streams", media, n_streams);

  for (i = 0; i < n_streams; i++) {
    GstRTSPStream *stream;
    GObject *session;

    stream = gst_rtsp_media_get_stream (media, i);
    if (stream == NULL)
      continue;

    session = gst_rtsp_stream_get_rtpsession (stream);
    GST_INFO ("watching session %p on stream %u", session, i);

    g_signal_connect (session, "on-ssrc-active",
        (GCallback) on_ssrc_active, media);
    g_signal_connect (session, "on-sender-ssrc-active",
        (GCallback) on_sender_ssrc_active, media);
  }
}

static void
media_configure_cb (GstRTSPMediaFactory * factory, GstRTSPMedia * media)
{
  /* connect our prepared signal so that we can see when this media is
   * prepared for streaming */
  g_signal_connect (media, "prepared", (GCallback) media_prepared_cb, factory);
}

int
main (int argc, char *argv[])
{
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;
  GOptionContext *optctx;
  GError *error = NULL;
  gchar *str;

  optctx = g_option_context_new ("<filename.mp4> - Test RTSP Server, MP4");
  g_option_context_add_main_entries (optctx, entries, NULL);
  g_option_context_add_group (optctx, gst_init_get_option_group ());
  if (!g_option_context_parse (optctx, &argc, &argv, &error)) {
    g_printerr ("Error parsing options: %s\n", error->message);
    g_option_context_free (optctx);
    g_clear_error (&error);
    return -1;
  }

  if (argc < 2) {
    g_print ("%s\n", g_option_context_get_help (optctx, TRUE, NULL));
    return 1;
  }
  g_option_context_free (optctx);

  loop = g_main_loop_new (NULL, FALSE);

  /* create a server instance */
  server = gst_rtsp_server_new ();
  g_object_set (server, "service", port, NULL);

  /* get the mount points for this server, every server has a default object
   * that be used to map uri mount points to media factories */
  mounts = gst_rtsp_server_get_mount_points (server);

  str = g_strdup_printf ("( "
      "filesrc location=\"%s\" ! qtdemux name=d "
      "d. ! queue ! h264parse ! rtph264pay pt=96 name=pay0 " ")", argv[1]);
//      "d. ! queue ! rtpmp4apay pt=97 name=pay1 " ")", argv[1]);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines. 
   * any launch line works as long as it contains elements named pay%d. Each
   * element with pay%d names will be a stream */
  factory = gst_rtsp_media_factory_new ();
  /* Set only TCP transport for stream ========================================== */
  gst_rtsp_media_factory_set_protocols(factory, GST_RTSP_LOWER_TRANS_TCP);
  /* ============================================================================ */
  gst_rtsp_media_factory_set_launch (factory, str);
  g_signal_connect (factory, "media-configure", (GCallback) media_configure_cb,
      factory);
  g_free (str);

  /* attach the test factory to the /test url */
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);

  /* don't need the ref to the mapper anymore */
  g_object_unref (mounts);

  /* attach the server to the default maincontext */
  gst_rtsp_server_attach (server, NULL);

  /* start serving */
  g_print ("stream ready at rtsp://127.0.0.1:%s/test\n", port);
  g_main_loop_run (loop);

  return 0;
}

Hi,
We can reproduce the issue and will check whether TCP mode can be supported in a future release. Please run UDP mode on DS4.0.1 for now.

Hi,

I have two main problems with UDP

  1. In most real-world deployments (CCTV systems), the network topology doesn’t allow UDP transport between devices.
  2. Our cameras do not support UDP transport at all.

And again: DeepStream currently supports TCP, and without image manipulation everything works fine. It’s either nvvideoconvert itself, or the interaction between nvv4l2decoder and nvvideoconvert, that causes the issue.

I have the exact same problems as Pavel and his analysis is 100% correct - this is not a feature request, this is a bug. This makes any sort of installable appliance a serious hassle, because we are forced to fall back on software decoding. Software decode is prohibitively CPU intensive if you have several cameras (which I do in many cases).

Hello DaneLLL,

Do you have any updates on the issue? It is a real problem for us: running inference on the uncropped Full HD frames from the camera causes significant latency, which leads to wrong results in other components. So the issue is a real blocker for our Jetson Nano project.

Thanks.

Hi,
For running r32.2.1+DS4.0.1 on Jetson platforms, please try the attachment.
r32_21_JETSON_TEST_libgstnvvideoconvert.zip (176 KB)

Hello,

I can confirm that the new version of the library works fine with the TCP + src-crop pipeline. Excellent!

DaneLLL, thank you very much for your support.

Any plans to include the fixed library in an upcoming release of the DeepStream SDK?

Hi,

Yes, it will be fixed in the next release. Please use the attached library as a quick fix on DS4.0.1.

Is there a version that will work on dGPUs / T4s as well? That would be awesome.