When I receive a video stream on localhost with udpsrc, the incoming video is laggy and shows mosaic artifacts

I have tried to receive a video stream on localhost with udpsrc.
The incoming video stream is laggy and shows mosaic artifacts. I changed my H265 encoder and decoder and tried again, but nothing changed.

This is my sender (server) pipeline:

gst-launch-1.0 v4l2src device=/dev/video8 io-mode=mmap ! video/x-raw, width=1920, height=1080, framerate=25/1 ! nvvidconv ! omxh265enc bitrate=5000000 ! mpegtsmux ! udpsink host=10.0.3.30 port=11002

This is my receiver pipeline:

gst-launch-1.0 udpsrc port=11002 ! video/mpegts, width=1920, height=1080, framerate=25/1 ! tsdemux ! h265parse ! omxh265dec ! nvvidconv ! omxh265enc bitrate=5000000 ! mpegtsmux name=mux ! filesink location=1.avi

Hi,
Please try nvv4l2h265enc and nvv4l2decoder. You may refer to this example:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL
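
For instance, a receiver sketch using nvv4l2decoder explicitly (the encoder side with nvv4l2h265enc is shown in the next pipeline below; port 11002 is taken from your pipelines and autovideosink for display is an assumption). The pipeline string can be fed to gst_parse_launch in a C program, or run as-is with gst-launch-1.0:

GstElement *rx_pipeline = gst_parse_launch (
    "udpsrc port=11002 ! tsdemux ! h265parse ! nvv4l2decoder ! "
    "nvvidconv ! autovideosink", NULL);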

Another thing to try if you want to keep UDP would be using MP2T-RTP:

Sender:

gst-launch-1.0 v4l2src device=/dev/video8 io-mode=mmap ! video/x-raw, width=1920, height=1080, framerate=25/1 ! nvvidconv ! nvv4l2h265enc insert-sps-pps=1 idrinterval=25 insert-vui=1 ! mpegtsmux ! rtpmp2tpay ! udpsink  host=10.0.3.30 port=11002 auto-multicast=0

Receiver:

# Display:
gst-launch-1.0 udpsrc port=11002 ! application/x-rtp,encoding-name=MP2T,payload=33,clock-rate=90000 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! decodebin ! autovideosink

It is unclear why your receiver decodes and then re-encodes with the same properties.
Also note that you currently save with an avi file extension, but you would actually be saving in TS format.
You would use avimux for creating an AVI file, but it doesn’t support H265 (H264 is supported). For H265, you would use matroskamux with MKV as container:

gst-launch-1.0 udpsrc port=11002 ! application/x-rtp,encoding-name=MP2T,payload=33,clock-rate=90000 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h265parse ! matroskamux ! filesink location=test_h265.mkv

I couldn’t find how to implement this “rtpmp2tpay” element.
I’m developing in C, and I couldn’t find in the documentation how it is written or in which procedure it should go.
Can you help me with how to add it to the code? Information on this element on the Internet is very limited.

The simplest would be to use the GStreamer C API, such as:

#include <stdlib.h>     /* for exit() */
#include <gst/gst.h>

int main (gint argc, gchar * argv[])
{  
	gst_init (&argc, &argv);
	GMainLoop *loop = g_main_loop_new (NULL, FALSE);

	const gchar *pipeline_str = "v4l2src device=/dev/video8 io-mode=mmap ! video/x-raw, width=1920, height=1080, framerate=25/1 ! nvvidconv ! nvv4l2h265enc insert-sps-pps=1 idrinterval=25 insert-vui=1 ! mpegtsmux ! rtpmp2tpay ! udpsink  host=10.0.3.30 port=11002 auto-multicast=0";
	GstElement *pipeline = gst_parse_launch (pipeline_str, NULL);
	if (!pipeline) {
		g_error ("Failed to create pipeline\n");
		exit(-1);
	}

	/* Ok, successfully created the pipeline, now start it */
	gst_element_set_state (pipeline, GST_STATE_READY);
	gst_element_set_state (pipeline, GST_STATE_PLAYING);

	/* wait until it's up and running or failed */
	if (gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE) == GST_STATE_CHANGE_FAILURE) {
		g_error ("Failed to go into PLAYING state");
		exit(-2);
	}

	g_print ("Running ...\n");
	g_main_loop_run (loop);

	return 0;
}

and build with:

gcc -Wall -o gst_testlaunch_rtpmp2t -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include gst_testlaunch_rtpmp2t.c -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0
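
If you want the program to stop on an error or at end of stream instead of running forever, you could also attach a bus watch to the example above; here is a minimal sketch (the callback name bus_cb is just for illustration):

static gboolean
bus_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GMainLoop *loop = (GMainLoop *) user_data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err = NULL;
      gchar *dbg = NULL;
      gst_message_parse_error (msg, &err, &dbg);
      g_printerr ("ERROR: %s\n", err->message);
      g_clear_error (&err);
      g_free (dbg);
      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }
  return TRUE;
}

/* In main(), before g_main_loop_run (loop): */
GstBus *bus = gst_element_get_bus (pipeline);
gst_bus_add_watch (bus, bus_cb, loop);
gst_object_unref (bus);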

If you want to implement it yourself, GStreamer is open source and you can find the code on GitHub (repo gst-plugins-good).

Not this way. I’m building a deeper structure and I define the elements one by one; in short, I don’t use gst_parse_launch. Is rtpmp2tpay created with gst_element_factory_make? I’m not sure whether we should define the src and sink caps with gst_caps_new_simple; can I get this information? With g_object_set, which object are the values attached to: payload, pay, src, etc.?

You may try something like this (I don’t know your camera format, so YUY2 is used here; also note that this example uses nvv4l2h264enc and video/x-h264 caps, so for H265 you would substitute nvv4l2h265enc and video/x-h265):

#include <stdbool.h>
#include <gst/gst.h>

static GMainLoop *loop;

gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *vidsrc, *conv, *enc, *mux, *pay, *udpsink;
  GstCaps *vidsrc2conv_caps, *conv2enc_caps, *enc2mux_caps, *mux2pay_caps, *pay2udp_caps;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* setup pipeline */
  pipeline = gst_pipeline_new ("pipeline");

  vidsrc = gst_element_factory_make ("v4l2src", "vidsrc");
  g_object_set (G_OBJECT (vidsrc), "device", "/dev/video8", NULL);
  g_object_set (G_OBJECT (vidsrc), "io-mode", 2, NULL);

  vidsrc2conv_caps = gst_caps_from_string("video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, format=(string)YUY2");

  conv = gst_element_factory_make ("nvvidconv", "conv");

  conv2enc_caps = gst_caps_from_string("video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12");

  enc = gst_element_factory_make  ("nvv4l2h264enc", "enc");
  g_object_set (G_OBJECT (enc), "insert-sps-pps", true, NULL);
  g_object_set (G_OBJECT (enc), "insert-vui", true, NULL);
  g_object_set (G_OBJECT (enc), "idrinterval", 25, NULL);

  enc2mux_caps = gst_caps_from_string("video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1");

  mux = gst_element_factory_make ("mpegtsmux", "mux");

  mux2pay_caps = gst_caps_from_string("video/mpegts, systemstream=(boolean)true, packetsize=(int)188");

  pay = gst_element_factory_make ("rtpmp2tpay", "pay");

  pay2udp_caps = gst_caps_from_string("application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33");

  udpsink = gst_element_factory_make ("udpsink", "udpsink");
  g_object_set (G_OBJECT (udpsink), "host", "10.0.3.30", NULL);
  g_object_set (G_OBJECT (udpsink), "port", 11002, NULL);
  g_object_set (G_OBJECT (udpsink), "auto-multicast", false, NULL);

  gst_bin_add_many (GST_BIN (pipeline), vidsrc, conv, enc, mux, pay, udpsink, NULL);

  if (!gst_element_link_filtered(vidsrc, conv, vidsrc2conv_caps)) {
        g_printerr("Fail to gst_element_link_filtered vidsrc -> conv\n");
        return -1;
  }

  if (!gst_element_link_filtered(conv, enc, conv2enc_caps)) {
        g_printerr("Fail to gst_element_link_filtered conv -> enc\n");
        return -1;
  }

  if (!gst_element_link_filtered(enc, mux, enc2mux_caps)) {
        g_printerr("Fail to gst_element_link_filtered enc -> mux\n");
        return -1;
  }

  if (!gst_element_link_filtered(mux, pay, mux2pay_caps)) {
        g_printerr("Fail to gst_element_link_filtered mux -> pay\n");
        return -1;
  }

  if (!gst_element_link_filtered(pay, udpsink, pay2udp_caps)) {
        g_printerr("Fail to gst_element_link_filtered pay -> udpsink\n");
        return -1;
  }
  /* Ok, pipeline successfully created */

  /* This will output details of pipeline going to play */
  g_signal_connect(pipeline, "deep-notify", G_CALLBACK(gst_object_default_deep_notify), NULL);

  /* Set pipeline playing */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* Clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);

  return 0;
}

Thank you for helping, sir.
My code is a little different.
I used gst_caps_new_simple. Is it possible for me to use this function (gst_caps_new_simple) multiple times?
My pipeline is linked with gst_element_link_many: src, conv, enc, muxer, pay, overlay, tee,
queue_udp, udpsink,
queue_file, filesink.
ERROR: GStreamer-CRITICAL gst_element_link_pads_full: assertion 'GST_IS_ELEMENT (dest)' failed

In short, can I use gst_caps_new_simple two times at the same time? For example:

main_caps = gst_caps_new_simple ("video/x-raw-yuv",
    "format", GST_TYPE_FOURCC, fourcc,
    "framerate", GST_TYPE_FRACTION,
    dec->info.fps_numerator, dec->info.fps_denominator,
    "pixel-aspect-ratio", GST_TYPE_FRACTION, par_num, par_den,
    "width", G_TYPE_INT, dec->width, "height", G_TYPE_INT, dec->height,
    "color-matrix", G_TYPE_STRING, "sdtv",
    "chroma-site", G_TYPE_STRING, "jpeg", NULL);
g_object_set (G_OBJECT (src), "caps", main_caps, NULL);
gst_caps_unref (main_caps);

second_caps = gst_caps_new_simple ("video/mpegts",
    "packetsize", G_TYPE_INT, 188,
    "systemstream", G_TYPE_BOOLEAN, TRUE,
    NULL);
g_object_set (G_OBJECT (pay), "caps", second_caps, NULL);
gst_caps_unref (second_caps);
…
Is such a definition possible?

There are plugins that have properties. You can use g_object_set for setting properties of a previously created plugin.

There are caps, which define the format of the data between two plugins. These caps must match both the producer plugin’s SRC pad and the consumer plugin’s SINK pad.

It may get confusing because some plugins, such as udpsrc, do have a caps property.
What you are trying can only work with such plugins.
rtpmp2tpay doesn’t have a caps property (nor do nvvidconv, nvv4l2h264enc, …).
You would check a plugin’s properties and SRC and SINK caps with:

gst-inspect-1.0 the_plugin_you_want_to_inspect

Furthermore, this wouldn’t ensure that the caps match the next plugin’s input until you’ve linked to it.
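
As an illustration of that difference, here is a sketch reusing variable names from the example above (the udpsrc caps string matches the receiver pipelines earlier in this thread):

...

  /* udpsrc exposes a "caps" property, so caps can be set on the element itself */
  GstElement *src = gst_element_factory_make ("udpsrc", "src");
  GstCaps *rtp_caps = gst_caps_from_string (
      "application/x-rtp, media=(string)video, clock-rate=(int)90000, "
      "encoding-name=(string)MP2T, payload=(int)33");
  g_object_set (G_OBJECT (src), "caps", rtp_caps, NULL);
  gst_caps_unref (rtp_caps);

  /* rtpmp2tpay has no "caps" property, so the caps are enforced on the link
   * instead (gst_element_link_filtered effectively inserts a capsfilter) */
  if (!gst_element_link_filtered (mux, pay, mux2pay_caps))
        g_printerr ("Fail to gst_element_link_filtered mux -> pay\n");

...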

You may better consider adapting your code than expecting someone else to adapt it for you ;-)
The example above should be enough for a working solution to this topic’s question. If you still experience mosaicing because an intra frame was lost, then move to TCP transport.
You can also use a debugger to check the caps returned by the example if you really want to use gst_caps_new_simple for building caps, but it may not be easy (especially for the NVMM memory attribute).
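
If you still want to go down that path, here is a sketch of how NVMM caps can be built programmatically (reusing the conv2enc_caps name from the example above; resolution and format are just copied from it):

...

  /* The "(memory:NVMM)" part of "video/x-raw(memory:NVMM)" is a caps feature,
   * not a caps field, so gst_caps_new_simple alone cannot express it;
   * it can be attached afterwards with gst_caps_set_features. */
  conv2enc_caps = gst_caps_new_simple ("video/x-raw",
	"format", G_TYPE_STRING, "NV12",
	"width", G_TYPE_INT, 1920,
	"height", G_TYPE_INT, 1080,
	NULL);
  /* gst_caps_set_features takes ownership of the GstCapsFeatures */
  gst_caps_set_features (conv2enc_caps, 0,
	gst_caps_features_new ("memory:NVMM", NULL));

...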


I understand; I just didn’t know that gst_caps_new_simple could be used two or more times.
What can we use instead of rtpmp2tpay? I didn’t ask for code, I just wanted to learn the correct usage, but you wrote it as code.
Thank you so much.

The structure I built is very deep and my code is already written, but I could not solve the problem caused by rtpmp2tpay, and I needed information about using multiple caps. Many explanations are available in the documentation; I just wanted help with the ones where information is limited.

I fail to understand what you’re asking for.
If it is just about using gst_caps_new_simple, here are the caps before and after rtpmp2tpay:

...

  //mux2pay_caps = gst_caps_from_string("video/mpegts, systemstream=(boolean)true, packetsize=(int)188");
  mux2pay_caps = gst_caps_new_simple("video/mpegts", 
	"systemstream", G_TYPE_BOOLEAN, true,
	"packetsize", G_TYPE_INT, 188,
	NULL);

  pay = gst_element_factory_make ("rtpmp2tpay", "pay");

  //pay2udp_caps = gst_caps_from_string("application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T, payload=(int)33");
  pay2udp_caps = gst_caps_new_simple("application/x-rtp",
	 "media", G_TYPE_STRING, "video",
	 "encoding-name", G_TYPE_STRING, "MP2T",
	 "clock-rate", G_TYPE_INT, 90000,
	 "payload", G_TYPE_INT, 33,
	 NULL);

  udpsink = gst_element_factory_make ("udpsink", "udpsink");
  g_object_set (G_OBJECT (udpsink), "host", "10.0.3.30", NULL);
  g_object_set (G_OBJECT (udpsink), "port", 11002, NULL);
  g_object_set (G_OBJECT (udpsink), "auto-multicast", false, NULL);

...
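
One more note on this: gst_element_link_filtered does not take ownership of the filter caps, so after the links are made the caps built here may be unreferenced. A sketch, reusing the link calls from the longer example above:

...

  if (!gst_element_link_filtered (mux, pay, mux2pay_caps) ||
      !gst_element_link_filtered (pay, udpsink, pay2udp_caps)) {
        g_printerr ("Fail to gst_element_link_filtered\n");
        return -1;
  }
  /* The filter caps were only used to constrain the links; drop our references */
  gst_caps_unref (mux2pay_caps);
  gst_caps_unref (pay2udp_caps);

...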

To summarize this topic, I think that:

  • You’re seeing mosaicing because an I-frame was lost in the UDP traffic (note that UDP is just datagrams with no flow control).
  • There are many possible causes that may hog your UDP stack or network.
  • So streaming MPEG-TS directly over UDP may not be reliable.
  • A first step would be using RTP-MP2T, with rtpmp2tpay on the sender and rtpmp2tdepay on the receiver. This may help.
  • If that is not enough, or if losing an I-frame is not acceptable, you would have to move to TCP transport (a sketch follows below).
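
A minimal sketch of such a TCP sender (the listening address 0.0.0.0 and the port are placeholders to adapt; the capture and encoder settings are reused from the RTP sender above). A receiver would connect with something like tcpclientsrc host=<sender-ip> port=11002 ! tsdemux ! h265parse ! nvv4l2decoder ! nvvidconv ! autovideosink:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);

  /* Serve the TS stream over TCP; tcpserversink listens for a client connection */
  GstElement *pipeline = gst_parse_launch (
      "v4l2src device=/dev/video8 io-mode=mmap ! "
      "video/x-raw, width=1920, height=1080, framerate=25/1 ! "
      "nvvidconv ! nvv4l2h265enc insert-sps-pps=1 idrinterval=25 insert-vui=1 ! "
      "mpegtsmux ! tcpserversink host=0.0.0.0 port=11002", NULL);
  if (!pipeline) {
    g_printerr ("Failed to create pipeline\n");
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  g_main_loop_unref (loop);
  return 0;
}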

If that is still not enough, you may better explain your expectations in another topic, but as this may be more of a GStreamer issue than a Jetson-specific one, you may prefer to post it in the GStreamer devel forum.


Hi again. The above pipeline works great on the command line, but when we try to use the same receive pipeline in a Python script, the saved video plays back too fast. It does not match the original 25 fps, and the timestamps do not match either. Any support would be really appreciated.

Please help keep this topic clear, with an answer matching the topic’s title, as that is what the next users searching will find.

Please create another topic for your issue, which is different from this one.

