gst-rtsp-server problem on Jetson TX2

Hi,
I’m a total amateur in this field. I have a running application that sends a UDP video stream from the target (Jetson TX2 running JetPack 3.3) to the host computer (Windows 10). Now I need to send the video over RTSP instead, but it always fails. I can run the examples “test-launch” and “test-mp4” successfully, although “test-launch” only works with videotestsrc and H.265. What I actually need is to capture video from an HDMI/IR camera using v4l2src, encode it as H.264 with the hardware encoder, and send it to my host computer. I’ve searched this forum and Googled around, but no luck yet. I’ve also tried to adapt the examples into my application, but before that I need to make sure they work with the camera. An older version of the application works fine on JetPack 3.1, so what do I need to do to make it work on JetPack 3.3?

I also had a problem building gst-rtsp-server on JetPack 3.3. I cloned gst-rtsp-server from the GStreamer git repository (https://github.com/GStreamer/gst-rtsp-server.git) and ran autogen.sh in the root directory, but it fails to find GStreamer, even though I already have GStreamer v1.8.3 installed with all the plugins and dev modules on my Jetson TX2. I’m not sure what’s causing this problem; is it a directory problem? To bypass the error, I just copied the pre-compiled gst-rtsp-server from my older JetPack 3.1 setup and built again on JetPack 3.3. My goal is to build gst-rtsp-server properly and then use the examples with the camera. Any suggestions are greatly appreciated.

Hi,
Please refer to this post
You can simply install libgstrtspserver-1.0 through ‘apt-get install’

Please also check this post to run your v4l2src in a supported format.

Thanks a lot, DaneLLL. You’re always the one who helps. I’ve managed to get it working on JetPack 3.1, but I will surely keep working on JetPack 3.3. One of our teammates previously tried the deb package, but it didn’t seem to work with our application; that’s why we need to build gst-rtsp-server from source. I’ll try and keep you updated.

I managed to run “test-launch”, but I still have a couple of doubts. I’m not sure what I’m doing wrong here:

  1. What is the exact reason test-launch doesn’t work with omxh264enc?
  2. On VLC player, I could run this pipeline and got about 300 ms latency:
./test-launch "v4l2src device=/dev/video0 ! video/x-raw, width=1920, height=1080, framerate=60/1, format=I420 ! nvvidconv ! video/x-raw(memory:NVMM), width=1920, height=1080, format=I420 ! omxh265enc bitrate=2500000 ! video/x-h265, width=1920, height=1080, stream-format=byte-stream, bitrate=5000 ! rtph265pay name=pay0 pt=96"
  3. I adapted the code from “test-appsrc” that comes with the gst-rtsp-server source and integrated it into my application, which also runs other GStreamer pipelines:
GMainLoop *loop;
	GstRTSPServer *server;
	GstRTSPMountPoints *mounts;
	GstRTSPMediaFactory *factory;
	GstRTSPMediaFactory *factory1;
	GstRTSPSessionPool *session;
	//gst_init (&argc, &argv);

	loop = g_main_loop_new (NULL, FALSE);
	session = gst_rtsp_session_pool_new();
	gst_rtsp_session_pool_set_max_sessions(session,255);
	/* create a server instance */
	server = gst_rtsp_server_new ();

	/* get the mount points for this server, every server has a default object
	* that can be used to map uri mount points to media factories */
	mounts = gst_rtsp_server_get_mount_points (server);

	/* make a media factory for a test stream. The default media factory can use
	* gst-launch syntax to create pipelines. 
	* any launch line works as long as it contains elements named pay%d. Each
	* element with pay%d names will be a stream */
	factory = gst_rtsp_media_factory_new ();
	factory1 = gst_rtsp_media_factory_new ();

	//====================================================================
	gst_rtsp_media_factory_set_launch (factory,
		"( udpsrc port=8001 caps=application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H265 ! rtph265depay ! rtph265pay pt=96 name=pay0 )");
	gst_rtsp_media_factory_set_launch (factory1,
		"( udpsrc port=8009 caps=application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H265 ! rtph265depay ! rtph265pay pt=96 name=pay0 )");

	gst_rtsp_media_factory_set_shared (factory, TRUE);
	gst_rtsp_media_factory_set_shared (factory1,TRUE);
	/* attach the test factory to the /test url */
	gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
	gst_rtsp_mount_points_add_factory (mounts, "/test1", factory1);
	/* don't need the ref to the mapper anymore */
	g_object_unref (mounts);

	/* attach the server to the default maincontext */
	gst_rtsp_server_attach (server, NULL);
	//g_timeout_add_seconds(2,(GSourceFunc)timeout,server);
	/* start serving */
	g_print ("streams ready at rtsp://127.0.0.1:8554/test and rtsp://127.0.0.1:8554/test1\n");
	g_main_loop_run (loop);

I get more than 15 s latency here.

I’m not sure if h264 video would be any better than this. Is there any other mechanism to do this? How can I reduce the latency? I’m really stuck here. Thanks again.

Hi,
For using omxh264enc in test-launch, you may check
https://devtalk.nvidia.com/default/topic/1028201/jetson-tx2/send-tx2-encoded-stream-over-wifi/post/5275185/#5275185
https://devtalk.nvidia.com/default/topic/1039201/jetson-tx2/omxh264enc-with-gst-rtsp-server-1-8-3-tar-xz/post/5280155/#5280155