Using GStreamer to compress a live video stream to H.264 and stream it to another host over RTSP

The end goal: connect two IMX219 cameras to a Jetson Orin Nano and, at runtime, compress the live video streams from both cameras to H.264 with GStreamer and transmit them to another host over RTSP.

For now I need to compress the live video stream from a single IMX219 camera. Does NVIDIA provide any official samples or documentation for this?
I have noticed that the GStreamer instructions are all terminal-based; can the same thing be done from a C program?

When I connect to the Jetson Orin Nano via PuTTY and try to test with
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080' ! nvv4l2h264enc insert-sps-pps=true ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e
the terminal shows the following:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0: Cannot identify device '/dev/v4l2-nvenc'.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/v4l2_calls.c(637): gst_v4l2_open (): /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...
What does this error message mean, and how can I fix it?

Hi,
Orin Nano does not have a hardware encoder. Please use a software encoder such as the x264enc plugin.

Hello, I ran this command in my remote terminal: gst-inspect-1.0 x264enc

The terminal output is as follows:

nvidia-user@tegra-ubuntu:~$ gst-inspect-1.0 x264enc
Factory Details:
  Rank                     primary (256)
  Long-name                x264enc
  Klass                    Codec/Encoder/Video
  Description              H264 Encoder
  Author                   Josef Zlomek <josef.zlomek@itonis.tv>, Mark Nauwelaerts <mnauw@users.sf.net>

Plugin Details:
  Name                     x264
  Description              libx264-based H264 plugins
  Filename                 /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstx264.so
  Version                  1.16.2
  License                  GPL
  Source module            gst-plugins-ugly
  Source release date      2019-12-03
  Binary package           GStreamer Ugly Plugins (Ubuntu)
  Origin URL               https://launchpad.net/distros/ubuntu/+source/gst-plugins-ugly1.0

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstVideoEncoder
                         +----GstX264Enc

Implemented Interfaces:
  GstPreset

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw
              framerate: [ 0/1, 2147483647/1 ]
                  width: [ 16, 2147483647 ]
                 height: [ 16, 2147483647 ]
                 format: { (string)Y444, (string)Y42B, (string)I420, (string)YV12, (string)NV12, (string)Y444_10LE, (string)I422_10LE, (string)I420_10LE }


This seems to show that the plugin is already installed, so how should I use it to fix the earlier error?

Since I want to compress the camera's live video stream to H.264, I found this command in the official documentation:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080' ! nvv4l2h264enc insert-sps-pps=true ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e
Is there a different command for the Jetson Orin Nano, or other documentation relevant to my use case?

Hi,
Please try

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12, width=1920, height=1080' ! nvvidconv ! video/x-raw,format=I420 ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e

Hi, I tried this command:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12, width=1920, height=1080' ! nvvidconv ! video/x-raw,format=I420 ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e

The log output is as follows:

nvidia-user@tegra-ubuntu:~$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12, width=1920, height=1080' ! nvvidconv ! video/x-raw,format=I420 ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=8001 sync=false -e
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 2
   Output Stream W = 1920 H = 1080
   seconds to Run    = 0
   Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

But after running this command on the Jetson Orin Nano, no video appears. What could be the cause?
(The picture works fine when I test with nvgstcapture-1.0.)

Hi,
Please refer to
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL

You need to run a client command on Orin Nano or another device.
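
For reference, a receiving pipeline run locally on the Orin Nano might look like the sketch below; the port must match the udpsink port of the sending pipeline (8001 in the command above), and nv3dsink is only one possible display sink:

$ gst-launch-1.0 udpsrc port=8001 caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nv3dsink sync=false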

Hi,
I also tried this command:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! filesink location=~/Desktop/H264-realtime.mp4 -e

It appears to capture the live camera stream and record it to an .mp4 file with H.264 encoding, but when I run it the terminal shows the following error:


nvidia-user@tegra-ubuntu:~/Desktop$  gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! filesink location=~/Desktop/H264-realtime.mp4 -e
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0: Cannot identify device '/dev/v4l2-nvenc'.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/v4l2_calls.c(637): gst_v4l2_open (): /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...

I don't understand the cause. For the Jetson Orin Nano, do the command parameters need to be changed, and if so, how?
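
Judging from the earlier reply, the cause is the same: the Orin Nano has no hardware encoder, so nvv4l2h264enc cannot open /dev/v4l2-nvenc. A sketch of the same recording pipeline with the software encoder substituted (x264enc expects system memory, hence the nvvidconv step, and its bitrate property is in kbit/s, so 8000 roughly corresponds to the 8000000 bit/s above):

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! video/x-raw,format=I420 ! x264enc bitrate=8000 ! h264parse ! qtmux ! filesink location=~/Desktop/H264-realtime.mp4 -e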

Hello, I tried following your link, but it failed. Details below:

I used the Jetson Orin Nano as the server:

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! tee name=t ! nvv4l2h264enc insert-sps-pps=true idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=10.19.106.10 port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

Running this command produced the following error:

nvidia-user@tegra-ubuntu:~/Desktop$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=192.168.169.134 port=5000 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0
Setting pipeline to PAUSED ...

Using winsys: x11
ERROR: Pipeline doesn't want to pause.
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
ERROR: from element /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0: Cannot identify device '/dev/v4l2-nvenc'.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/v4l2_calls.c(637): gst_v4l2_open (): /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0:
system error: No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...

The client did not report any errors. How should the server command be modified?

Hi,
Most reference commands use the hardware encoder. Since you are using an Orin Nano, please remember to replace it with the software encoder.
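
Applied to the server pipeline above, a possible sketch is shown below. Assumptions: x264enc is fed from the system-memory branch before any NVMM conversion, key-int-max=15 stands in for idrinterval=15, and config-interval=1 on rtph264pay plays a role similar to insert-sps-pps=true by resending SPS/PPS periodically:

$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720,format=I420 ! timeoverlay valignment=4 halignment=1 ! tee name=t ! queue ! x264enc key-int-max=15 ! h264parse ! rtph264pay config-interval=1 ! udpsink host=10.19.106.10 port=5000 sync=0 t. ! queue ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! nvegltransform ! nveglglessink sync=0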

Hello,
I am now running the following commands on the Jetson Orin Nano to capture the video stream and send it locally:

In the first terminal:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12, width=1920, height=1080' ! nvvidconv ! video/x-raw,format=I420 ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=0.0.0.0 port=8001 sync=false -e

In the second terminal:
gst-launch-1.0 udpsrc address=192.168.1.244 port=8001 caps='application/x-rtp, encoding-name=(string)H264, payload=(int)96' ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nv3dsink -e

But I have noticed a problem:
With width=1920, height=1080, the picture is very choppy and the latency is high.
Only when I set width=640, height=320 does the picture become smooth, but there is still roughly 2 s of latency, with CPU usage around 220%.
With width=1280, height=720 the picture stutters from time to time, with CPU usage around 480%.
I don't understand why. The board is supposed to handle 1920×1080, yet with this command the highest usable resolution is only 640×320.

I also don't understand where the 2 s latency comes from.

Hi,
Please execute sudo nvpmodel -m 0 and sudo jetson_clocks to run the CPU cores at maximum frequency, since H.264 encoding is done on the CPU.
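That is:

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

If the CPU load and latency are still too high, x264enc also exposes encoder-side properties such as speed-preset (e.g. ultrafast) and tune (e.g. zerolatency) that trade compression efficiency for lower CPU usage; treat any such values as starting points to experiment with rather than a fixed recommendation.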

And you can consider using an Orin NX or Xavier NX to utilize the hardware encoder.

That improved things somewhat, but the problem I described is still there.
Apart from using a board with a hardware encoder, is there really no better option for the Jetson Orin Nano?

Also, I would like to try something else:
capture the video stream on the Jetson Orin Nano, compress it to H.264, and send it to an Ubuntu virtual machine running in VMware. Could you provide the corresponding commands?

Hi,
You can run the following GStreamer command on Ubuntu to receive and decode the UDP stream:

$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0
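
On the Orin Nano side, the earlier software-encoder pipeline can be reused with the destination changed to the VM; a sketch, where <VM_IP> is a placeholder for the virtual machine's actual IP address (the VM must be reachable from the Jetson, e.g. via bridged networking) and the port matches the udpsrc port above:

$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12, width=1920, height=1080' ! nvvidconv ! video/x-raw,format=I420 ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=<VM_IP> port=5000 sync=false -e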

You are already running the optimal solution on Orin Nano. For further improvement, a hardware encoder is required, so it is better to try Xavier NX or Orin NX.
