GStreamer nv_omx_h264enc codec problem with UDP stream.


I would like to encode video to H.264 on my Tegra T30 Apalis board with L4T (Angstrom Linux) and send it over UDP. When I run this command:
gst-launch -v videotestsrc ! ffmpegcolorspace ! 'video/x-raw-yuv' ! nv_omx_h264enc ! rtph264pay ! fakesink silent=0 (or with udpsink)

I get this error: "Element doesn't implement handling of this stream."
The full error output is below.
I have tried many similar pipelines, and also compiled an app in C, with the same result. The problem is clearly between the nv_omx_h264enc and rtph264pay plugins: when I use x264enc instead of nv_omx_h264enc, it works. In the rtph264pay sources I can see that the payloader never gets a clock-rate set, but I don't know where to fix that. Maybe it is a plugin version problem.

gst-launch-0.10 version 0.10.36
gst-plugins-good - 0.10.31-r13.7
nv_omx_h264enc version 0.10.1

Does anybody have an idea?

Thank you.

ERROR: from element /GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: Element doesn't implement handling of this stream. Please file a bug.
Additional debug info:
/home/tegradev/oe-core/build/out-eglibc/work/armv7ahf-vfp-neon-angstrom-linux-gnueabi/gst-plugins-base/0.10.36-r12/git/gst-libs/gst/rtp/gstbasertppayload.c(850): gst_basertppayload_prepare_push (): /GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0:
subclass did not specify clock-rate
ERROR: pipeline doesn't want to preroll.

Hi ozzyator,
This is how you can get it to work:

  1. Move to the latest L4T release; it has support for raw YUV input to nv_omx_h264enc.

  2. nv_omx_h264enc only supports I420 YUV, so you need to request that format in the caps after videotestsrc.

So on the new release, your transmit pipeline should look like this:

gst-launch -v videotestsrc ! 'video/x-raw-yuv, width=320, height=240, format=(fourcc)I420' ! nv_omx_h264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! rtph264pay ! udpsink host= port= -e

(I specify byte-stream because rtph264depay expects byte-stream by default.)
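For reference, a matching receive pipeline on the other machine could look something like this. This is only a sketch: the port number and the choice of ffdec_h264 (the gst-ffmpeg decoder from the 0.10 era) are my assumptions, so substitute your own values and decoder. The important part is that udpsrc must be given RTP caps, including the 90000 Hz clock-rate that rtph264pay uses for H.264:

```shell
# Hypothetical receive side: depayload the RTP stream and decode it.
# clock-rate=90000 and encoding-name=H264 must match what rtph264pay sends.
gst-launch-0.10 -v udpsrc port=5000 \
  caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' \
  ! rtph264depay ! ffdec_h264 ! xvimagesink
```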

Please let me know if you face any issues/need any more help.


You shouldn't be using ffmpegcolorspace. It is a software implementation and is too slow for many use cases.

If your L4T build is based on the latest R16.3 release from NVIDIA, you should have a GStreamer element called "nvvidconv".

With that I’ve been doing H.264 streaming over UDP with something like this:

gst-launch-0.10 -v videotestsrc is-live=true do-timestamp=true ! capsfilter caps='video/x-raw-yuv,width=(int)640,height=(int)360,framerate=(fraction)30/1' ! nvvidconv ! capsfilter caps='video/x-nvrm-yuv' ! nv_omx_h264enc ! rtph264pay config-interval=1 ! fakesink
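To actually stream rather than discard the payloaded output, swap fakesink for udpsink. The host and port below are placeholders of my own choosing, not values from a real setup:

```shell
# Same pipeline, but sending RTP packets to a hypothetical receiver
# at 192.168.0.100:5000 instead of dropping them in fakesink.
gst-launch-0.10 -v videotestsrc is-live=true do-timestamp=true \
  ! capsfilter caps='video/x-raw-yuv,width=(int)640,height=(int)360,framerate=(fraction)30/1' \
  ! nvvidconv ! capsfilter caps='video/x-nvrm-yuv' \
  ! nv_omx_h264enc ! rtph264pay config-interval=1 \
  ! udpsink host=192.168.0.100 port=5000
```

config-interval=1 makes rtph264pay re-send SPS/PPS periodically, which lets a receiver join mid-stream.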

Thank you very much, that works.

My main problem was the undefined caps between the encoder and the RTP payloader.