v4l2-ctl capture works, gstreamer fails

We’ve got the following configuration defined:

  • csi 4 lane ar0231 sensor
  • GMSL2 SerDes to CSI-2 interface
  • Xavier running r31.1 release
  • RAW12 Bayer, GRBG pattern

With the help of fellow DevTalk community member Greg Rowe, we are able to capture single and multiple frames reliably using v4l2-ctl. (ref: https://devtalk.nvidia.com/default/topic/1051117/jetson-agx-xavier/csi-camera-capture-crash-causes-quot-vi-capture-dequeue-status-failed-quot-/?offset=2#5334740)

Example single frame capture:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1928,height=1208,pixelformat=BA12 --set-ctrl=sensor_mode=0 --stream-mmap --stream-count=1 --set-ctrl bypass_mode=0 --stream-to=ar0231.raw

Example 90 frame capture:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1928,height=1208,pixelformat=BA12 --set-ctrl=sensor_mode=0 --stream-mmap --stream-count=90 --set-ctrl bypass_mode=0 --stream-to=ar0231.raw

Converting this from Bayer format to RGB using raw2rgbpnm (ref: git://salottisipuli.retiisi.org.uk/~sailus/raw2rgbpnm.git) gives us results that we expect.
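
If raw2rgbpnm isn’t handy, another quick way to eyeball a capture is to collapse the 12-bit samples to 8 bits and wrap them in a grayscale PGM header. This is only a sketch of my own (the function name is mine, not from any tool above), and it assumes the driver stores each 12-bit sample in a 16-bit little-endian word, which matches the BA12 capture commands above:

```python
import array

def raw12_frame_to_pgm(raw: bytes, width: int, height: int) -> bytes:
    """Collapse 12-bit samples (in 16-bit little-endian words) to 8 bits
    and prepend a binary PGM (P5) header so any image viewer can show
    the bayer mosaic as grayscale."""
    samples = array.array("H")  # native-endian uint16; Xavier is little-endian
    samples.frombytes(raw[: width * height * 2])
    header = f"P5\n{width} {height}\n255\n".encode()
    # keep the 8 most significant of the 12 valid bits
    return header + bytes((v >> 4) & 0xFF for v in samples)
```

Feeding it the bytes of a single-frame ar0231.raw capture with width=1928, height=1208 and writing the result to a .pgm file shows the grayscale bayer mosaic, which is enough to confirm the sensor is producing sane data.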

When we try to use gstreamer, however, we encounter an “Internal data stream error.” I’ve reduced the command to the bare minimum and still see the problem. Here’s a capture with GST_DEBUG="*:2":

nvidia@jetson-0423418010090:~$ gst-launch-1.0 -v v4l2src device="/dev/video0" ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
0:00:00.083802415 18524   0x5587c52230 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: Internal data stream error.
0:00:00.083892755 18524   0x5587c52230 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: streaming stopped, reason not-negotiated (-4)
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Since v4l2-ctl works, I assumed the v4l2src plugin would work, but that does not seem to be the case.

I’ve found similar issues described here on the Community forums, but none of the threads so far have been much help. I’ve double-checked that the devname under the tegra-camera-platform node matches the device. I’m less sure about the various configuration settings (max_pixel_rate, isp_bw_margin_pct, etc.).

Can anyone share their insight on what may be the source of this problem?

Thanks,
–tim

I’ve tried the accepted answer from here (https://devtalk.nvidia.com/default/topic/1036600/jetson-tx2/imx214-camera-works-with-v4l2-ctl-but-not-gstreamer/post/5266537/#5266537) with no luck.

–tim

Not sure for recent releases, but gstreamer may only be able to handle 8-bit bayer. If your sensor driver provides an 8-bit bayer format and the loss of precision is acceptable, you may give it a try.

For offline use, if you capture to a file with v4l2-ctl and want gstreamer for further processing, you may have a look at this post and adapt it for 12 bits (it was for 10 bits) in order to generate 8-bit bayer suitable for gstreamer.
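
The 12-to-8-bit conversion itself is just a shift. A minimal sketch of that adaptation (the function name is mine; it again assumes each 12-bit sample sits in a 16-bit little-endian word, as in the captures above):

```python
import array

def raw12_to_bayer8(raw12: bytes) -> bytes:
    """Drop the 4 least significant bits of every 12-bit sample
    (stored in a 16-bit little-endian container) to get plain
    8-bit bayer data."""
    samples = array.array("H")  # native-endian uint16; Xavier is little-endian
    samples.frombytes(raw12)
    return bytes((v >> 4) & 0xFF for v in samples)
```

The resulting file should, in principle, be replayable with something like filesrc ! video/x-bayer,format=grbg,... ! bayer2rgb, since gstreamer’s bayer caps are 8-bit only; the exact caps string is an assumption to verify against your gstreamer version.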

@Honey_Patouceul:

Thanks for the tip! I had come across your post a few weeks ago about the gstreamer 8-bit bayer limitation and completely forgot about it. Your post here pointed me in the right direction.

For others that run into similar issues, I’ve been able to get things to work by doing the following:

  1. Downloading the NVIDIA Accelerated GStreamer User Guide and following the “GSTREAMER BUILD INSTRUCTIONS” for rebuilding gstreamer 1.14.4 for our L4T 31.1 platform.

  2. Updating and recompiling gstreamer’s gstv4l2object.c to add support for our GRBG12 format. (See RidgeRun wiki for what to change.)

  3. Downloading and compiling the “videoadjust” library from the gst-plugins-vision GitHub repo, which provides the ‘videolevels’ element. (Note also that I needed to copy the resulting lib into the manually created gstreamer lib/ path and define GST_PLUGIN_PATH to get this to work.) Further, I added a “stride_in” parameter to this plugin which allows me to override the stride value that it uses.

  4. Finally, I can render using the following pipeline:

gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-bayer,width=1920,height=1208,bpp=16,stride_in=3840" ! videolevels ! bayer2rgb ! nveglglessink

The downside is that this pipeline pegs the CPU at ~99%. Watching ‘top’ while starting from fakesink and slowly building up the pipeline shows that the CPU utilization spikes when ‘videolevels’ is introduced. As a result, I’m only getting maybe 1 fps vs the 30 fps I’d expect. But, for right now, it’s good enough for my testing purposes.

–tim

Tim,

Since you are using a bayer sensor, wouldn’t you want to take advantage of the ISP? You’ll get debayering, color correction (if tuned properly), and other image corrections for “free”.

The easiest test for bayer sensors is to run nvgstcapture.

Another test is:

gst-launch-1.0 nvarguscamerasrc \
    ! "video/x-raw(memory:NVMM)" \
    ! nvoverlaysink -ev

I will caution that, if you do not have certain device tree properties set correctly in tegra-camera-platform, this will not work. But it’s worth the effort to take advantage of the ISP processing and save your CPU and GPU cycles for better things!
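
For anyone checking the same thing: the tegra-camera-platform properties in question live in a node shaped roughly like the sketch below. Every badge, bus number, address, and path here is a placeholder to adapt to your own device tree; the key detail is that devname must be the sensor driver’s name followed by the I2C bus and address it probed on.

```dts
tegra-camera-platform {
    compatible = "nvidia, tegra-camera-platform";
    /* bandwidth/clock hints such as max_pixel_rate and
       isp_bw_margin_pct also live at this level */
    modules {
        module0 {
            badge = "ar0231_front";   /* placeholder */
            position = "front";       /* placeholder */
            orientation = "1";
            drivernode0 {
                pcl_id = "v4l2_sensor";
                /* "<driver name> <i2c-bus>-<i2c-addr>" */
                devname = "ar0231 2-0010";   /* placeholder */
                proc-device-tree = "/proc/device-tree/i2c@3180000/ar0231_a@10";   /* placeholder */
            };
        };
    };
};
```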

It just so happens that I have one of our 6 camera RSP boards on my desk with 6 ar0231 sensors attached. When I run the gst pipeline from above I get 30 fps and it’s fully debayered and color corrected. The CPU stats on this Tx2 system are below.

RAM 1520/7860MB (lfb 1411x4MB) CPU [8%@1113,off,off,10%@1113,11%@1113,9%@1113] EMC_FREQ 0% GR3D_FREQ 0% PLL@38C MCPU@38C PMIC@100C Tboard@35C GPU@37C BCPU@38C thermal@37.7C Tdiode@36.25C VDD_SYS_GPU 75/75 VDD_SYS_SOC 529/529 VDD_4V0_WIFI 0/0 VDD_IN 2777/2777 VDD_SYS_CPU 151/151 VDD_SYS_DDR 615/615

(Edited to add: nvgstcapture; performance stats, simplified pipeline.)

Greg,

What magic is required to get “gst-launch-1.0 nvarguscamerasrc” working? I finally got around to trying it as you mention above, and it keeps reporting “521 No cameras available”.

I’ve poked around different threads on the forums here that have reported similar issues but haven’t uncovered anything that has helped. As far as I’m aware, I have my device tree tegra-camera-platform nodes configured correctly…but that apparently isn’t the case. Lacking access to the source that reports the error in the first place makes troubleshooting pretty frustrating. :P

As mentioned, capturing via v4l2-ctl works flawlessly. I’d like to see what I can do to take advantage of the ISP to do the heavy-lifting of bayer conversion for me.

Updates since my top post:

  • now using an ar0233 sensor (same Bayer GRBG12)
  • now running L4T r32.1

Thanks,
–tim

I’ve annotated the ar0233 driver with debug code.

Observations:

Capturing via v4l2-ctl for a single frame yields this execution sequence:

  • power_on
  • set_mode
  • start_streaming
  • write_table
  • stop_streaming
  • power_off

When I execute “gst-launch-1.0 nvarguscamerasrc ! fakesink”, I see that all of my configured sensors are being accessed. The execution sequence for the sensors is:

  • power_on
  • power_off

My trace code indicates that power_on is not returning any error, so I don’t know what is failing that makes it jump straight to powering things off.

–tim

New topic created: nvarguscamerasrc: 521 no cameras available? - Jetson AGX Xavier - NVIDIA Developer Forums