Use CSI Camera inside a container on AGX/Nano with L4T 32.1

Hi folks,

I am currently using a Jetson AGX and a Jetson Nano to do some computer vision. My goal is to access a CSI camera stream inside a container in order to run some image processing algorithms. As you can guess, I am running into some issues. I have tried several methods and workarounds, so I will try to make this topic as clear as possible.

Note: I have the same issue on the Nano, but I will focus on the AGX here.

Note#2: it’s gonna be long, you may want to grab a coffee first…

Ok, let’s start.

# What I use:

Let me give some context first. These are the devices and software I am using:

Jetson AGX running on L4T 32.1

$~ cat /etc/nv_tegra_release
# R32 (release), REVISION: 1.0, GCID: 14531094, BOARD: t186ref, EABI: aarch64, DATE: Wed Mar 13 07:41:08 UTC 2019

CSI Camera OV5693

v4l2-ctl -d /dev/video0 --all
Driver Info (not using libv4l2):
	Driver name   : tegra-video
	Card type     : vi-output, ov5693 2-0036
	Bus info      : platform:15c10000.vi:2
	Driver version: 4.9.140

My AGX runs a custom kernel based on 4.9.140. It is basically the same as the one provided with Tegra 32.1, with a few more modules enabled. I cross-compiled the kernel on Ubuntu 16.04, i.e. built and encrypted the boot image, then applied it on the AGX by writing it to the right partition with dd (as the NVIDIA docs and some topics on this forum suggest).

Then I use a Docker image that I have created. This image is based on Ubuntu 18.04 and built on an arm64 host. The image has the following libraries installed:

  • Cuda 10.0.166
  • Tegra 32.1 libs (required for nvargus and the other Tegra libs)
  • GStreamer 1.14.1 (same version as on the host, built with the gst-install script provided by Tegra Multimedia)
  • OpenCV 3.4.6 (built from source) with Cuda and Gstreamer enabled (a bit off-topic, but since we are here :) …).
/# python3 -c "import cv2; print(cv2.getBuildInformation())"

General configuration for OpenCV 3.4.6 =====================================
  Version control:               3.4.6

    ...

    GStreamer:
      base:                      YES (ver 1.14.1)
      video:                     YES (ver 1.14.1)
      app:                       YES (ver 1.14.1)
      riff:                      YES (ver 1.14.1)
      pbutils:                   YES (ver 1.14.1)
    libv4l/libv4l2:              1.14.2 / 1.14.2
    v4l/v4l2:                    linux/videodev2.h
    ...

  NVIDIA CUDA:                   YES (ver 10.0, CUFFT CUBLAS FAST_MATH)
    NVIDIA GPU arch:             72 62 53
    NVIDIA PTX archs:

    ...

And I run the container as follows:

docker run -it --rm --privileged -v /dev:/dev my-container

(Not sure if I really need to mount /dev as I used --privileged but ¯_(ツ)_/¯…)
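For what it's worth, a narrower invocation is possible, at least in principle. This is only a sketch under two assumptions: that the container just needs the video nodes plus the Argus IPC socket (which nvargus-daemon creates at /tmp/argus_socket), and that the daemon runs on the host rather than in the container:

```shell
# Hypothetical alternative to --privileged + all of /dev:
# expose only the camera node and the nvargus-daemon socket
# (assumes nvargus-daemon is running on the host).
docker run -it --rm \
  --device /dev/video0 \
  -v /tmp/argus_socket:/tmp/argus_socket \
  my-container
```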

# 1st Method: Access the CSI Camera from inside the container

This is what I do inside my container:

root@fcb7a6215166:/# nvargus-daemon &
[1] 10
root@fcb7a6215166:/# nvbuf_utils: Could not get EGL display connection
=== NVIDIA Libargus Camera Service (0.97.3)=== Listening for connections...
root@fcb7a6215166:/#
root@fcb7a6215166:/# GST_DEBUG=2 gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! videoconvert ! fakevideosink

(gst-launch-1.0:11): GStreamer-WARNING **: 07:42:14.048: External plugin loader failed. This most likely means that the plugin loader helper binary was not found or could not be run. You might need to set the GST_PLUGIN_SCANNER environment variable if your setup is unusual. This should normally not be required though.
nvbuf_utils: Could not get EGL display connection
0:00:00.075397456    11   0x55ba2ef630 WARN                     omx gstomx.c:2826:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /root/.config:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
0:00:00.129991621    11   0x55ba2ef630 WARN      GST_PLUGIN_LOADING gstplugin.c:792:_priv_gst_plugin_load_file_for_registry: module_open failed: libXv.so.1: cannot open shared object file: No such file or directory

(gst-launch-1.0:11): GStreamer-WARNING **: 07:42:14.132: Failed to load plugin '/usr/local/gstreamer/lib/aarch64-linux-gnu/gstreamer-1.0/libgstxvimagesink.so': libXv.so.1: cannot open shared object file: No such file or directory
0:00:00.144880726    11   0x55ba2ef630 WARN      GST_PLUGIN_LOADING gstplugin.c:792:_priv_gst_plugin_load_file_for_registry: module_open failed: libfaac.so.0: cannot open shared object file: No such file or directory

(gst-launch-1.0:11): GStreamer-WARNING **: 07:42:14.147: Failed to load plugin '/usr/local/gstreamer/lib/aarch64-linux-gnu/gstreamer-1.0/libgstfaac.so': libfaac.so.0: cannot open shared object file: No such file or directory
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
=== gst-launch-1.0[11]: Connection established (7F7C0061D0)SCF: Error NotSupported: EGL_EXT_platform_device not supported (in src/services/gl/GLService.cpp, function initialize(), line 125)
SCF: Error NotSupported:  (propagating from src/services/gl/GLService.cpp, function startService(), line 47)
SCF: Error NotSupported:  (propagating from src/components/ServiceHost.cpp, function startServices(), line 136)
SCF: Error InvalidState: Service not running (in src/services/gl/GLService.cpp, function stopService(), line 56)
SCF: Error InvalidState:  (propagating from src/components/ServiceHost.cpp, function stopServicesInternal(), line 198)
SCF: Error NotSupported:  (propagating from src/api/CameraDriver.cpp, function initialize(), line 168)
SCF: Error InvalidState: Services are already stopped (in src/components/ServiceHost.cpp, function stopServicesInternal(), line 183)
SCF: Error NotSupported:  (propagating from src/api/CameraDriver.cpp, function getCameraDriver(), line 109)
(Argus) Error NotSupported:  (propagating from src/api/GlobalProcessState.cpp, function createCameraProvider(), line 204)
=== gst-launch-1.0[11]: CameraProvider initialized (0x7f740541f0)Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:521 No cameras available

^C #--> Keyboard interrupt

handling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:05.586656357
Setting pipeline to PAUSED ...
Setting pipeline to READY ...

^C #--> Keyboard interrupt

=== gst-launch-1.0[11]: Connection closed (7F7C0061D0)
=== gst-launch-1.0[11]: WARNING: CameraProvider was not destroyed before client connection terminated.
=== gst-launch-1.0[11]:          The client may have abnormally terminated. Destroying CameraProvider...
=== gst-launch-1.0[11]: CameraProvider destroyed (0x7f740541f0)
root@fcb7a6215166:/#
root@fcb7a6215166:/#
[1]+  Segmentation fault      (core dumped) nvargus-daemon

This line is probably the most suspicious:

=== gst-launch-1.0[11]: Connection established (7F7C0061D0)SCF: Error NotSupported: EGL_EXT_platform_device not supported (in src/services/gl/GLService.cpp, function initialize(), line 125)

I know there are some symlinks to set/fix for the Tegra and EGL libs, and I think I have fixed most of them:

cd /usr/lib/aarch64-linux-gnu \
  && ln -sf libGL.so.1.0.0               libGL.so \
  && ln -sf libGL.so.1.0.0               libGL.so.1 \
  && ln -sf libGLX.so.0.0.0              libGLX.so \
  && ln -sf libGLX.so.0.0.0              libGLX.so.0 \
  && ln -sf libGLdispatch.so.0           libGLdispatch.so \
  && ln -sf tegra/libcuda.so             libcuda.so \
  && ln -sf libEGL.so.1.0.0              libEGL.so \
  && ln -sf libEGL.so.1.0.0              libEGL.so.1 \
  && cd /usr/lib/aarch64-linux-gnu/tegra \
  && ln -sf libcuda.so.1.1               libcuda.so \
  && ln -sf libcuda.so.1.1               libcuda.so.1 \
  && ln -sf libnvidia-ptxjitcompiler.so.32.1 libnvidia-ptxjitcompiler.so \
  && ln -sf libnvidia-ptxjitcompiler.so.32.1 libnvidia-ptxjitcompiler.so.1 \

I don’t know if something is missing or if it is not possible (yet?).

# 2nd Method: Piping nvarguscamerasrc to a v4l2sink using v4l2loopback

I tried to use v4l2loopback as a workaround. The idea is to pipe the output of the CSI camera in a video device that I will mount in my container.

I compiled it the same way I compiled my kernel, and I load it on the AGX with modprobe:

~$ sudo modprobe v4l2loopback devices=2

FYI:
/dev/video0 is my CSI camera
/dev/video1 and /dev/video2 are v4l2loopback devices

So far, everything seems ok in dmesg:

[   60.592776] v4l2loopback: loading out-of-tree module taints kernel.
[   60.596520] v4l2loopback driver version 0.12.2 loaded

I have run a few tests with videotestsrc, and they work fine:

On the host :

$~ gst-launch-1.0 -v videotestsrc pattern=0 \
! videoconvert \
! "video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720" \
! v4l2sink device=/dev/video1

Then, inside the container, I read the stream using a first pipeline:

v4l2src device=/dev/video1 ! videoconvert ! video/x-raw, width=(int)1280, height=(int)720, format=(string)RGBx, framerate=(fraction)30/1 ! videoconvert ! appsink

… then I do some operations and stream out of the container using another pipeline:

appsrc ! videoconvert ! video/x-raw, format=(string)RGB ! videoconvert ! v4l2sink device=/dev/video2

And display it on the host.

Once again, so far, so good.
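The in-container read/process/write loop above can be sketched with OpenCV's GStreamer backend. This is only a sketch: the helper names are mine, I ask appsink for BGR (OpenCV's working format) instead of the RGBx used above, and the `cv2.flip` stands in for the real image processing:

```python
def build_capture_pipeline(device="/dev/video1", width=1280, height=720, fps=30):
    """GStreamer pipeline reading a v4l2loopback device into appsink."""
    return (
        "v4l2src device={dev} ! videoconvert "
        "! video/x-raw, width=(int){w}, height=(int){h}, "
        "format=(string)BGR, framerate=(fraction){fps}/1 "
        "! appsink"
    ).format(dev=device, w=width, h=height, fps=fps)


def build_output_pipeline(device="/dev/video2"):
    """GStreamer pipeline writing processed frames back out via v4l2sink."""
    return ("appsrc ! videoconvert ! video/x-raw, format=(string)RGB "
            "! videoconvert ! v4l2sink device={dev}").format(dev=device)


def main():
    import cv2  # imported here so the pipeline builders stay importable without OpenCV

    cap = cv2.VideoCapture(build_capture_pipeline(), cv2.CAP_GSTREAMER)
    # fourcc=0 lets the GStreamer backend take the pipeline string as-is
    out = cv2.VideoWriter(build_output_pipeline(), 0, 30.0, (1280, 720))
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # stand-in for the real processing step
        out.write(frame)
    cap.release()
    out.release()


if __name__ == "__main__":
    main()
```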

But when I try to stream the CSI camera into a v4l2loopback device, something goes wrong:

$~ gst-launch-1.0 -v nvarguscamerasrc sensor-id=0 \
! nvvidconv \
! videoconvert \
! v4l2sink device=/dev/video1


Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 2592 x 1944 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 16.000000; Exposure Range min 34000, max 550385000;

GST_ARGUS: 2592 x 1458 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 16.000000; Exposure Range min 34000, max 550385000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 16.000000; Exposure Range min 22000, max 358733000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 1
   Output Stream W = 2592 H = 1458
   seconds to Run    = 0
   Frame Rate = 29.999999
GST_ARGUS: PowerService: requested_clock_Hz=27216000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.477949113
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
GST_ARGUS: Cleaning up
GST_ARGUS:
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success
Setting pipeline to NULL ...
Freeing pipeline ...

I have also tried adding some capabilities, like:

gst-launch-1.0 -v nvarguscamerasrc sensor-id=0 \
! 'video/x-raw(memory:NVMM),width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1' \
! nvvidconv \
! 'video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)<FORMAT>' \
! videoconvert \
! "video/x-raw,format=<FORMAT>,width=(int)1280, height=(int)720" \
! v4l2sink device=/dev/video1

With <FORMAT> set to NV12, I420 or BGRx.
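One way to narrow the not-negotiated error down (a guess, not a verified fix) is to compare what each side can actually negotiate: nvvidconv only converts between NVMM and system memory with specific formats, and v4l2loopback only advertises the pixel formats it supports. The comparison can be done with standard tools:

```shell
# Which output formats does the loopback node accept?
v4l2-ctl -d /dev/video1 --list-formats-out

# Which caps do the elements in the failing pipeline expose?
gst-inspect-1.0 nvvidconv | sed -n '/Pad Templates/,$p'
gst-inspect-1.0 v4l2sink  | sed -n '/Pad Templates/,$p'
```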

I have looked at several threads dealing with v4l2loopback. In most (if not all =/…) of them, the L4T version is older than 32.1… Here is a non-exhaustive list of other threads dealing with v4l2loopback:
https://devtalk.nvidia.com/default/topic/1055122/jetson-tx2/duplicating-a-camera-device/
https://github.com/umlaeute/v4l2loopback/issues/195
https://devtalk.nvidia.com/default/topic/1043979/jetson-agx-xavier/building-v4l2loopback-and-getting-the-loop-from-csi-to-dev-video/post/5296316/#5296316

To conclude, from what I understand:

  • With the first method:

The issue may be due to nvargus-daemon.
If that is the case, do you think I can use v4l2-ctl to read the camera stream instead?

  • With the 2nd method:

It’s probably a capabilities conflict between nvarguscamerasrc and v4l2sink. However, I was not able to find any clue on how to solve it.

PS: I did not include all the logs I have because this post is already way too long ^^”, but I can provide more if necessary.

hello wilson.a,

FYI, we don’t officially support docker.

however, your messages show that this is caused by an EGL display failure.
you may narrow down the issue by launching the camera without EGL; as an alternative, please access it with the v4l2 standard controls.
for example,

$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100

suggest you also check similar issue, Topic 1042434 for Argus working with docker.
thanks

I am able to run your example inside the container. However, I am not sure I understand how I can use it.

Can you give me a few more details?

I was hoping to use OpenCV with the v4l2 backend, but I have found little documentation about it. And I would like to avoid writing v4l2 capture code in C. (I agree, this is more a question for the OpenCV forum…)

I have seen this option:

$ v4l2-ctl --help-all
...
-e, --out-device=<dev> use device <dev> for output streams instead of the
                     default device as set with --device
...

I have tried to use it, but it is actually not that straightforward for me.
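In case it helps others, v4l2-ctl can also dump frames to a file without any extra code, by adding --stream-to to the command suggested above. Caveat (my understanding, not verified on 32.1): bypassing the ISP this way yields raw Bayer RG10 frames, which still need debayering before OpenCV can use them:

```shell
# Capture 10 raw Bayer frames from the sensor (ISP bypassed) into a file.
v4l2-ctl -d /dev/video0 \
  --set-fmt-video=width=1920,height=1080,pixelformat=RG10 \
  --set-ctrl bypass_mode=0 \
  --stream-mmap --stream-count=10 --stream-to=frames.raw
```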

Thanks, I have taken a look at it. He is using a previous version of the Tegra libs (28.2). I don’t know if anything was “broken” in the latest Tegra release (32.1). Anyway, I will double-check the symlinks and try to compile the Argus samples.

Did you figure out how to use OpenCV to read v4l2 video? I use a Leopard IMX274; argus_camera --device=0 works fine on the Xavier.

@Duanlinlin,

I have been a bit short on time lately, so I was not able to really dig into it, and I don’t really know when I will be able to work on it.

So far, I have had a look at the OpenCV source for the V4L2 backend. It seems possible to use it from C or C++, but I don’t know if there are any bindings for Python… I haven’t tried yet, though, so this is only a supposition.
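For what it's worth, the V4L2 backend is reachable from Python through the ordinary cv2.VideoCapture bindings by passing an apiPreference. A sketch, untested on Jetson; note that a CSI node behind tegra-video delivers raw Bayer, so this is mainly useful for UVC/USB cameras:

```python
import cv2

# Open device index 0, forcing the V4L2 backend instead of autodetection.
# On builds without the two-argument overload, the documented fallback is
# cv2.VideoCapture(cv2.CAP_V4L2 + 0), i.e. backend constant + device index.
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
ok, frame = cap.read()
if ok:
    print(frame.shape)
cap.release()
```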

On the other hand, I was successfully able to use a USB camera (Logitech C922 Pro Stream Webcam) within my container, with OpenCV through a GStreamer backend using a v4l2src.

(Ok, that’s confusing for me too, so here is a snippet of the Python code I run in my container ^^")

import cv2

gst_config = ("v4l2src device=/dev/video0 "
              "! video/x-raw, framerate=(fraction)30/1, height=(int)480, width=(int)640 "
              "! videoconvert "
              "! appsink ")
video_capture = cv2.VideoCapture()
video_capture.open(gst_config, cv2.CAP_GSTREAMER)
