usb_camera.ipynb on Jetson Xavier NX in the DLI Getting Started with Jetson course: no video, just black-and-white shades

I'm doing the Deep Learning Institute Getting Started with Jetson course.

The course primarily uses the Jetson Nano 2GB with JetPack 4.5.1, but I'm using a Jetson Xavier NX to do it.

Here's the docker command I ran to get the course container running:
sudo docker run --runtime nvidia -it --rm --network host --volume ~/org/nvdli-data:/nvdli-nano/data --device /dev/video0 nvcr.io/nvidia/dli/dli-nano-ai:v2.0.1-r32.5.0
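(One sanity check worth noting for anyone reading along: you can confirm the device actually made it into the container before touching the notebook. These are generic docker commands, nothing course-specific, and the container ID below is just a placeholder:)

sudo docker ps                         # find the ID of the running dli-nano-ai container
sudo docker exec -it <container-id> /bin/bash
# inside the container the camera node should be present:
ls -l /dev/video0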

But in the usb_camera.ipynb notebook I only get black and white shades that change when I move the camera around, so it is seeing something, just not colour video.

Here's the code that DOES NOT work.

Running this code from the Jupyter notebook produces only black and white output: no video, just shade changes when I put something in front of the camera.

from jetcam.usb_camera import USBCamera
camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=0)
image = camera.read()
print(image.shape)
print(camera.value.shape)

import ipywidgets
from IPython.display import display
from jetcam.utils import bgr8_to_jpeg
image_widget = ipywidgets.Image(format='jpeg')
image_widget.value = bgr8_to_jpeg(image)
display(image_widget)
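As a side note, one diagnostic that could be run from the same container (not part of the course notebook) is to ask OpenCV which pixel format and resolution the capture device actually negotiates, since a raw YUYV or Bayer frame being treated as single-channel data could plausibly look like exactly this kind of monochrome output. A rough sketch:

# Diagnostic sketch only. Run this before creating the USBCamera (or after a
# kernel restart), since the device may already be held by jetcam otherwise.
import cv2

cap = cv2.VideoCapture(0)                       # same device index as capture_device=0
fourcc = int(cap.get(cv2.CAP_PROP_FOURCC))      # may be 0 depending on the backend
# decode the FOURCC integer into its four-character code, e.g. 'YUYV' or 'MJPG'
print("".join(chr((fourcc >> (8 * i)) & 0xFF) for i in range(4)))
print(cap.get(cv2.CAP_PROP_FRAME_WIDTH), cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

ret, frame = cap.read()
print(ret, frame.shape if ret else None)        # a colour BGR frame should be (H, W, 3)
cap.release()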

Here is code that DOES work:

import cv2
cam = cv2.VideoCapture(0)
ret, frame = cam.read()
cv2.imwrite("test.png", frame)
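To narrow down whether the problem is in jetcam's capture pipeline or in the widget display path, a variation I could try is to push the frame from this raw cv2 capture through the same bgr8_to_jpeg/ipywidgets code the notebook uses (sketch only, same imports as the notebook):

# Sketch: display a frame captured directly with cv2 through the notebook's
# widget path. If this shows correct colour, the display code is fine and the
# problem is somewhere in jetcam's capture pipeline.
import cv2
import ipywidgets
from IPython.display import display
from jetcam.utils import bgr8_to_jpeg

cam = cv2.VideoCapture(0)
ret, frame = cam.read()              # frame is BGR, the layout bgr8_to_jpeg expects
cam.release()

widget = ipywidgets.Image(format='jpeg')
widget.value = bgr8_to_jpeg(frame)
display(widget)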

Here is the page that led me to believe it's OK to buy a Jetson Xavier to do your course:

“*Any Jetson can be used to complete the course (except Jetson TK1).”

I tried both the Logitech C270 and the Logitech C920 Pro; both gave the same result.

I tried the Jetson Nano instead of the Xavier NX and it works fine, with the same JetPack 4.5.1 and the C270.

It's just the Xavier NX that has the problem in the Jupyter notebook.

Here is a listing of the video devices on the Jetson Nano 2GB:

ls -la -ltrh /dev/video*
crw-rw----+ 1 root video 81, 0 Apr 10 09:51 /dev/video0

On the Xavier NX it DOES NOT have the plus sign.

I read somewhere that the plus sign means there are ACLs on the file.

From the ls man page:

“If the file or directory has extended security information, the permissions field printed by the -l option is followed by a ‘+’ character.”

This generally means the file has access restrictions beyond the traditional Unix permissions, most likely an Access Control List (ACL).

So I ran getfacl on the Nano to see the ACLs:

getfacl /dev/video0
getfacl: Removing leading '/' from absolute path names
# file: dev/video0
# owner: root
# group: video
user::rw-
user:gdm:rw-
group::rw-
mask::rw-
other::---
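Out of curiosity, I think the equivalent ACL entry could be added by hand on the Xavier NX with the standard acl tools, something like this (I haven't verified that it fixes anything, just mirroring what the Nano shows):

sudo setfacl -m u:gdm:rw /dev/video0     # grant the gdm user rw, like on the Nano
getfacl /dev/video0                      # should now list user:gdm:rw- and ls -l should show '+'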

Reinstalling my Xavier to compare…

Not sure what I did, but it works now.

All I did was switch to a 64 GB EVO SD card, format it with SD Card Formatter in Windows, and then flash it with Etcher, of course with the Xavier NX image.

Then I ran:
sudo docker run --runtime nvidia -it --rm --network host --volume ~/org/nvdli-data:/nvdli-nano/data --device /dev/video0 nvcr.io/nvidia/dli/dli-nano-ai:v2.0.1-r32.5.0

And on the OS itself:

ls -la -ltrh /dev/video*
crw-rw----+ 1 root video 81, 0 Jan 28 2018 /dev/video0

The plus sign is magically there now.

I'll try to add NVIDIA EGX again (the Kubernetes stack) using the following directions:

sudo docker info | grep Runtime
sudo docker info | grep -i runtime

Runtimes: nvidia runc
Default Runtime: runc

sudo systemctl daemon-reload && sudo systemctl restart docker

Runtimes: nvidia runc
Default Runtime: nvidia

sudo systemctl start docker && sudo systemctl enable docker

sudo apt-get update && sudo apt-get install -y apt-transport-https curl
curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo mkdir -p /etc/apt/sources.list.d/

sudo vi /etc/apt/sources.list.d/kubernetes.list

deb https://apt.kubernetes.io/ kubernetes-xenial main

sudo apt-get update
sudo apt-get install -y -q kubelet=1.17.5-00 kubectl=1.17.5-00 kubeadm=1.17.5-00
sudo apt-mark hold kubelet kubeadm kubectl

sudo swapoff -a

sudo kubeadm init --pod-network-cidr=10.244.0.0/16
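(For completeness, a couple of generic post-init steps I'd normally run after kubeadm init; these come from kubeadm's own output rather than the DLI/EGX directions:)

mkdir -p $HOME/.kube
sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config

kubectl get nodes      # node stays NotReady until a pod-network add-on matching 10.244.0.0/16 is installed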

It seems that after I do:

sudo docker info | grep Runtime
sudo docker info | grep -i runtime

Runtimes: nvidia runc
Default Runtime: runc

sudo vi /etc/docker/daemon.json
and add "default-runtime": "nvidia":

{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}

sudo systemctl daemon-reload && sudo systemctl restart docker

Runtimes: nvidia runc
Default Runtime: nvidia

…then the video device comes back without the plus sign for the ACLs:

ls -la -ltrh /dev/video*
crw-rw----+ 1 root video 81, 0 Jan 28 2018 /dev/video0

But even though that happens, the video still works.

It seems to work now… I don't know what the difference is. Maybe it's because I had the camera plugged in when I first booted. Oh well.

Hi @juan.suero, glad that you were able to get it working; I'm not sure either what the issue was. In the future, if you continue having issues with that camera in the container, let us know. I use both the C920 and C270 on Xavier NX and Nano.