Issues downloading and running docker for "Getting Started with AI" DLI course

Hi all,

I’m having trouble running the Docker container for the “Getting Started with AI on Jetson Nano (V2)” course. I’m following the instructions, and so far everything works until I try to run the container.

In my case, I want to run it with the CSI camera, so this is the script I run:

$ sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--volume /tmp/argus_socket:/tmp/argus_socket \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:latest

I get an error saying:

docker container run requires at least 1 argument

Can someone please help advise what I’m doing wrong?

Thanks!

I tried again and it worked with the following tag:

v2.0.0-r32.4.3

This is instead of the tag given in the instructions for the
DLI Getting Started with AI on Jetson Nano course, which say to use:

latest

What’s confusing is that the course points you to the container registry website for the tag you should use, but the correct tags are actually the ones found in the course’s own examples, not on the Docker site.
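
For what it’s worth, the container tag seems to need to match the L4T version running on the Nano. One way to check your version (a quick sketch, assuming a stock JetPack image) is:

# On the Jetson host: prints the L4T release, e.g. "# R32 (release), REVISION: 4.3"
cat /etc/nv_tegra_release

An R32.4.3 system would correspond to the v2.0.0-r32.4.3 tag above.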

Hi,

Sorry about this.

We can reproduce this in our environment as well.
We will forward this issue to our internal team.

Thanks a lot for reporting this.

Thanks for reporting this.
We are working on a documentation fix for this issue.

The instructions have been updated.


Hi - I had all sorts of problems getting my Pi Camera v2.1 to work (noob here).

I used an amalgamation of icornejo.a’s code and the v2.0.0-r32.4.3 tag. I believe there should be a backslash after the --volume /tmp/argus_socket:/tmp/argus_socket line (without it, the shell stops reading the command there and docker run never receives the image name, hence the “requires at least 1 argument” error), so the code that worked for me was:

sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--volume /tmp/argus_socket:/tmp/argus_socket \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3

Thought I’d copy it in full for anyone else trundling by in my predicament. Thanks!


Thanks man!

I just tried it and it worked. Honestly, I gave up after the CSI issue as well. I was pulling out my hair trying to figure out what was wrong with connecting, and then being disappointed that the camera wasn’t working inside the container was deflating.

I just tried it again with your code, and it works!

Thanks man, I think this is really going to help those of us using the CSI cam.

Cheers!


Sorry about that - we have since updated the DLI course documentation with the info about using these camera arguments to launch the DLI container.


dusty_nv, stinkeycar, and icornejo.a:

Can you help a noob? Thank you for the clarification. I tried your --volume addition after many days of running:

from jetcam.usb_camera import USBCamera

#TODO change capture_device if incorrect for your system
camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=0)
Error messages are:

RuntimeError                              Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/jetcam-0.0.0-py3.6.egg/jetcam/usb_camera.py in __init__(self, *args, **kwargs)
     23         if not re:
---> 24             raise RuntimeError('Could not read image from camera.')
     25

RuntimeError: Could not read image from camera.

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
<ipython-input> in <module>
      2
      3 #TODO change capture_device if incorrect for your system
----> 4 camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=0)

/usr/local/lib/python3.6/dist-packages/jetcam-0.0.0-py3.6.egg/jetcam/usb_camera.py in __init__(self, *args, **kwargs)
     26         except:
     27             raise RuntimeError(
---> 28                 'Could not initialize camera.  Please see error trace.')
     29
     30         atexit.register(self.cap.release)

RuntimeError: Could not initialize camera. Please see error trace.

Thanks

Hi @vludwig, are you trying to use a USB camera or a MIPI CSI camera?

The --volume /tmp/argus_socket:/tmp/argus_socket argument is needed for CSI camera only.

If you are trying to use a USB camera, did you launch the container with --device /dev/video0?
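
For a USB-only setup, the launch command would look something like this (the same command from earlier in the thread, minus the argus_socket mount, which is only needed for CSI):

sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3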

dusty_nv, thanks so much for the quick reply.

  • I have both a CSI and a USB camera hooked up, and have been trying to go with whichever one runs and produces an image.
  • Yes, I did run --device /dev/video0 \ ; the scripts are as follows:

For the LI 136FOV (CSI) camera, I ran the following in my Mac terminal through ssh vern@192.168.55.1:

sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--device /dev/video0 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3

allow 10 sec for JupyterLab to start @ http://10.0.0.135:8888 (password dlinano)
JupterLab logging location: /var/log/jupyter.log (inside the container)

Then I launched JupyterLab at 192.168.55.1:8888 with the dlinano password and ran:

!ls -ltrh /dev/video*

crw-rw---- 1 root video 81, 0 Oct 21 22:56 /dev/video0

Then, also in Jupyter, I ran:

from jetcam.usb_camera import USBCamera

#TODO change capture_device if incorrect for your system
camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=0)

RuntimeError                              Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/jetcam-0.0.0-py3.6.egg/jetcam/usb_camera.py in __init__(self, *args, **kwargs)
     23         if not re:
---> 24             raise RuntimeError('Could not read image from camera.')
     25

RuntimeError: Could not read image from camera.

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
<ipython-input> in <module>
      2
      3 #TODO change capture_device if incorrect for your system
----> 4 camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=0)

/usr/local/lib/python3.6/dist-packages/jetcam-0.0.0-py3.6.egg/jetcam/usb_camera.py in __init__(self, *args, **kwargs)
     26         except:
     27             raise RuntimeError(
---> 28                 'Could not initialize camera.  Please see error trace.')
     29
     30         atexit.register(self.cap.release)

RuntimeError: Could not initialize camera. Please see error trace.

[Note: after Kernel Shut Down, in a separate/new session, I ran the “argus_socket” string in csi_camera.ipynb. Essentially the same error message as above (for csi_camera.py).]

Thanks.
@vludwig

Ah, okay - if you have both CSI and USB camera hooked up, launch your container with --device /dev/video1 also.

Then change this line in the notebook to reflect capture_device=1:

camera = USBCamera(width=224, height=224, capture_width=640, capture_height=480, capture_device=1)

dusty_nv:

I’m not sure how to launch my Docker container with --device /dev/video1.
When I run the query line !ls -ltrh /dev/video*
the response I get is: crw-rw---- 1 root video 81, 0 Oct 21 22:56 /dev/video0

How do I change my Docker container line in my notebook to --device /dev/video1?

Thanks.

Vern

You would add --device /dev/video1 to the docker run command that you use when launching the container:

sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--volume /tmp/argus_socket:/tmp/argus_socket \
--device /dev/video0 \
--device /dev/video1 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3

This command is used to launch the container; it is not run in one of the notebooks.

When you run the query line inside of the notebook, you should then see /dev/video1 show up also. Then remember to change capture_device=1 in the notebook cell.
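
Once both devices are mapped, the query inside the notebook should show both nodes, something like this (dates and minor numbers will differ):

!ls -ltrh /dev/video*
crw-rw---- 1 root video 81, 0 ... /dev/video0
crw-rw---- 1 root video 81, 4 ... /dev/video1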


dusty_nv,

I added the line --device /dev/video1 \ to the docker run string, but video1 does not appear.
Terminal session after ssh login:

Last login: Thu Oct 22 16:35:17 2020 from 192.168.55.100
vern@vern4:~$ sudo docker run --runtime nvidia -it --rm --network host --volume ~/nvdli-data:/nvdli-nano/data --device /dev/video1 nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3
[sudo] password for vern:
allow 10 sec for JupyterLab to start @ http://10.0.0.135:8888 (password dlinano)
JupterLab logging location: /var/log/jupyter.log (inside the container)
root@vern4:/nvdli-nano#

Then, in Jupyter, at the query !ls -ltrh /dev/video*
the response is still: crw-rw---- 1 root video 81, 0 Oct 22 02:50 /dev/video0

Thanks so much for tracking this.

@vludwig

Ok, hmm - if you run ls /dev/video* from the host device (outside of the container), does it show /dev/video1? What devices are listed?

Have you been able to confirm that you can view your USB webcam outside of the container, from an application like Cheese or similar? What USB webcam do you have?
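
If you’re running headless over ssh, a command-line check works too. A quick sketch, assuming the v4l-utils package is installed on the Jetson (sudo apt install v4l-utils if not):

# Lists each V4L2 capture device and which /dev/video* node it maps to
v4l2-ctl --list-devices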

Dusty_nv,

A progress report.

I am currently using a Mac as the host (not a PC, so no Cheese so far).
As expected, outside the Docker container, in a host terminal session without a secure shell into the Jetson, /dev/video0 and /dev/video1 do not show up on the host. Within the secure shell, both appear. See the terminal session:

Last login: Fri Oct 23 13:37:07 on ttys000
(base) Verns-iMac-9:~ vernludwig$ ls /dev/video*
ls: /dev/video*: No such file or directory
(base) Verns-iMac-9:~ vernludwig$ ssh vern@192.168.55.1
vern@192.168.55.1's password:
Welcome to Ubuntu 18.04.4 LTS (GNU/Linux 4.9.140-tegra aarch64)

This system has been minimized by removing packages and content that are
not required on a system that users do not log into.

To restore this content, you can run the 'unminimize' command.

286 packages can be updated.
194 updates are security updates.

Last login: Fri Oct 23 13:37:40 2020 from 192.168.55.100
vern@vern4:~$ ls /dev/video*
/dev/video0  /dev/video1

Also, my USB camera is a Logitech C910 (Carl Zeiss Tessar lens). When it is plugged directly into my Mac’s USB 3.1 bus, I can see it listed in my system report and view its image in FaceTime.

When the USB camera is plugged into the Jetson, the “Linux for Tegra” device is listed on the USB 3.1 bus. However, I cannot see or gain access to this Logitech USB camera’s image in FaceTime while in ssh mode.

@vludwig

OK, can you try running this command from the Jetson to launch the container again?
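
(This is the full command from the earlier post, with both --device flags:)

sudo docker run --runtime nvidia -it --rm --network host \
--volume ~/nvdli-data:/nvdli-nano/data \
--volume /tmp/argus_socket:/tmp/argus_socket \
--device /dev/video0 \
--device /dev/video1 \
nvcr.io/nvidia/dli/dli-nano-ai:v2.0.0-r32.4.3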

Then once the container is launched, run ls /dev/video* from the container - outside of JupyterLab. Just run it from the container’s shell.

I wonder if something was amiss when you tried this before, because you ran it with --device /dev/video1 only, yet /dev/video0 showed up in the container even though you didn’t map it… so it is probably worth trying again. Make sure your container has exited first before relaunching with the command above.

Both video cameras are showing up now.

In [1]:

crw-rw---- 1 root video 81, 4 Oct 23 20:44 /dev/video1
crw-rw---- 1 root video 81, 0 Oct 23 20:44 /dev/video0

However, I don’t understand the last line of your response, “Make sure your container is exited first before relaunching with the command above.” Does that exiting result from closing the terminal ssh session before executing the command (below) in Jupyter? Or is it done by Shut Down Kernel?


dusty_nv,

I have found a specific problem: my from string/command cannot access USBCamera, and points at the location /var/mail/…

So, at last login Sun Oct 25 15:24:06 2020, both video0 and video1 show in the list:
$ !ls -ltrh /dev/video*

crw-rw---- 1 root video 81, 4 Oct 25 19:31 /dev/video1
crw-rw---- 1 root video 81, 0 Oct 25 19:31 /dev/video0

When I then run:

$ from jetcam.usb_camera import USBCamera

When I search the file /var/mail, there is no content in that file, and there is no jetcam.usb_camera in /var/lib, so there are no files at the referenced location.
I’ve looked through the other /var subdirectories and cannot find jetcam.usb_camera or USBCamera.

Where is the USBCamera file/script? And how do I access it?
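
One way to locate an installed Python module from inside the container (a sketch; this assumes jetcam is installed as a Python package, which the tracebacks above suggest) is to ask Python itself rather than searching /var:

# Prints the path of the installed module, e.g. somewhere under dist-packages
python3 -c "import jetcam.usb_camera; print(jetcam.usb_camera.__file__)"

Note that from jetcam.usb_camera import USBCamera is a Python statement, so it must be run inside Python or a notebook cell, not at the $ shell prompt; at a shell prompt, from is an unrelated mail utility that reads /var/mail, which would explain the /var/mail reference.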
