cannot stream 2 USB-cameras: No space left on device (28)

I'm trying to stream 2 identical USB cameras.

Any single camera works fine, but never both.

Even at the lowest resolution, 640x480, the 2nd camera always fails.

When I start the 2nd one, I get this error:

./camera_v4l2_cuda -d /dev/video0
ERROR: start_stream(): (line:476) Failed to start streaming: No space left on device (28)

The OpenCV camera sample also works with one camera only, but whenever I start the 2nd one I get basically the same error:

VIDIOC_STREAMON: No space left on device

jetson_clocks is on

Is there any way to use 2 cameras on a Jetson Nano?


Hi,
We don’t observe the issue with 4K USB Camera board | 13MP 4K board camera
Seems specific to your cameras. You may try other USB cameras.

What do you see from “df -H /”? If the file system is full, then there may be some other content you can remove.

What V4L nodes are your cameras on? You would launch the first camera on /dev/video0, but the second one on /dev/video1… You may tell us more.

the cameras are “USB3.0 Sony IMX291 Camera” from ELP

I’ve asked ELP if they can fix them.

Hi,
I also have this issue.
I've connected 4 USB cameras and used Python & OpenCV to read frames. I manage to get frames from the first two cameras, but for the last two I get “VIDIOC_STREAMON: No space left on device”.
My cameras are also from ELP. However, when I connect these cameras to my PC, I can read frames from all 4 cameras with no problem.

What do I do?

Thanks,
Boaz

See #3.

Definitely not a disk-full problem; df shows I'm using ~40% of the disk.
In https://devblogs.nvidia.com/jetson-nano-ai-computing/ NVIDIA showed they processed 8 HD streams simultaneously.
I've tried following their installation instructions, but no luck so far.
It may be because in their system layout figure they have a HW component called “PCIe to GbE x8 switch and PoE injector”, which I'm guessing allows the Jetson to receive 8 HD streams and their system to work.
Does anyone know where to buy this component?

Thanks,
Boaz

One more disk space test, due to this:

ERROR: start_stream(): (line:476) Failed to start streaming: No space left on device (28)

Check if inodes are full (file system could have spare space but all nodes used if there are a lot of small files). What do you see from:

df -H -i

Hi,
Thanks for your help.

df -H -i gives 10% usage on /dev/mmcblk0p1 and the rest shows 1% usage
df -H gives 45% on /dev/mmcblk0p1 and the rest between 0% and 4%

This makes me wonder why it complained about “no space left on device”. Perhaps it uses a ramdisk and didn't have enough RAM, I don't know.

“No space left on device” in this case refers to USB bandwidth. The Jetson Nano has one built-in USB 3.0 hub. If you are trying to use two USB 3.0 cameras at high resolution, then there may not be enough bandwidth available to accommodate multiple full streams.

As I recall, depending on the camera initialization code, the USB drivers will try to reserve the full bandwidth of the highest-resolution stream from any given camera. For a high-resolution camera, this can mean that it takes up more than half the available USB bandwidth. So it can be difficult to run 2 USB 3 high-res cameras at once at full tilt. Sure, there are multiple USB connectors on the Nano, but they all go through the same hub bottleneck.

You may be able to get it to work by specifying lower resolutions on the cameras, but it depends a lot on how things were implemented in the particulars.
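The arithmetic behind that is easy to sketch. Here is a back-of-the-envelope calculation, assuming uncompressed YUYV at 2 bytes per pixel and a nominal ~400 MB/s of usable USB 3.0 payload bandwidth (both figures are assumptions, not measured values):

```python
# Back-of-the-envelope USB bandwidth check for uncompressed YUYV video.
# Assumptions: YUYV (YUV 4:2:2) = 2 bytes/pixel; usable USB 3.0 payload
# ~400 MB/s (5 Gbit/s line rate minus encoding and protocol overhead).

BYTES_PER_PIXEL = 2          # YUYV
USB3_BUDGET_MB_S = 400       # rough usable payload, an assumption

def yuyv_mb_per_s(width, height, fps):
    """Raw YUYV stream bandwidth in MB/s (1 MB = 10**6 bytes)."""
    return width * height * BYTES_PER_PIXEL * fps / 1e6

for w, h, fps in [(3840, 2160, 30), (1920, 1080, 30), (1280, 720, 30), (640, 480, 15)]:
    mb = yuyv_mb_per_s(w, h, fps)
    print(f"{w}x{h}@{fps}: {mb:6.1f} MB/s -> {int(USB3_BUDGET_MB_S // mb)} stream(s) fit")
```

Note the first row: a single raw 4K@30 stream (~498 MB/s) already exceeds the whole budget, so if the driver reserves for the camera's maximum mode, even two low-resolution streams can be refused.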

A PC usually has more USB bandwidth available, as it has the room and power to be more generous in the USB department.

That makes sense. I’ll consider it just a bad error message, but definitely there are a lot of cameras out there (e.g., Zed stereo) which can’t even handle something like a mouse on the same root_hub. I like the idea of trying at lower resolutions just to see if the error goes away.

Thanks Kangalow, I guess the Nano isn't the right HW for my problem. Next time I'll read more carefully posts that sound too good to be true (e.g., using a Nano, ~$100 of HW, to sample 8 HD streams). Live and learn :)

Not sure I understand this comment. The cameras you are using appear to be 4K (equivalent to ~4 HD streams), which means that they try to reserve the equivalent of 8 HD streams through one USB hub. That saturates the bandwidth of a USB 3.0 hub, and then some.

Typically if one were trying to do this, there would be a different strategy. One approach is to use a custom carrier board, which would solve the issue.

Another approach is to use the current carrier board. Use the Ethernet port for IP cameras (possibly in addition to using USB camera(s) ). One might even consider additionally using the M2 Key E slot (with an appropriate adapter) to add another way for directing camera streams into the Jetson Module.

While it's true that there is no immediately obvious way to get 8 HD camera streams into the Nano by just plugging cameras in, as a developer that shouldn't be surprising. It also shouldn't be much of a surprise that it might take work to trick a 4K camera into acting like an HD camera. The less powerful the hardware, generally the more you need to know in order to make it work and be performant.

Hi,
Please check this post:
https://devtalk.nvidia.com/default/topic/1067850/deepstream-sdk/connected-more-than-two-usb-cameras-problem-on-deepstream-app-jetson-nano-dev-kit-/post/5410823/#5410823
Please use USB cameras that can be enumerated in SuperSpeed.
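A quick way to check whether a camera enumerated in SuperSpeed is `lsusb -t`, which prints the negotiated speed per device (480M = High Speed / USB 2.0, 5000M = SuperSpeed / USB 3.0). A small sketch that pulls the video devices out of that output; the sample text below is illustrative, not captured from a real Nano:

```python
import re

# Parse `lsusb -t` output and report each UVC video interface's negotiated
# USB speed. 480M = High Speed (USB 2.0), 5000M = SuperSpeed (USB 3.0).
# SAMPLE is illustrative output, not captured from real hardware.

SAMPLE = """\
/:  Bus 02.Port 1: Dev 1, Class=root_hub, Driver=tegra-xusb/4p, 5000M
    |__ Port 1: Dev 2, If 0, Class=Video, Driver=uvcvideo, 5000M
/:  Bus 01.Port 1: Dev 1, Class=root_hub, Driver=tegra-xusb/5p, 480M
    |__ Port 2: Dev 3, If 0, Class=Video, Driver=uvcvideo, 480M
"""

def video_device_speeds(lsusb_t_output):
    """Return the negotiated speed string for every Video-class interface."""
    speeds = []
    for line in lsusb_t_output.splitlines():
        m = re.search(r"Class=Video.*?,\s*(\d+M)$", line)
        if m:
            speeds.append(m.group(1))
    return speeds

print(video_device_speeds(SAMPLE))
```

On a real system you would feed it the actual output, e.g. `video_device_speeds(subprocess.run(["lsusb", "-t"], capture_output=True, text=True).stdout)`. A camera that shows 480M here did not enumerate in SuperSpeed.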

there is a combination of 3 issues causing the error

I can run single USB3 ELP camera on Nano:
YUYV 1920 x 1080 @30 fps = 125 MB/s

but never 2 cameras, even with the lowest bandwidth:
YUYV 640 x 480 @15 fps = 10 MB/s

-) ELP USB3 camera claims unnecessary USB bandwidth, hence the error
I’ve contacted ELP but they never replied.

-) the second problem is Linux: it blindly trusts whatever the camera tells it.
On my laptop I can boot either Windows 10 or Ubuntu 18.
Linux fails miserably even at the lowest resolution.
Windows just gets the job done: two cameras run smoothly at the highest resolution/fps.
Exactly the same cameras, exactly the same USB3 port, exactly the same hardware.

there is UVC_QUIRK_FIX_BANDWIDTH, but it doesn’t work.

-) the root cause of the error is actually the unfortunate dev kit design. I would use MIPI CSI cameras, without the USB overhead, and the silicon would handle it, but there is just one CSI connector on my board. Off the shelf there are USB3 or Ethernet cameras, and I'm forced to choose between bad and worse.

Hi,
FYR, we have partners offering carrier boards for Jetson Nano.
https://devtalk.nvidia.com/default/topic/1051385/jetson-nano/jetson-nano-dev-board-availability-and-lifecycle/post/5336301/#5336301

Hi,
Thank you all for your answers!

maxthedread: Do you think that if I try out different USB cameras, I'll find one whose FPS I can set to ~1 with MJPG compression at 1920x1080? In that case, do you think I'll be able to open 8 streams?
Will SuperSpeed cameras help in this case? Will Linux be OK as an OS? (Windows is not an option.)

About Ethernet cameras: is there a cheap way to acquire 8 photos almost simultaneously and then repeat the process 1-2 seconds later? By cheap I mean extra HW (besides the cameras) under $100.

About MIPI cameras: if what I'm trying to do is take 8 photos almost simultaneously, are they an option for me? From what I understand so far, they work serially, so I'll have to take 8 photos one by one. How long should that take?

Sorry for not being clear about my use case; I'm a newbie to HW. It's not the standard video sampling use case: what I actually need is an FPS of ~1, but for 8 cameras simultaneously.

Thanks again,
Boaz

DaneLLL: thank you, I'll check it.

boaz.petersil: sorry, I don’t know.

In your scenario I'd naturally switch to still images rather than video. Then I'd have full control over when to trigger 8 snapshots from software, more choices of frame size, better image quality, and probably additional photo options. When and in which sequence the 8 pictures land on the chip, as well as latency and bandwidth, would then not be so important.
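That round-robin snapshot idea can be sketched without keeping any stream open, so the bandwidth reservation problem never arises. The `open_camera` factory below is an assumption: with OpenCV it could be `cv2.VideoCapture`; here a stub stands in so the structure is clear without hardware.

```python
import time

# Round-robin snapshot capture: open each camera, grab one frame, release it,
# then move on. Only one stream is active at a time, so the per-camera USB
# bandwidth reservation never stacks up. `open_camera` is an injectable
# factory; in a real setup it could be cv2.VideoCapture (an assumption;
# any object with read()/release() works).

def snapshot_round(device_ids, open_camera):
    """Grab one frame from each device in turn; return {device_id: frame}."""
    frames = {}
    for dev in device_ids:
        cap = open_camera(dev)
        try:
            ok, frame = cap.read()
            if ok:
                frames[dev] = frame
        finally:
            cap.release()
    return frames

def capture_loop(device_ids, open_camera, period_s=2.0, rounds=1):
    """Repeat a snapshot round every `period_s` seconds."""
    results = []
    for _ in range(rounds):
        start = time.monotonic()
        results.append(snapshot_round(device_ids, open_camera))
        time.sleep(max(0.0, period_s - (time.monotonic() - start)))
    return results

# Minimal stub standing in for a real capture object, for illustration only.
class FakeCapture:
    def __init__(self, dev):
        self.dev = dev
    def read(self):
        return True, f"frame-from-{self.dev}"
    def release(self):
        pass

print(capture_loop(range(8), FakeCapture, period_s=0.0))
```

With real hardware, opening a UVC device and grabbing the first frame can each take a noticeable fraction of a second, so whether 8 cameras fit in a 1-2 second cycle would need to be measured on the actual setup.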

But I’m not quite sure if Linux understands still images.