GStreamer Pipeline works on Desktop but not on Nano


I am trying to build a set of GStreamer pipelines for resizing a camera's video feed into several different resolutions.
I use the v4l2loopback module to create 'virtualised' cameras. The idea is:

With a physical (usb) camera (PC1) we can:
Create several virtual cameras (VC1, VC2, VC3)
Forward the video feed from the physical camera to the first virtual one (VC1)
Have two processes reading from VC1 that resize the video:

  • 1280x720 and write to VC2
  • 480x360 and write to VC3

I have created the pipeline and verified that it works on a Desktop machine (i.e. x86), but the same pipeline gives errors when running on the Jetson Nano (as well as on an RPi3 or a Coral Dev Board).

Here is the set of commands I use:

  1. Run 3 virtual cameras

    sudo modprobe v4l2loopback devices=3 max_buffers=2

  2. Forward Physical camera feed to first virtual camera (VC1)

    gst-launch-1.0 v4l2src device=/dev/video0 ! v4l2sink device=/dev/video1

  3. Run 2 Gstreamer apps to resize the feed from VC1, forwarding to VC2 and VC3

    gst-launch-1.0 v4l2src device="/dev/video1" ! videoscale ! "video/x-raw, width=1280, height=720, format=(string)YUY2" ! v4l2sink device=/dev/video2
    gst-launch-1.0 v4l2src device="/dev/video1" ! videoscale ! "video/x-raw, width=480, height=360, format=(string)YUY2" ! v4l2sink device=/dev/video3
  4. Open VC2 and VC3 to verify they’re running correctly

    gst-launch-1.0 v4l2src device="/dev/video3" ! videoconvert ! ximagesink

In this scenario, everything works on a Desktop, but Step 3 fails when run on the Jetson Nano, with the following error:

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason error (-5)

As I am a novice when it comes to GStreamer, I am not sure how to approach solving the problem.
The GStreamer version is 1.14.5.
The JetPack version is 4.4.1, with L4T being 32.4.4.

I was wondering whether it is a software problem, perhaps the build of GStreamer, or maybe it's a hardware limitation of the ARM boards?

Any suggestions are appreciated

Please refer to
Jetson Nano FAQ
Q: I have a USB camera. How can I launch it on Jetson Nano?

In your pipeline, the framerate is not set. I suggest you set width, height, format, and framerate to match exactly one mode in the camera's capabilities.
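For example, you can first list the modes the camera actually advertises, then pin the caps to one of them. This is a sketch only: the 30/1 framerate below is an assumption, so substitute a rate your camera actually lists.

```shell
# List the modes the physical camera advertises (v4l2-ctl is part of v4l-utils):
v4l2-ctl -d /dev/video0 --list-formats-ext

# Resize pipeline with width, height, format AND framerate all pinned to one
# advertised mode. videorate converts the stream to the requested framerate.
gst-launch-1.0 v4l2src device=/dev/video1 ! videoscale ! videorate ! \
  "video/x-raw, width=1280, height=720, framerate=30/1, format=(string)YUY2" ! \
  v4l2sink device=/dev/video2
```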

You may try adding ! identity drop-allocation=true ! before v4l2sink in your pipeline.
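A sketch of the same step-3 pipeline with the identity element added. As I understand it, drop-allocation=true makes identity drop the allocation query, so upstream elements allocate their own buffers instead of using v4l2sink's buffer pool, which can avoid buffer-sharing issues with the loopback device:

```shell
gst-launch-1.0 v4l2src device=/dev/video1 ! videoscale ! \
  "video/x-raw, width=1280, height=720, format=(string)YUY2" ! \
  identity drop-allocation=true ! v4l2sink device=/dev/video2
```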

Setting the framerate did the trick! Now it is working, thank you!

Interestingly, different cameras behave differently with GStreamer. For example, the following works fine on a Logitech C920 and an ODroid camera:

  1. Forward physical camera to virtual sink 1
gst-launch-1.0 v4l2src device=/dev/video0 ! v4l2sink device=/dev/video1
  2. Resize feed from virtual sink 1, writing to virtual sink 2
gst-launch-1.0 v4l2src device="/dev/video1" ! videoscale ! videorate ! \
  "video/x-raw, width=480, height=360, framerate=10/1, format=(string)YUY2" ! \
  v4l2sink device=/dev/video2

Then, as expected, we can access both /dev/video1 at full resolution and /dev/video2 at 480x360.

However, with an A4Tech PK-910H camera, the first of these two commands doesn't work properly on the Jetson (it does work on a Desktop). Instead, I need to use a command like:

gst-launch-1.0 v4l2src device="/dev/video0" ! videoscale ! videorate ! \
  "video/x-raw, width=1280, height=720, framerate=25/1, format=(string)YUY2" ! \
  v4l2sink device=/dev/video1

In other words, I have to give the physical camera an explicit resolution and framerate straight away. This is not necessarily a problem, but I am interested to understand why it is happening.
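One way to investigate (a guess, not a confirmed explanation): the two cameras may simply advertise different formats and default modes, and on the Jetson the negotiated default may be one the loopback chain cannot handle unless the caps are pinned. Comparing what each camera reports should show the difference:

```shell
# Dump the formats, resolutions and frame intervals each camera advertises.
# Run once with the C920 connected and once with the PK-910H, then diff:
v4l2-ctl -d /dev/video0 --list-formats-ext

# Also check which mode v4l2 actually selected after running the pipeline:
v4l2-ctl -d /dev/video0 --get-fmt-video
```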