Emulate a USB camera from computer vision output both internally (v4l2loopback) and externally (as a USB slave device)

Hi,
The use case is a bit complicated and I'm not sure it can work as expected.

If you would like to get the data on a host PC, you can probably try UDP streaming. Please take a look at the sample:

It gets frame data from a local video file, accesses the frames in OpenCV, and streams them out through UDP. If using Ethernet is also an option in your use case, this may be a solution.

I'm not trying to get the data out. I'm trying to use the processed image stream as a USB camera on an Android device running certain applications.

@DaneLLL reminded me: network streaming won't look like a camera device, but if the software on the other computer can consume that streaming protocol, then you don't need to create a "virtual" camera. UDP streaming tends to be fairly low latency.

My issue is that I want to present this processed stream to an Android device and be able to run, for example, Zoom and have it pick up this stream as the camera.

Think of streaming services, e.g., videos from youtube.com. Perhaps RTSP? I don't know, but if you make this look like a camera, then you need a driver for the camera. If it is a standard USB class camera ("UVC"), then the driver is probably already there. In the case of a streaming protocol you don't even need a camera driver; you would simply stream the protocol, and the various CODECs would deal with it.

Creating a camera which runs in real time is a much more involved project than simply streaming. Like I said, there are workarounds to not using an isochronous mode camera, but it isn't ideal.

1. If you want to send OpenCV-processed frames to v4l2loopback (assuming here you have a v4l2loopback dev node /dev/video10), you would use a VideoWriter such as:

cv::VideoWriter gst_v4l2sink ("appsrc ! videoconvert ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video10", cv::CAP_GSTREAMER, 0, fps, cv::Size (width, height));
if (!gst_v4l2sink.isOpened ()) {
    std::cout << "Failed to open gst_v4l2sink writer." << std::endl;
    return (-8);
}

and in the loop push processed frames at the specified fps:

gst_v4l2sink.write(frame_out);
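
For illustration, a minimal self-contained sketch of that loop might look like this (the /dev/video0 capture source, the 640x480@30 geometry and the cv::flip "processing" step are placeholders I'm assuming for the example, not part of the original; adapt them to your own CV pipeline):

#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // Placeholder geometry and source device; adjust to your setup.
    const int width = 640, height = 480;
    const double fps = 30.0;

    cv::VideoCapture cap(0, cv::CAP_V4L2);              // e.g. /dev/video0
    cap.set(cv::CAP_PROP_FRAME_WIDTH, width);
    cap.set(cv::CAP_PROP_FRAME_HEIGHT, height);

    // Same GStreamer writer pipeline as above, feeding the v4l2loopback node.
    cv::VideoWriter gst_v4l2sink("appsrc ! videoconvert ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video10",
                                 cv::CAP_GSTREAMER, 0, fps, cv::Size(width, height));

    if (!cap.isOpened() || !gst_v4l2sink.isOpened()) {
        std::cout << "Failed to open capture or v4l2loopback writer." << std::endl;
        return -1;
    }

    cv::Mat frame_in, frame_out;
    while (cap.read(frame_in)) {
        cv::flip(frame_in, frame_out, 1);   // stand-in for your CV processing
        gst_v4l2sink.write(frame_out);      // push BGR frame to /dev/video10
    }
    return 0;
}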

2. For an Android host, you may better explain how you would connect to the Jetson. If connected through a UDP/IP stack, you may stream your output, encoded as H264 (or H265, VP9…), as RTP over UDP to a multicast address:

cv::VideoWriter gst_udpsink("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw, format=BGRx ! nvvidconv ! nvv4l2h264enc insert-vui=1 ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=224.1.1.1 port=5000 auto-multicast=true", cv::CAP_GSTREAMER, 0, fps, cv::Size (width, height));
if (!gst_udpsink.isOpened ()) {
    std::cout << "Failed to open gst_udpsink writer." << std::endl;
    return (-8);
}

and push in the loop with:

gst_udpsink.write(frame_in);

You would read on the remote host with an SDP file such as:

m=video 5000 RTP/AVP 96
c=IN IP4 224.1.1.1
a=rtpmap:96 H264/90000

and play it with, for example:

cvlc -v test.sdp 
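
If the receiving host has OpenCV built with GStreamer support, you could also open that multicast stream directly from code instead of using the SDP file. A sketch, assuming the same 224.1.1.1:5000 H264/RTP stream as above and that avdec_h264 (gstreamer1.0-libav) is available on the receiver:

#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // Receive the RTP/H264 multicast stream sent by the udpsink pipeline above.
    cv::VideoCapture cap(
        "udpsrc address=224.1.1.1 port=5000 auto-multicast=true "
        "! application/x-rtp, media=video, encoding-name=H264, payload=96 "
        "! rtph264depay ! h264parse ! avdec_h264 ! videoconvert "
        "! video/x-raw, format=BGR ! appsink drop=true",
        cv::CAP_GSTREAMER);
    if (!cap.isOpened()) {
        std::cout << "Failed to open UDP/RTP receiver pipeline." << std::endl;
        return -1;
    }
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::imshow("received", frame);
        if (cv::waitKey(1) == 27)   // ESC quits
            break;
    }
    return 0;
}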

3. Or you may try RTSP (this requires the package libgstrtspserver-1.0-dev and the test-launch example to be built). This would imply OpenCV sending to shmsink through GStreamer, and test-launch serving RTSP from shmsrc:
cv::VideoWriter to shmsink:

cv::VideoWriter h264_shmsink ("appsrc is-live=true ! queue ! videoconvert ! video/x-raw, format=RGBA ! nvvidconv ! omxh264enc insert-vui=true ! video/x-h264, stream-format=byte-stream ! h264parse ! shmsink socket-path=/tmp/my_h264_sock ", cv::CAP_GSTREAMER, 0, fps, cv::Size (width, height));
if (!h264_shmsink.isOpened ()) {
    std::cout << "Failed to open h264_shmsink writer." << std::endl;
    return (-2);
}

and push processed frames in the loop to h264_shmsink.

So now you would launch your RTSP server with:

./test-launch "shmsrc socket-path=/tmp/my_h264_sock do-timestamp=true ! video/x-h264, stream-format=byte-stream, width=640, height=480, framerate=30/1 ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay pt=96 name=pay0 "

Note that you would have to manage cleaning up the socket file, if any, when closing client and server.

and you should be able to receive on host:

cvlc -v rtsp://<jetson_IP>:8554/test 
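
If you would rather consume that RTSP stream from code on the receiving side instead of VLC, a sketch (replace <jetson_IP> with the Jetson's address; assumes an OpenCV build with FFmpeg support):

#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // Open the RTSP stream served by test-launch on the Jetson.
    cv::VideoCapture cap("rtsp://<jetson_IP>:8554/test", cv::CAP_FFMPEG);
    if (!cap.isOpened()) {
        std::cout << "Failed to open RTSP stream." << std::endl;
        return -1;
    }
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::imshow("rtsp", frame);
        if (cv::waitKey(1) == 27)   // ESC quits
            break;
    }
    return 0;
}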

The goal of the processing on the Jetson is to provide an enhanced camera feed for video calls, not just to stream the data. So I need to use Zoom or a similar already available application.

If you understand what I've posted, all 3 are OpenCV-processed outputs, so you may do whatever you want with the camera image before sending the resulting frame.
Someone else may better understand your expectations.

To clarify, if the Jetson can present as a USB camera on micro USB, Zoom or another video calling application can see it and use it. Are you suggesting receiving the stream on the Android device over the network and implementing the v4l2loopback device on the Android device?

It sounds like an Augmented Reality [AR] camera through Zoom? Or something else?
In my opinion, if you use the example above that outputs the processed stream via v4l2loopback to the emulated device /dev/video10 (from the original, e.g., /dev/video0), it becomes possible to select in Zoom or the Chromium browser between the camera devices /dev/video0 and /dev/video10, so you can stream either the original video or the processed [AR] video. @cloud9ine, is that what you are wishing to do?
Moreover, by default the Jetson will present any USB camera in a web browser as a video source for Zoom etc.
But I did not try with Android.
I remember there were some methods that I tried in the past that resulted in mounting video camera devices /dev/video* over the network as /dev/video* on some other device, but I am not sure what the performance was in such a scenario.
ref: Can I pipe /dev/video over ssh - Unix & Linux Stack Exchange
gstreamer - How to access a security camera and convert it to /dev/video? - Unix & Linux Stack Exchange

Not an answer, but something important to keep in mind: The “gadget” API is only a generic framework for “standard” USB devices not needing any extra functionality. This means that if the UVC class camera supports functions such as zoom, then this would work, but it also implies that the software is not performing zoom…this would be a function directly in the camera. I don’t know if a UVC camera has direct support for zoom, but I doubt it does…this is generally the realm of either (A) software zoom, or (B) a custom camera driver.

As soon as you add a function not found in the USB UVC driver you can no longer use the USB gadget API. As soon as you implement zoom and other functionality in software it means you can stream without special support. I honestly do not know if there is such a thing as zoom support in the gadget API (meaning at the camera level, not at the software viewing level).

If you have a camera connected to the Nano which has some sort of zoom, e.g., an optical zoom (versus software zoom), then I am guessing that zoom uses a separate serial UART or I2C protocol to control the lens (the camera could still be UVC, but the lens zoom would perhaps be separate software). If there is some sort of hardware function for zoom which is separate from the UVC driver, then it implies you would still need to create a custom driver for the zoom. That driver could be communicated with via the same USB cable (USB allows more than one device on a cable).

You may need to give some exact details of your camera hardware and how it works on Linux for local use. It is difficult to answer not knowing specific details of the camera itself.

Sorry, by zoom I meant the video conferencing app Zoom, not camera zoom.

Can you provide some details on the exact video conferencing app? Knowing more about the app’s capabilities (such as direct camera viewing or an ability to view streaming protocols) would help.

I'm looking for wide compatibility with multiple apps - mainly Zoom and Google Duo.

Do you have a URL for those apps which provides details about what the apps do and what they require?

Hi linuxdev, sorry for the delayed response here. Here are the Google Play Store links for all three apps:

I could possibly use these services from the browser or a Linux client on the Jetson, but I am trying to use Android to demonstrate the flexibility of my demo. To make this happen, I am trying to expose the processed stream as a USB camera to the Android device. I have confirmed that such a camera shows up in these apps.

In order to run Google Hangouts, web Skype, etc. on an NVIDIA Jetson you could use v4l2loopback: VizioChron - Nano

sudo su
# Build v4l2loopback against the Tegra kernel headers
cd /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9
mkdir v4l2loopback
git clone https://github.com/umlaeute/v4l2loopback.git v4l2loopback
cd v4l2loopback && git checkout v0.10.0
make
make install
apt-get install -y v4l2loopback-dkms v4l2loopback-utils
# Create one loopback device as /dev/video2 and make it load at boot
modprobe v4l2loopback devices=1 video_nr=2 exclusive_caps=1
echo options v4l2loopback devices=1 video_nr=2 exclusive_caps=1 > /etc/modprobe.d/v4l2loopback.conf
echo v4l2loopback > /etc/modules
update-initramfs -u

TERMINAL 1:

gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1' ! nvvidconv ! 'video/x-raw, width=640, height=480, format=I420, framerate=30/1' ! videoconvert ! identity drop-allocation=1 ! 'video/x-raw, width=640, height=480, format=RGB, framerate=30/1' ! v4l2sink device=/dev/video2

TERMINAL 2:

export DISPLAY=:0
chromium-browser
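
To check from code that the loopback node is actually delivering frames while the TERMINAL 1 pipeline runs, here is a quick sketch (assumes the /dev/video2 node created by the modprobe line above and an OpenCV build with V4L2 support):

#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // Read back a few frames from the v4l2loopback node (/dev/video2 from the
    // modprobe line above) while the TERMINAL 1 pipeline is feeding it.
    cv::VideoCapture cap("/dev/video2", cv::CAP_V4L2);
    if (!cap.isOpened()) {
        std::cout << "Failed to open /dev/video2." << std::endl;
        return -1;
    }
    cv::Mat frame;
    for (int i = 0; i < 100 && cap.read(frame); ++i) {
        std::cout << "frame " << i << ": " << frame.cols << "x" << frame.rows << std::endl;
    }
    return 0;
}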

Unfortunately those applications won't directly run on arm64 for Linux (the arm64 builds they have are for Android). So you do need something like what @Andrey1984 suggests.

An alternative is to create some sort of streaming service on the Jetson, and an adapter software on the host PC to make the stream appear to be a camera. Either way you’d be emulating a camera, but I think emulating a camera on the host PC has a better chance than emulating a camera on the Nano. The Nano should be quite good with something streaming, e.g., v4l2 or RTSP, but to emulate a camera over USB on a Jetson is going to have some issues. I really suggest looking at @Andrey1984’s comments.

Note that a Nano cannot emulate a device in isochronous mode. Isochronous mode is the mode where bandwidth is close to real time, and it is typically used for streaming data. Audio and video are best in isochronous mode.

You can create a camera emulation in bulk mode, but bulk mode isn’t the best of choices.

Sorry I wasn't clear. Android is not running on the Jetson. These apps are running on an Odroid N2 board and I want the Jetson to act as a USB video device to the Odroid board.

For now, as a workaround, I'm outputting to HDMI and using an HDMI-to-USB video capture converter, which appears to the Odroid board as a camera. I was just looking to see if there was a way to do it directly from the Jetson without going through this intermediary device.