Tested with two Jetson Nanos as the WebRTC clients, each with an IMX219-120 8MP CSI camera, and a TX2 as the server. All three were flashed in late Dec. 2019 with JetPack 4.3 (matching L4T R32.3.1) and connected via 802.11ac WiFi.
On both Nanos, error messages of this kind popped up:
(peerconnection_client:26719): Gtk-WARNING **: 00:38:29.888: drawing failure for widget ‘GtkWindow’: invalid value for stride
Hi,
The WebRTC framework was initially developed to enable USB cameras to work with x86 PCs/laptops. We enabled it on Jetson platforms for the same use case; CSI camera sources may not be included in the scope.
Regarding the open-source request: we provide prebuilt libwebrtc.a, video_loopback, modules_tests, peerconnection_server, and peerconnection_client in the package. Please share your use case, the reason you cannot do the implementation yourself, and which prebuilt binary should be open-sourced.
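For reference, the prebuilt binaries follow the upstream WebRTC example programs, which are typically run like this (the flag names below come from the upstream WebRTC examples and are an assumption for NVIDIA's build; the IP address is a placeholder):

```shell
# On the TX2 (signaling server) -- the upstream server listens on port 8888 by default
./peerconnection_server --port=8888

# On each Jetson Nano (client) -- point it at the server's IP.
# --autoconnect and --autocall skip the manual GUI peer-selection step.
./peerconnection_client --server=192.168.1.10 --port=8888 --autoconnect --autocall
```

Once both clients connect to the same server, the second client calls the first and the video feeds appear in the GTK windows.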
The CSI camera I was using has 4K resolution. Given the constraint you mentioned, no problem, I can switch to 1080p USB webcams.
As for my request to open-source the code, it concerns the peerconnection_client, peerconnection_server, and video_loopback binaries. I am looking to understand their code design and implementation; they ship without documentation, and having the source would definitely help me learn.
As for my use case, it is to leverage the DeepStream SDK/framework and Jetson Nano for IoT IVA. My dissertation research is on “Restricted Boltzmann Machine Learning Method for Autonomous Video Telephony Quality Assessment,” aimed at helping Americans with hearing disabilities who rely on American Sign Language for communication.
Please also try running the reference config files and check the documentation. The existing implementations should cover most use cases; you may be able to get yours running simply by modifying a config file, which should speed up your development. The latest release is DS 4.0.2.
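To illustrate, running a reference config usually looks like the following (the install path and config filename are assumptions based on the DS 4.0 package layout; pick whichever sample config in that directory matches your platform):

```shell
# DeepStream 4.0.x installs reference configs under samples/configs/deepstream-app
cd /opt/nvidia/deepstream/deepstream-4.0/samples/configs/deepstream-app

# Launch the reference app with a Nano-oriented sample config;
# edit the [source*] sections of the file to point at your own streams
deepstream-app -c source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt
```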
I have a question related to the ones from CJLiu20152.
I would like to pilot my Jetbot from a remote web browser, and for that I need a low-latency video feed solution. The robot has a Raspberry Pi Camera V2 connected to the Jetson Nano through CSI.
To do that with a CSI camera on a Raspberry Pi based robot, I used the great UV4L WebRTC streaming server: https://www.linux-projects.org/uv4l/. Thanks to WebRTC, it can display the camera feed in the web browser with extremely low latency (under 80 ms on a local network and under 200 ms over the internet at 720p, 15 fps).
I guess this use case is pretty common, so a good solution may already exist for the Jetbot.
If not, what would be the best way to achieve such low latency with the Jetson Nano?
Using WebRTC? But as you said, only USB cameras are supported on the Jetson Nano… Does the driver need to be modified? Is that planned?
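One possible workaround, assuming the WebRTC app only accepts V4L2 (USB-style) cameras: bridge the CSI camera into a virtual V4L2 device with the v4l2loopback kernel module, so USB-only apps can open it like a webcam. A minimal sketch (the device number is arbitrary, and v4l2loopback must be built against the running L4T kernel):

```shell
# Create a virtual V4L2 device at /dev/video2
sudo modprobe v4l2loopback video_nr=2

# Feed the CSI camera into the virtual device: nvarguscamerasrc captures
# from the Raspberry Pi Camera V2, nvvidconv copies the frames out of
# NVMM memory into a CPU-accessible I420 buffer that v4l2sink can write
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' ! \
  nvvidconv ! 'video/x-raw,format=I420' ! v4l2sink device=/dev/video2
```

A V4L2-based WebRTC client can then open /dev/video2 as if it were a USB webcam, at the cost of one extra memory copy.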