WebRTC on Jetson Nano


I have a Jetson Nano connected to a 4K 360° camera, and I’m using GStreamer (gst-rtsp-server) to stream that video.

Now I want to use a different Jetson Nano to capture and display that video in the browser.

I tried webrtc-streamer from this repo https://github.com/mpromonet/webrtc-streamer on my PC, and it works.
But I can’t build it on the Nano; the build fails with:

Host architecture arm64 is not supported.

Is there any way to build this on the Jetson Nano? The issues section is not helpful…
Or is there another way to achieve this?

On the streaming side I’m running:

./test-launch "v4l2src device=/dev/video0 io-mode=2 ! image/jpeg,width=3840,height=2160,framerate=5/1 ! nvjpegdec ! video/x-raw ! omxh264enc ! rtph264pay name=pay0 pt=96"
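For anyone following along, here is a minimal sketch of a playback pipeline for the receiving side, assuming test-launch serves at its default rtsp://<server-ip>:8554/test mount point (the address, port, and path are assumptions, not taken from this thread):

```shell
# Hedged sketch: receive and display the RTSP stream on another machine.
# rtsp://<server-ip>:8554/test is the test-launch default; adjust to your setup.
gst-launch-1.0 rtspsrc location=rtsp://<server-ip>:8554/test latency=200 ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

On a Jetson receiver you would likely swap avdec_h264 for a hardware decoder element instead of software decoding.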

Please check

For more information: are you working with both a Jetson Nano and a TX2?

Yes, it would be great if I could use two Nanos, but a Nano for streaming and a TX2 for WebRTC works as well.


Happy New Year!

I downloaded the NVIDIA Hardware Acceleration in the WebRTC Framework package from this link: https://developer.nvidia.com/embedded/dlc/r32-3-1_Release_v1.0/t186ref_release_aarch64/WebRTC_R32.3.1_aarch64.tbz2

And tested with two Jetson Nanos as the WebRTC clients, each with an IMX219-120 8 MP CSI camera, and a TX2 as the server. All three were flashed in late Dec. 2019 with JetPack 4.3 (matching L4T R32.3.1) and connected via 802.11ac Wi-Fi.

On both Nanos, error messages like this popped up:
(peerconnection_client:26719): Gtk-WARNING **: 00:38:29.888: drawing failure for widget ‘GtkWindow’: invalid value for stride

What’s wrong? Any idea?
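Not from the original posters, but for context: in GTK/cairo, “invalid value for stride” typically means the row stride of the image buffer handed to the drawing code doesn’t match what cairo expects for that width and pixel format (cairo rounds each row up to a 4-byte boundary). A hedged illustration of that rule follows; `stride_for_width` is a made-up helper, not a cairo API:

```python
def stride_for_width(width: int, bytes_per_pixel: int, alignment: int = 4) -> int:
    """Round the row size up to the next multiple of `alignment`,
    mirroring cairo's 4-byte row-alignment rule (an assumption here)."""
    row = width * bytes_per_pixel
    return (row + alignment - 1) // alignment * alignment

# 24-bit RGB rows get padded: a 1279-pixel row is 3837 bytes of pixel data,
# but must be stored with a 3840-byte stride.
print(stride_for_width(1279, 3))  # 3840
# A producer handing cairo the unpadded 3837-byte stride would fail with
# exactly this kind of "invalid value for stride" error.
```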



Hi, Dane,

Any idea about what’s wrong with NVIDIA Hardware Acceleration in the WebRTC Framework?


The error messages that popped up:
(peerconnection_client:26719): Gtk-WARNING **: 00:38:29.888: drawing failure for widget ‘GtkWindow’: invalid value for stride



We have some examples in the README. Is the issue hit when running video_loopback, peerconnection_client/server, or modules_tests?

Hi, Dane,

Yes. I followed the instructions in the README file and ran all the bash scripts.

All scripts worked except the peerconnection_client.

I had the peerconnection_server running on a TX2 on the same Wi-Fi, and two peerconnection_clients running on two Nano endpoints (one each).

The error message observed on both Nanos was reported earlier.

I wonder if the problem can be reproduced in an NVIDIA lab?


CJ Liu

Hi CJLiu20152,

Did you connect a USB camera to the Jetson Nano clients?


Server:

$ ./peerconnection_server

Client 1: (connect a USB camera on client 1)

$ ./peerconnection_client --autoconnect --server <Server.IP>

Client 2: (connect a USB camera on client 2)

$ ./peerconnection_client --server <Server.IP> --autoconnect --autocall

The video from the 1st client should be visible on the 2nd.

Hi, Carol,

I tested with two Jetson Nanos as the WebRTC clients, each with an IMX219-120 8MP CSI camera.




Hello, Carol and Dane,

Does NVIDIA have any plan to release the NVIDIA Hardware Acceleration for the WebRTC Framework as open source?

Currently the software at https://developer.nvidia.com/embedded/dlc/r32-3-1_Release_v1.0/t186ref_release_aarch64/WebRTC_R32.3.1_aarch64.tbz2
contains C/C++ header and library files.



CJ Liu

The WebRTC framework was initially developed to enable USB cameras to work with x86 PCs/laptops. We enabled it on Jetson platforms for the same use case. CSI camera sources may not be included in the scope.

Regarding the open-source request: the package contains prebuilt libwebrtc.a, video_loopback, modules_tests, peerconnection_server, and peerconnection_client. Please share your use case, the reason you cannot do the implementation yourself, and which prebuilt binary should be open-sourced.

Hi, Dane,

Thank you so much. Great info.

The CSI camera I was using has 4K resolution. Under the constraint you mentioned, no problem; I can switch to 1080p USB webcams.

My open-source request concerns the peerconnection_client, peerconnection_server, and video_loopback binaries. I am looking to understand their design and implementation; they are not provided with associated documentation, which would definitely help me learn.

As to my use case, it is to leverage the DeepStream SDK/framework and Jetson Nano for IoT IVA. My dissertation research is on “Restricted Boltzmann Machine Learning Method for Autonomous Video Telephony Quality Assessment,” aimed at helping Americans with hearing disabilities who rely on American Sign Language for communication.


CJ Liu

Evaluating open-sourcing would take some time, and after the discussion we may still keep the current release as binaries. If you are considering the DeepStream SDK, we would suggest using RTSP. We have an existing implementation:

Please also try running the reference config files and check the documentation. The existing implementations should cover most use cases, and you can check whether your use case runs by simply modifying a config file. This should speed up your development. The latest release is DS 4.0.2.

Hello DaneLLL,

I have a question related to the ones from CJLiu20152.

I would like to pilot my JetBot from a remote web browser, and for that I need a low-latency video feed. The robot has a Raspberry Pi Camera V2 connected to the Jetson Nano over CSI.
To do this with a CSI camera on a Raspberry Pi based robot, I used the great UV4L WebRTC streaming server: https://www.linux-projects.org/uv4l/. Thanks to WebRTC, it displays the camera feed in the web browser with extremely low latency (less than 80 ms on a local network and less than 200 ms over the internet at 720p, 15 fps).

I guess this use case is pretty common, which is why I think a good solution might already exist for the JetBot.

If not, what would be the best way to achieve such low latency with the Jetson Nano?

  1. Using WebRTC, but as you said, only USB cameras are supported on the Jetson Nano… Does the driver need to be modified? Is that planned?
  2. A custom GStreamer-to-WebRTC implementation using GstWebRTC: https://lazka.github.io/pgi-docs/GstWebRTC-1.0/index.html?
  3. Using the DeepStream SDK to output an RTSP stream as you suggested, and reading the stream in the browser with something like https://github.com/Streamedian/html5_rtsp_player/wiki/HTML5-RTSP-Player, but in that case I am not sure what latency to hope for…
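On option 3, the receiver’s jitter buffer dominates end-to-end latency, so an RTSP client can be tuned fairly low on a LAN. As a hedged sketch (the URL below is a placeholder, not from this thread):

```shell
# Hedged sketch: pull an RTSP stream with a minimal jitter buffer.
# rtsp://<jetbot-ip>:8554/ds-test is a placeholder URL; adjust to your server.
gst-launch-1.0 rtspsrc location=rtsp://<jetbot-ip>:8554/ds-test latency=0 ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false
```

A browser-side RTSP player adds its own buffering on top of this, so WebRTC generally still wins on latency for in-browser viewing.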

Thanks for your insights.

Hi jobesu14,

Please open a new topic for your issue. Thanks.

OK, topic created here: https://devtalk.nvidia.com/default/topic/1071093/jetson-nano/low-latency-jetbot-teleop-with-webrtc/.

Hi @mmilunovic,

I also want to build a WebRTC server on my Jetson Nano to display a video stream I get with GStreamer.

Have you solved your problems, and does it work properly now?