I’m following along with the instructions at *Hardware Acceleration in the WebRTC Framework* in the Jetson Linux Developer Guide on how to use WebRTC. It says: “The README file in the WebRTC_r34.1_aarch64.tbz2 package contains additional information for application usage and setup.” However, I can’t find this file anywhere. I’ve managed to find the R32.6.1 release. Can I get some help finding 34.1, or does it even matter?
My end goal is to stream multiple cameras simultaneously to a webview, and to send data back through that same webview from keyboard inputs.
It looks like there is a WebRTC package in the r34/r35 release packages. Please try it on r32. You can also refer to the topic below:
# WebRTC Server
jetson-inference includes an integrated WebRTC server for streaming low-latency live video to/from web browsers that can be used for building dynamic web applications and data visualization tools powered by Jetson and edge AI on the backend. WebRTC works seamlessly with DNN inferencing pipelines via the [`videoSource/videoOutput`](aux-streaming.md#source-code) interfaces from jetson-utils, which utilizes hardware-accelerated video encoding and decoding through GStreamer. It supports sending and receiving multiple streams to/from multiple clients simultaneously, and includes a built-in webserver for viewing video streams remotely without needing to build your own frontend:
<img src="https://github.com/dusty-nv/jetson-inference/raw/master/docs/images/webrtc-builtin.jpg" width="600">
In this screenshot of full-duplex mode, the webcam from a laptop is being streamed to a Jetson over WebRTC, where the Jetson decodes it and performs object detection using detectNet, before re-encoding the output and sending it back to the browser again via WebRTC for playback. The round-trip latency goes largely unnoticed from an interactivity standpoint over local wireless networks. On the client side, it's been tested with multiple browsers including Chrome/Chromium, mobile Android, and mobile iOS (Safari) using H.264 compression.
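To make the asker's multi-camera goal concrete, here is a minimal sketch using the jetson-utils Python API described above. The `webrtc://@:port/name` output URI scheme is what the built-in WebRTC server uses; the device paths, port number, and the helper names `webrtc_uri` / `stream_cameras` are assumptions for illustration, not part of the library.

```python
# Sketch: stream several V4L2 cameras to the built-in WebRTC server
# via jetson-utils. Assumes jetson-utils is installed on the Jetson and
# cameras are available at /dev/video*; helper names are hypothetical.

def webrtc_uri(port: int, name: str) -> str:
    # Build an output URI for the built-in WebRTC server in jetson-utils.
    return f"webrtc://@:{port}/{name}"

def stream_cameras(devices, port=8554):
    # Import here so this module stays importable without jetson-utils.
    from jetson_utils import videoSource, videoOutput

    # One (camera, output) pair per device, each on its own stream name.
    pairs = [(videoSource(dev), videoOutput(webrtc_uri(port, f"camera{i}")))
             for i, dev in enumerate(devices)]

    while True:
        for cam, out in pairs:
            img = cam.Capture()      # capture the next frame (may be None on timeout)
            if img is not None:
                out.Render(img)      # hardware-encode and send over WebRTC

# Usage (on a Jetson with two cameras attached):
#   stream_cameras(["/dev/video0", "/dev/video1"])
# then open http://<jetson-ip>:8554 in a browser to view the streams.
```

Each camera gets its own stream name under the same port, so a single web page can attach one `<video>` element per endpoint.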
```mermaid
graph LR
    camera([fa:fa-video-camera Camera])
    player([fa:fa-television Browser])
    subgraph server ["Jetson (Edge Server)"]
        decoder["Decoder"]
        inference["Inference"]
        encoder["Encoder"]
    end
    %% edges below reconstructed from the pipeline described above
    %% (the quoted file was truncated at this point)
    camera -- WebRTC --> decoder
    decoder --> inference
    inference --> encoder
    encoder -- WebRTC --> player
```
system closed this topic on June 28, 2023, 2:43am: This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.