I have a question actually.
How does this package solve my problem?
How can I use it to display the RTSP stream in the browser or to build WebRTC from the GitHub repository I linked?
If your use case is based on WebRTC, please follow it to enable hardware encoding.
Hardware decoding is not supported. If you would like to leverage the hardware decoder, please use tegra_multimedia_api or the GStreamer framework.
I don’t need hardware encoding or decoding.
I have an RTSP stream (rtsp://192.168.17.88:8554/test) from one device, and I want to display that video in the browser from another device.
The repo GitHub - mpromonet/webrtc-streamer: WebRTC streamer for V4L2 capture devices, RTSP sources and Screen Capture does this perfectly, but I can't build WebRTC (one of its dependencies) on the Jetson.
If I could do that, I would be able to run something like
./webrtc-streamer rtsp://192.168.17.88:8554/test
Navigate to localhost:8000 and see the video.
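If building the native WebRTC libraries on the Jetson is the blocker, one possible workaround is to run webrtc-streamer from its prebuilt Docker image instead of compiling it yourself. This is a sketch only: it assumes the `mpromonet/webrtc-streamer` image on Docker Hub includes an arm64 build that works on the Nano, which I have not verified on that board.

```shell
# Pull the prebuilt image (assumption: an arm64 variant of
# mpromonet/webrtc-streamer exists on Docker Hub and runs on the Nano).
docker pull mpromonet/webrtc-streamer

# Run it against the RTSP source, exposing the built-in HTTP server on port 8000.
docker run -p 8000:8000 -it mpromonet/webrtc-streamer \
    rtsp://192.168.17.88:8554/test
```

Then open http://&lt;jetson-ip&gt;:8000 in a browser on the other device, the same way you would after a native build.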
I still don’t understand how the webrtc package you provided helps me with this.
The README shows a basic peer-connection use case, which I don’t think I need.
Hi,
I may not have suggested this properly. The package enables hardware encoding, and it looks like what you need is hardware decoding. My apologies for the wrong direction.
As of now we don’t have other enhancements for WebRTC; other users would need to share their experience.
I’m pretty new to the NVIDIA Jetson Nano, but I would like to build a WebRTC server on my Jetson Nano.
Right now I have a camera connected via Ethernet to my Jetson, so I can stream its video using GStreamer. Now I want to set up WebRTC to capture that stream and display it in a browser.
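Before adding WebRTC on top, it can help to confirm the RTSP stream decodes on the Nano at all. A minimal software-decode pipeline would be something like the following (the URL is a placeholder for your camera's actual RTSP endpoint, and `autovideosink` assumes a local display is attached):

```shell
# Pull the RTSP stream, let decodebin pick a suitable decoder,
# convert the frames, and render them in a local window.
# Replace the location URL with your camera's real RTSP endpoint.
gst-launch-1.0 rtspsrc location=rtsp://192.168.17.88:8554/test latency=100 \
    ! decodebin ! videoconvert ! autovideosink
```

If this pipeline plays, the stream itself is fine and any remaining problems are on the WebRTC side.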
Have you solved your problem, and does it work properly now? Can I use the webrtc-streamer from your linked repository to reach my goal of running WebRTC on the Jetson Nano?