Encoding a real-time camera stream to H.264 and sending the stream to another system

I am trying to get the video stream from a camera on a Jetson Nano, encode it to H.264, and send the encoded stream to another system. Can anybody help me with this? I found that I can use GStreamer. I want to know how exactly this can be done, and how much latency there will be while encoding and streaming to the other system.

We support the GStreamer and tegra_multimedia_api software stacks. Please look at the documents:

The capability is listed in

The latency depends heavily on your application, network speed, destination machine, etc. GStreamer is, however, a very low-latency method. If you optimize your GStreamer pipeline with the correct compression settings, it will likely be one of your fastest options.
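As a sketch of what a minimal point-to-point stream between two machines can look like (not from this thread; the device path, the receiver address 192.168.1.100, and port 5000 are placeholder values you would substitute for your own setup):

```shell
# Sender (Jetson): capture from the camera, encode with low-latency x264
# settings, packetize as RTP, and send over UDP to the receiving machine.
# /dev/video0, 192.168.1.100, and 5000 are example values.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  videoconvert ! \
  x264enc tune=zerolatency bitrate=2000 key-int-max=60 ! \
  rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.1.100 port=5000

# Receiver (other system): listen on the same port, depacketize the RTP
# stream, decode the H.264, and display it.
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp, media=video, encoding-name=H264, payload=96" ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! \
  autovideosink sync=false
```

On a local network a pipeline like this typically keeps end-to-end latency well under a second, but the actual figure depends on the encoder settings, network, and display sink, so measure on your own setup.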

Here’s an RTMP GStreamer pipeline that has worked well for me. It’s currently video-only (the audio track is just a test tone from audiotestsrc, not a real audio input).

gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! videoflip method=upper-right-diagonal ! videoconvert ! x264enc bitrate=2000 byte-stream=false key-int-max=60 bframes=0 aud=true tune=zerolatency ! h264parse ! flvmux name=mux audiotestsrc ! queue ! audioconvert ! voaacenc ! aacparse ! mux. mux. ! rtmpsink location="RTMP URL"

I’m using the one above for streaming to Facebook Live, Vimeo, and YouTube.

There are probably a few NVIDIA-specific optimizations on top of this that could help. Check out the accelerated GStreamer guide that Dane posted.
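For example, one common optimization on Jetson is swapping the CPU-based x264enc for the hardware encoder. A hedged sketch, assuming the omxh264enc and nvvidconv elements from NVIDIA's accelerated GStreamer stack are available on your L4T release (property names and values vary by release; the bitrate and "RTMP URL" are placeholders):

```shell
# Same structure as the RTMP pipeline above, but offloading H.264 encoding to
# the Jetson's hardware encoder via omxh264enc. nvvidconv copies frames into
# NVMM memory for the encoder. Note omxh264enc takes bitrate in bits/second.
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
  'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \
  omxh264enc bitrate=2000000 iframeinterval=60 ! \
  h264parse ! flvmux name=mux streamable=true \
  audiotestsrc ! queue ! audioconvert ! voaacenc ! aacparse ! mux. \
  mux. ! rtmpsink location="RTMP URL"
```

This moves the encoding load off the CPU cores; check the accelerated GStreamer guide for the exact encoder element and properties on your release.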

Thanks for the reply. I will check that out.

Thank you. I will check that out and come back.

Hi, I have the same question. Have you solved this problem? Also, on the receiving computer, how can I decode the stream?

Hi 838256401,

Please open a new topic for your issue. Thanks.