Jetson Nano - CSI RPi.IMX219 to localhost stream

The GStreamer framework would probably be the easiest way to do this.

Let's split your case into several steps:

1. Get images from CSI camera

Assuming your camera is an RPi v2 IMX219 camera, this sensor produces Bayer RG10 format frames. Although these Bayer frames are available from the V4L2 interface, you need to debayer them before going further.
Doing that on the CPU would not be reasonable on the Nano, so your best option for debayering is Argus, which uses the dedicated ISP to produce YUV NV12 frames in NVMM memory (DMA-able memory suitable for the GPU, HW encoders/decoders, VIC…). The nvarguscamerasrc plugin manages the camera and provides the frames (it may also be able to rescale to some extent). The following pipeline uses a fake sink that does nothing; it should report running until you interrupt it with Ctrl-C:

gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! fakesink
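If the pipeline above fails to start, it may help to first check that the driver exposes the sensor as expected. A quick sketch, assuming v4l-utils is installed and the camera registered as /dev/video0 (the device node may differ on your setup):

```shell
# List the pixel formats exposed by the sensor on /dev/video0.
# For an IMX219 you would expect to see 'RG10' Bayer modes listed.
v4l2-ctl -d /dev/video0 --list-formats-ext
```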

2. Display your camera

gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvvidconv ! autovideosink

You would use the nvvidconv plugin for copying to/from system memory, converting video formats, rescaling, cropping… See:

gst-inspect-1.0 nvvidconv

3. Encode and packetize for the readers

You would select an encoding and a network protocol/container.
This mainly depends on your needs (compression vs. bandwidth), your network (wired Ethernet, WiFi, other…), and the hosts reading the stream (which OS, what software is available?).

The following pipeline would stream your camera feed, encoded into an H264 stream, then put into MPEG transport stream format, then streamed as RTP/MP2T, which has a static payload type (33) so it can easily be read by VLC or FFMPEG without an SDP file. Note that RTP mostly implies UDP transport.

gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! h264parse ! mpegtsmux ! rtpmp2tpay ! fakesink

If this runs, you may try to stream further ;-)
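To illustrate the encoding choice above, the same pipeline can be sketched with the HW H265 encoder instead, trading better compression for somewhat less universal reader support. This is an untested variant; it assumes nvv4l2h265enc is available in your L4T release:

```shell
# Same capture caps, but encoding into H265 before muxing into MPEG-TS
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvv4l2h265enc ! h265parse ! mpegtsmux ! rtpmp2tpay ! fakesink
```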

4. Stream to localhost for checking

You would stream to localhost with:

gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5004 auto-multicast=0

and check from another terminal with:

# Using gstreamer:
gst-launch-1.0 -v udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! autovideosink

# Or using VLC (you may have to check https://forums.developer.nvidia.com/t/vlc-media-player-crashes/154612/4?u=honey_patouceul if experiencing crashes)
cvlc rtp://127.0.0.1:5004

# Or using FFMPEG (may take 10s to setup):
ffmpeg -i udp://127.0.0.1:5004 -f xv display
ffmpeg -i rtp://127.0.0.1:5004 -f xv display
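If ffplay is available on the host, it may be a simpler check than ffmpeg with the xv output device (this assumes your FFmpeg build ships ffplay):

```shell
# ffplay opens its own display window, no output device option needed
ffplay udp://127.0.0.1:5004
```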

5. Stream over network

You may stream RTP over UDP to selected hosts only, or multicast to the whole LAN. If your network is WiFi, it is better to avoid multicast and stream to a selected host.

# Stream to multicast wired LAN on port 5004:
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=224.1.1.1 port=5004 auto-multicast=1

# Stream to selected host 192.168.1.12 on port 5004 on WiFi network:
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=192.168.1.12 port=5004 auto-multicast=0 

You would receive on host with:

# For wired LAN multicast case: 
gst-launch-1.0 -v udpsrc port=5004 auto-multicast=1 address=224.1.1.1 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

# For WiFi selected host:
gst-launch-1.0 -v udpsrc port=5004 auto-multicast=0 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

6. Explore other protocols such as RTSP

RTSP is an application-layer protocol managing SDPs for dynamic payloads.
Start with:
Q: Is there any example of running RTSP streaming?
in https://forums.developer.nvidia.com/t/jetson-nano-faq/
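As a teaser, a common approach on Jetson is the test-launch example from gst-rtsp-server. A sketch, assuming you have built the test-launch example from the libgstrtspserver sources matching your GStreamer version:

```shell
# Serve the camera as an RTSP stream at rtsp://<jetson-ip>:8554/test
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM),format=NV12,width=1280,height=720,framerate=30/1 ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay name=pay0 pt=96"
```

A client would then read it with e.g. vlc rtsp://&lt;jetson-ip&gt;:8554/test; RTSP negotiates the SDP for you.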

One last thing: this was just my best advice, but sorry, I am unable to test these for now, so there may be errors in my post. Hope it is a bit clearer now, though.
