GStreamer Network Video Stream and Save to File

I am trying to use the following project https://www.jetsonhacks.com/2014/10/16/gstreamer-xvimage/ as a template for streaming two USB webcams (C920) from a Jetson Nano to the screen. However, when I run the following pipeline:

gst-launch-1.0 -vvv \
    tee name=splitter \
    v4l2src device=/dev/video0 do-timestamp=true ! image/jpeg, width=1280, height=720, framerate=30/1 \
    ! jpegparse ! jpegdec \
    ! $videoconvert ! videoscale ! xvimagesink sync=false splitter. \
    v4l2src device=/dev/video1 do-timestamp=true ! image/jpeg, width=1280, height=720, framerate=30/1 \
    ! jpegparse ! jpegdec \
    ! $videoconvert ! videoscale ! xvimagesink sync=false splitter.

I get an error:

WARNING: erroneous pipeline: unexpected reference "splitter" - ignoring

I can use the following pipeline to view a single webcam (it works for either video input) without a problem:

gst-launch-1.0 -v v4l2src device=/dev/video0 \
    ! image/jpeg, width=1920, height=1080, framerate=30/1 \
    ! jpegparse ! jpegdec \
    ! videoconvert ! videoscale \
    ! xvimagesink sync=false

I am wondering if this is a syntax issue on my part. Eventually, I would like to save one of these videos to a file and stream it via RTSP, but I am having trouble using the “tee” functionality of GStreamer.

Thank you!

Hi,
The tee plugin is for a single source feeding multiple sinks. Your case is multiple sources going to a single sink. You may use the nvcompositor plugin. Please refer to
https://devtalk.nvidia.com/default/topic/1071764/jetson-nano/nvcompositor-appears-to-only-render-lowest-8-zorder-streams/post/5429939/#5429939
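
For reference, here is a rough sketch of that direction: decode both USB cameras, copy the frames into NVMM memory with nvvidconv, and composite them side by side with nvcompositor into a single sink. It is untested; the sink_N:: layout properties, the 640x360 tile sizes, and nvoverlaysink in place of xvimagesink are assumptions you may need to adjust for your L4T release:

# Sketch (untested): two USB cameras composited side by side into one sink.
gst-launch-1.0 -v \
    nvcompositor name=comp \
        sink_0::xpos=0   sink_0::ypos=0 sink_0::width=640 sink_0::height=360 \
        sink_1::xpos=640 sink_1::ypos=0 sink_1::width=640 sink_1::height=360 \
    ! 'video/x-raw(memory:NVMM)' ! nvoverlaysink \
    v4l2src device=/dev/video0 do-timestamp=true \
    ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegparse ! jpegdec \
    ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! comp.sink_0 \
    v4l2src device=/dev/video1 do-timestamp=true \
    ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegparse ! jpegdec \
    ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! comp.sink_1

If two separate windows are acceptable instead of one composited view, you can also simply list both of your v4l2src ... ! xvimagesink chains in one gst-launch-1.0 command with no tee or splitter. references at all; gst-launch runs independent branches side by side.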

Thank you for the quick response. Can you please provide information on how to use the tee plugin correctly? For a similar application, I would like to have a single source with multiple sinks: take a single USB webcam input, save the video to a file, and stream the video via RTSP. The example below fails with the same error as my first post:

gst-launch-1.0 -vvv -e \
        mp4mux name=mux ! filesink location=file.mp4 \
        v4l2src device=/dev/video0 timestamp=true \
        ! video/x-h264, width=1920, height=1080, framerate=30/1 \
        ! tee name=tsplit \
        ! queue ! h264parse ! omxh264dec ! videoconvert ! videoscale \
        ! video/x-raw, width=1280, height=720 ! xvimagesink sync=false tsplit. \
        ! queue ! h264parse ! mux.video_0 tsplit. \
        ! queue ! h264parse ! mpegtsmux ! tcpserversink host=$IP_ADDRESS port=5000 \
        pulsesrc device=alsa_input.usb-046d_HD_Pro_Webcam_C920_A116B66F-02-C920.analog-stereo do-timestamp=true \
        ! audio/x-raw ! queue ! audioconvert ! voaacenc ! queue ! mux.audio_0

Thank you!

Hi,
Please refer to
https://devtalk.nvidia.com/default/topic/976743/jetson-tx1/get-rgb-frame-data-from-nvvidconv-gstreamer-1-0/post/5022878/#5022878
and
https://devtalk.nvidia.com/default/topic/1057681/jetson-tx1/logitech-c930e-on-jetson-tx1-very-slow-and-choppy-video/post/5363417/#5363417
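
For the single-source case, a pipeline along the following lines is how the tee branches are usually written. It keeps your elements and is only a sketch, not verified on your camera; the notable adjustments are that v4l2src's property is do-timestamp (there is no timestamp property) and that each tee branch is spelled out as tsplit. ! queue ! ... so no reference is left dangling:

# Sketch (untested): one C920 H.264 stream teed into display, MP4 recording,
# and MPEG-TS over TCP; webcam audio is muxed into the MP4.
gst-launch-1.0 -vvv -e \
    mp4mux name=mux ! filesink location=file.mp4 \
    v4l2src device=/dev/video0 do-timestamp=true \
    ! video/x-h264, width=1920, height=1080, framerate=30/1 \
    ! tee name=tsplit \
    tsplit. ! queue ! h264parse ! omxh264dec ! videoconvert ! videoscale \
        ! video/x-raw, width=1280, height=720 ! xvimagesink sync=false \
    tsplit. ! queue ! h264parse ! mux.video_0 \
    tsplit. ! queue ! h264parse ! mpegtsmux ! tcpserversink host=$IP_ADDRESS port=5000 \
    pulsesrc device=alsa_input.usb-046d_HD_Pro_Webcam_C920_A116B66F-02-C920.analog-stereo do-timestamp=true \
    ! audio/x-raw ! queue ! audioconvert ! voaacenc ! queue ! mux.audio_0

If omxh264dec does not negotiate directly with videoconvert on your release, inserting nvvidconv right after the decoder is a common adjustment. Also note that mpegtsmux ! tcpserversink serves MPEG-TS over plain TCP rather than RTSP; true RTSP generally means running gst-rtsp-server (for example its test-launch sample) rather than gst-launch alone.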