GStreamer issue streaming webcam from TX2 over UDP

I am trying to stream a webcam from the Jetson to another machine over UDP. The scripts don’t show any errors, but the client appears to hang at “New clock: GstSystemClock”.

Server Code:

gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc control-rate=2 bitrate=2000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay mtu=1400 ! udpsink auto-multicast=true port=5000 sync=false

Client Code:

gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp,media=video, clock-rate=90000, encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! queue ! avdec_h264 ! xvimagesink sync=false

Server Output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
===== MSENC blits (mode: 1) into tiled surfaces =====

Client Output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

I’ve let it run for 10 minutes, but the feed never opens. I also tried a similar pipeline over TCP, which worked but with too much latency (~500 ms), which is why I’m switching to UDP.

The server is a Jetson TX2 running L4T 28.2.1 with JetPack 3.3.

The client is a Windows machine running Ubuntu 16.04.5 in a VMware instance.

In the server pipeline, you may try adding config-interval=1 to the rtph264pay options.
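
For reference, only the rtph264pay element changes. Based on your pipeline above, that would be something like (an untested sketch):

gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc control-rate=2 bitrate=2000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay config-interval=1 mtu=1400 ! udpsink auto-multicast=true port=5000 sync=false

config-interval=1 makes rtph264pay periodically re-insert SPS/PPS into the stream, so a client that starts listening after the stream has begun can still decode it.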

I just tried that, but there is no change.

Sorry, I’m away from my TX2 for testing… Which L4T version are you running?
I have had this use case working, but won’t be able to try it again for a day or so.

You may also try using nvvidconv (converting into memory:NVMM) instead of videoconvert, if it supports conversion from your camera’s format.
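
For example, something along these lines (an untested sketch; the YUY2 source caps are only an assumption about what your camera provides, so adjust them to its actual format):

gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! omxh264enc control-rate=2 bitrate=2000000 ! h264parse ! rtph264pay config-interval=1 mtu=1400 ! udpsink auto-multicast=true port=5000 sync=false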

OK, I changed “videoconvert” to “nvvidconv” and added (memory:NVMM) to “video/x-raw”, but there is still no change. I am currently looking into whether it could be a problem with running the client in a VM.

Please let me know when you can access your Jetson.

Thanks!

I confirm this works on my TX2 with R28.2.0 and a ZED camera.

TX2 server:

gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw, format=YUY2' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=640, height=480' ! omxh264enc ! h264parse ! rtph264pay config-interval=1 ! udpsink host=<HOST_IP> port=5000

Host client (native Ubuntu 16.04):

gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp, encoding-name=H264, payload=96' ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink

Thanks! I was finally able to get something running, just not with the VM client. I could only run it on my Windows machine directly, which I am guessing is because it is installed locally rather than in a virtual machine.

Thanks for your help!


I am trying to get this pipeline to work with the gscam ROS package. The package expects the video to be converted to BGR format so it can fill the ROS image messages. @Honey_Patouceul, can we add this conversion at the end of the transmitter pipeline, and would it go before or after the H.264 compression? And in the gscam config, should the receiver pipeline be the same?

Thanks
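
One untested thought on the gscam question, as a sketch only: the BGR frames that gscam needs only exist after decoding, so the conversion belongs on the receiving side after avdec_h264 rather than before the H.264 compression on the transmitter. Assuming gscam still reads its pipeline from the GSCAM_CONFIG environment variable (or the gscam_config parameter) and attaches its own sink behind it (an assumption about that package, not verified here), the receiver configuration might look something like:

# hypothetical gscam_config for the receiving machine; elements match the client pipeline above
export GSCAM_CONFIG="udpsrc port=5000 ! application/x-rtp, encoding-name=H264, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert"

The TX2 transmitter pipeline would stay the same as the working example above.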