Gstreamer udpsrc can't connect to YOLOv3 on the Jetson Nano

Hi guys,

May I ask how I can live stream video over UDP to a receiving computer and run object detection with YOLOv3 on it?

This is the command I used to run object detection:

./darknet detector demo cfg/coco.data cfg/yolov3_old.cfg weights/yolov3.weights -thresh 0.1 rtmp://login:pass@ip.address.of.server/live/drone

This is the command I used for UDP live streaming to the receiving computer:

gst-launch-1.0 -e nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)60/1' ! nvv4l2h265enc maxperf-enable=1 bitrate=8000000 iframeinterval=40 preset-level=1 control-rate=1 ! h265parse ! rtph265pay config-interval=1 ! udpsink host={IP OF RECEIVING COMPUTER} port=5000 sync=false async=false

How can I connect the live stream with my object detection?

P.S. I can receive the live stream (without any object detection) using this command:

gst-launch-1.0 -vvv udpsrc port=5000 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! queue ! avdec_h265 ! autovideosink sync=false async=false -e
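One possible way to connect the two, assuming your darknet build uses OpenCV compiled with GStreamer support (this is an untested sketch, not a confirmed darknet feature), is to pass the receiving pipeline itself as the video source in place of the RTMP URL. The pipeline string must end in an appsink element so OpenCV can pull frames from it:

```shell
# Untested sketch: hand the UDP-receiving pipeline to darknet as its video
# source. Works only if darknet's OpenCV was built with GStreamer support.
./darknet detector demo cfg/coco.data cfg/yolov3_old.cfg weights/yolov3.weights -thresh 0.1 \
  "udpsrc port=5000 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! appsink"
```

If darknet rejects the pipeline string, the DeepStream route suggested below avoids the issue entirely.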

Hi,
We would suggest using the DeepStream SDK. We have samples for YOLO. Please check:

deepstream_sdk_v4.0.1_jetson/sources/objectDetector_Yolo/README

A general use case is to run over RTSP. Please refer to
https://devtalk.nvidia.com/default/topic/1058086/deepstream-sdk/how-to-run-rtp-camera-in-deepstream-on-nano/post/5366807/#5366807

You can follow the README to try the default samples, launch your nvarguscamerasrc in test-launch, and modify deepstream_app_config_yoloV3.txt to enable an RTSP source.
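Concretely, the server side might look like this (an untested sketch: the encoder choice and caps are assumptions, and <NANO_IP> is a placeholder). Note that test-launch expects only a pipeline fragment whose payloader is named pay0, with no sink element:

```shell
# On the Nano: serve the camera over RTSP with test-launch from
# gst-rtsp-server. The payloader must be named pay0; there is no sink.
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
```

Then, in deepstream_app_config_yoloV3.txt, point [source0] at the stream by setting type=4 (RTSP) and uri=rtsp://<NANO_IP>:8554/test.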

Hi DaneLLL,
I downloaded the GStreamer RTSP server and ran it following the steps in the FAQ. This is the command I used:

./test-launch 'gst-launch-1.0 rtspsrc location=10.50.14.3:8553 ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nvvideoconvert ! "video/x-raw(memory:NVMM),format=RGBA" ! nvegltransform ! nveglglessink sync=False'

But when I tried to connect using the VLC player on my PC, I got this error:

./test-launch 'gst-launch-1.0 rtspsrc location=10.50.14.3:8554 ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nvvideoconvert ! "video/x-raw(memory:NVMM),format=RGBA" ! nvegltransform ! nveglglessink sync=False'
stream ready at rtsp://127.0.0.1:8554/test

(test-launch:17153): GStreamer-CRITICAL **: 13:00:14.270: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed

(the same GStreamer-CRITICAL assertion repeated seven more times, from 13:00:15.154 through 13:00:16.327)

How can I rectify this? Thanks!

Hi,
Please share more information about your use case. Are you using one Nano running nvarguscamerasrc as an RTSP server, while another Nano receives the RTSP stream, decodes it, and runs YOLOv3?
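As a side note, the gst_uri_is_valid assertion usually points at the rtspsrc location value: it must be a full URI including the rtsp:// scheme, since a bare host:port fails URI validation. A hedged sketch of a receiving pipeline on the PC side (avdec_h264 is assumed here; the mount point /test comes from the test-launch default):

```shell
# On the receiver: rtspsrc needs a complete rtsp:// URI, not host:port.
gst-launch-1.0 rtspsrc location=rtsp://10.50.14.3:8554/test ! rtph264depay ! queue ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false
```

Also note that test-launch itself should be given only a pipeline fragment ending in a payloader named pay0, not a full gst-launch-1.0 command containing a sink.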

Right now, we’re not running object detection on the Nano, just pure live streaming. We wanted to make sure the live stream over RTSP works before we run YOLOv3 on it.

Yes. Currently, we’re using one Jetson Nano to run the RTSP test-launch server and an Ubuntu computer to receive the live stream.

Hi,
We also have a DeepStream SDK release for x86 PCs. If your Ubuntu computer has an NVIDIA GPU, you can install that package and try it:
https://devtalk.nvidia.com/default/topic/1063437/deepstream-sdk/announcing-deepstream-sdk-4-0-1/
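Once the SDK is installed and the objectDetector_Yolo sample is built per its README, launching detection on the RTSP stream is typically just a matter of running the reference app with the YOLOv3 config (paths below assume the SDK directory layout mentioned earlier):

```shell
# Run the DeepStream reference app with the YOLOv3 sample config,
# after editing [source0] in the config to point at your RTSP stream.
cd sources/objectDetector_Yolo
deepstream-app -c deepstream_app_config_yoloV3.txt
```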