*I am running this on a Jetson NX, and the DeepStream version is 6.1.
*I want to know how I can get RTSP streaming out from a USB camera.
*I ran it with `python3 deepstream-test1-rtspout.py -i /dev/video0`.
I got the following error. I am also attaching the modified file; please kindly help me, thank you. rtsp-out.py (14.2 KB)
Creating Pipeline
Creating Source
Creating Decoder
creating the video converter
Creating Decoder
Creating H264 Encoder
Creating H264 rtppay
Playing file /dev/video0
Adding elements to Pipeline
Linking elements in the Pipeline
Linking element of source and sink1
Traceback (most recent call last):
File "deepstream_test1_rtsp_out.py", line 387, in
sys.exit(main(sys.argv))
File "deepstream_test1_rtsp_out.py", line 307, in main
srcpad.link(sinkpad)
File "/usr/lib/python3/dist-packages/gi/overrides/Gst.py", line 178, in link
raise LinkError(ret)
I also tried without decoding, but it is still not working. Is my linking of the source elements and my adding of elements to the pipeline correct? I am confused about these points.
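A source chain that often works for a USB camera can be expressed as a gst-launch-style description string (a sketch under assumptions, since the attached file is not shown here): `v4l2src` delivers raw frames, so the file-source + h264parse + decoder chain from the original sample does not apply, and the caps must match what `v4l2-ctl --list-formats-ext` reports for the camera. If the caps between two elements do not intersect, `srcpad.link(sinkpad)` raises exactly this `LinkError`.

```python
# A minimal sketch (element order and caps values are assumptions,
# not taken from the attached file) of the USB-camera source chain.
elements = [
    "v4l2src device=/dev/video0",
    "video/x-raw,format=YUY2,width=640,height=480,framerate=30/1",
    "videoconvert",                          # CPU-side colorspace conversion
    "nvvideoconvert",                        # copy into NVMM memory
    "video/x-raw(memory:NVMM),format=NV12",  # what nvstreammux expects
]
pipeline_desc = " ! ".join(elements)
print(pipeline_desc)
```

The same string can be tested quickly on the Jetson with `gst-launch-1.0 <description> ! fakesink` before wiring the equivalent elements in Python.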
Hello, thanks for your reply. The method you mentioned above starts and streams for 3 to 5 seconds and then crashes, but I have solved that problem by adding a decoder. Now I am able to get the USB camera live stream and push it to the RTSP server, but when I use deepstream-test3 to view the RTSP stream and apply inference, the stream gets stuck at the first frame. Could you please guide me on how to overcome that problem?
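One common cause of a receiver freezing on the first frame (an assumption here, not confirmed in this thread) is that SPS/PPS and IDR frames are only sent once at startup, so a client that connects after the stream has begun never receives usable decoder parameters. A sketch of encoder/payloader settings that address this (the property names exist on `nvv4l2h264enc` and `rtph264pay` on Jetson; the specific values are assumptions):

```python
# Sketch only: repeating SPS/PPS and forcing periodic IDR frames lets a
# client such as deepstream-test3 that joins mid-stream decode past
# frame one. Values are assumptions to adapt.
encoder_props = {
    "insert-sps-pps": True,  # resend SPS/PPS with each IDR frame
    "idr-interval": 30,      # one IDR roughly every second at 30 fps
    "iframeinterval": 30,    # keyframe distance matching the IDR cadence
    "bitrate": 4_000_000,
}
payloader_props = {
    "config-interval": 1,    # rtph264pay re-sends SPS/PPS in-band every 1 s
    "pt": 96,
}
# In the Python sample this maps onto set_property calls, e.g.:
#   for name, value in encoder_props.items():
#       encoder.set_property(name, value)
print(encoder_props, payloader_props)
```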
Hi, thanks for your suggestion. It is working fine, but I have a problem:
If I run both the client and the server on the Jetson Xavier, it works and streaming is fine.
If I use my local PC (dGPU) to capture the stream and push it to the RTSP server, I am unable to view the stream on the Jetson, but I am able to view the stream if I use my local PC as the server.
The following is the error message I get on the Jetson Xavier; please kindly help me sort out this issue.
nvidia@nvidia-desktop:~$ gst-launch-1.0 -v playbin uri=rtsp://user:pass@10.1.2.139:8554/rtsp/camera uridecodebin0::source::latency=300
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: ring-buffer-max-size = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: use-buffering = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: download = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: uri = rtsp://user:pass@10.1.2.139:8554/rtsp/camera
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: latency = 300
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: source = "(GstRTSPSrc)\ source"
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://user:pass@10.1.2.139:8554/rtsp/camera
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7893): gst_rtspsrc_retrieve_sdp (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Failed to connect. (Generic error)
ERROR: pipeline doesn’t want to preroll.
Setting pipeline to NULL …
Freeing pipeline …
Hello AmyCao, sorry for the late reply. I have checked the firewall and there is no issue. I can view the stream with the playbin command, but if I use the DeepStream app I am unable to view the stream.
I have tried with two different PCs, but I am still unable to view the stream on the Jetson.
Hello AmyCao, actually there is no firewall issue. I am capturing video from a USB camera connected to the local PC. Below I am attaching the code I used for capturing and pushing to the RTSP server; could you please tell me if there is any problem with it? However, if I use the same code with the USB camera connected to the Jetson, the RTSP streaming works without any problem. Thanks in advance. rtsp-1.py (5.5 KB)
Thanks for your reply, but there is no network issue, because the gst-launch command line works. On the server side there will be only GStreamer; we need to capture the video, encode it to H.264, and push the stream to the RTSP server, so that we can view the stream and apply inference on the Jetson.
If I use the above command, I am able to view the stream on the Jetson. I converted the server command into Python code and captured the stream, but the DeepStream app on the Jetson is unable to view the stream. I hope you understand the problem clearly now.
From your reply I can tell that you are not reading the problem above: on the server side there will be no DeepStream, so I cannot use nvv4l2h264enc or nvvideoconvert. I am using x264enc, but it is not working. I hope you will read the problem carefully this time. I know you have provided the RTSP-out example, but it requires installing DeepStream.
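For reference, a CPU-only server pipeline without any DeepStream or NVIDIA elements can be sketched like this (host, port, and encoder settings are assumptions, not values from the attached script; shown as a command string built in Python):

```python
# Sketch of a server pipeline needing only stock GStreamer on a dGPU PC:
# no DeepStream, no NVIDIA elements. Adapt host/port and encoder settings.
parts = [
    "v4l2src device=/dev/video0",
    "videoconvert",
    # zerolatency stops x264enc from buffering frames internally, which
    # otherwise can look like a stream stuck on its first frame.
    "x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000",
    "h264parse",
    "rtph264pay config-interval=1 pt=96",
    "udpsink host=127.0.0.1 port=5400 sync=false",
]
cmd = "gst-launch-1.0 " + " ! ".join(parts)
print(cmd)
```

Running the printed command first helps separate encoder problems from RTSP-server problems: if this pipeline streams, the remaining issue is on the mounting/serving side.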
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks
The udpsink and RTSP server parts of the sample code have nothing to do with DeepStream; you can use the same code with your x264enc. The purpose of the sample is to show the usage of the APIs.
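To make that concrete, the RTSP-mounting part can be illustrated as a factory launch string that never references an encoder at all (the port and caps values here are assumptions; with gi, this string is what goes to `GstRtspServer.RTSPMediaFactory.set_launch`):

```python
# Sketch of the RTSP-mounting side: the GstRtspServer factory only
# re-wraps whatever RTP arrives on the UDP port, so it is identical
# whether the upstream encoder is nvv4l2h264enc or x264enc.
# Port and caps values are assumptions.
udp_port = 5400
factory_launch = (
    f"( udpsrc name=pay0 port={udp_port} buffer-size=524288 "
    f'caps="application/x-rtp, media=video, clock-rate=90000, '
    f'encoding-name=(string)H264, payload=96" )'
)
# With gi this would be used as:
#   factory = GstRtspServer.RTSPMediaFactory.new()
#   factory.set_launch(factory_launch)
print(factory_launch)
```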