I have developed an application for object detection and have successfully used it with local files. I now need to use this application to do object detection on an RTSP stream, but I cannot get it to work (the application hangs completely on videoSource.Capture()).
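For reference, the relevant part of my application looks roughly like this (a simplified sketch, not my exact code; the model and threshold are placeholders):

import jetson.inference
import jetson.utils

# object detection network (placeholder model / threshold)
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)

# the RTSP source; --input-codec=h264 is needed, see below
source = jetson.utils.videoSource("rtsp://127.0.0.1:9554/testcamera", argv=["--input-codec=h264"])
display = jetson.utils.videoOutput("display://0")

while display.IsStreaming():
    img = source.Capture()        # <-- this call never returns on the RTSP stream
    detections = net.Detect(img)
    display.Render(img)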
I have tried the video-viewer.py example script, which has nice verbose output, and got the following:
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstDecoder -- creating decoder for 127.0.0.1
[gstreamer] gstDecoder -- Could not write to resource.
[gstreamer] gstDecoder -- pipeline string:
[gstreamer] rtspsrc location=rtsp://127.0.0.1:9554/testcamera ! queue ! rtph264depay ! h264parse ! omxh264dec ! video/x-raw ! appsink name=mysink
[video] created gstDecoder from rtsp://127.0.0.1:9554/testcamera
------------------------------------------------
gstDecoder video options:
------------------------------------------------
-- URI: rtsp://127.0.0.1:9554/testcamera
- protocol: rtsp
- location: 127.0.0.1
- port: 9554
-- deviceType: ip
-- ioType: input
-- codec: h264
-- width: 0
-- height: 0
-- frameRate: 0.000000
-- bitRate: 0
-- numBuffers: 4
-- zeroCopy: true
-- flipMethod: none
-- loop: 0
------------------------------------------------
[OpenGL] glDisplay -- X screen 0 resolution: 2560x1440
[OpenGL] glDisplay -- X window resolution: 2560x1440
[OpenGL] glDisplay -- display device initialized (2560x1440)
[video] created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
-- URI: display://0
- protocol: display
- location: 0
-- deviceType: display
-- ioType: output
-- codec: raw
-- width: 2560
-- height: 1440
-- frameRate: 0.000000
-- bitRate: 0
-- numBuffers: 4
-- zeroCopy: true
-- flipMethod: none
-- loop: 0
------------------------------------------------
[gstreamer] opening gstDecoder for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay0
[gstreamer] gstreamer changed state from NULL to READY ==> queue0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer rtspsrc0 ERROR Could not write to resource.
[gstreamer] gstreamer Debugging info: gstrtspsrc.c(7023): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Error (400): Bad Request
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message warning ==> rtspsrc0
I did need to supply --input-codec=h264 to get it to create the videoSource at all.
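In other words, the full command was roughly:

video-viewer.py --input-codec=h264 rtsp://127.0.0.1:9554/testcamera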
If I test the same RTSP stream with FFmpeg (ffplay):
ffplay -rtsp_transport tcp -i rtsp://127.0.0.1:9554/testcamera
it works fine. But as you can see, I do need to force the transport to TCP; UDP is the default and that doesn't work.
According to the jetson-inference documentation, the videoSource object also defaults to UDP. However, the documentation of the underlying GStreamer rtspsrc element says it tries UDP first and falls back to TCP if UDP fails to connect.
My feeling is that I somehow need to force it to use TCP, but how? Or could the problem be something else entirely?
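As a sanity check I can at least force TCP at the rtspsrc level outside of jetson-inference, e.g. with OpenCV's GStreamer backend (a rough diagnostic sketch using a software decoder, not my actual application; it assumes OpenCV was built with GStreamer support):

import cv2

# Diagnostic only: force rtspsrc to use TCP via its "protocols" property and
# decode in software (avdec_h264), just to see whether TCP avoids the 400 error.
pipeline = (
    "rtspsrc location=rtsp://127.0.0.1:9554/testcamera protocols=tcp latency=200 ! "
    "queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! "
    "video/x-raw,format=BGR ! appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
print("stream opened:", cap.isOpened())

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("tcp test", frame)
    if cv2.waitKey(1) == 27:      # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

But even if that confirms the TCP theory, I still don't see how to get videoSource / gstDecoder to put protocols=tcp (or an equivalent option I'm missing) into the pipeline it builds.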