Trying to use detectnet on mjpeg source

Hi,

I have a webcam that produces MJPEG output. I would like to use this camera with detectnet to recognize when a person enters an area where they are not supposed to be.
I managed to cobble together a gst-launch pipeline that reads and parses the source, and I thought of sending this to an RTP input. However, running detectnet rtp://@:1234 does not seem to receive anything. Any clue what I might be doing wrong?

The call for the sending gst-launch is:

gst-launch-1.0 -v souphttpsrc location=http://192.168.1.1/snapshot.cgi user-id=user user-pw=password do-timestamp=true ! multipartdemux ! image/jpeg,width=1280,height=960 ! jpegdec ! videoconvert ! x264enc tune=zerolatency bitrate=90000 speed-preset=superfast ! rtph264pay ! udpsink host=jetson port=1234
The sending host is not the Jetson.
Since I have no display on the Jetson, I tried to forward detectnet's output somewhere else via RTP, so I called detectnet with:
detectnet --input-codec=h264 rtp://@:1234 rtp://destinationhost:1234

However, detectnet just gives me 'failed to capture video frame' messages, but way too few to indicate it is receiving anything. According to tcpdump, packets are being sent.

If there is a simpler way to get the data from the webcam please do tell :)

Kind Regards,

Konstantin

Hi @konstantin3, is your webcam connected over USB? If so, you can try running it directly with the detectnet program (it supports MJPEG). You may want to try the video-viewer utility first.

If your webcam is not USB, does it natively support RTP/RTSP? If so, the detectnet program can also handle those (without needing the intermediary PC pipeline).

If not, I take it that your gst-launch pipeline on your PC is taking that camera's CGI web interface and converting it to RTP. Before you try viewing the RTP feed on your Jetson, are you able to view the RTP feed from your PC? It would be good to confirm that part of the pipeline is working first.

If you run ifconfig from your Jetson, can you see receive (rx) network traffic increasing?
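If you want to script that check rather than eyeball ifconfig, here is a small sketch that samples the per-interface RX byte counter from /proc/net/dev (assuming a Linux host; "eth0" is a placeholder interface name):

```python
import time

def rx_bytes(proc_net_dev_text, iface):
    """Return the received-bytes counter for iface from /proc/net/dev content."""
    for line in proc_net_dev_text.splitlines():
        name, _, rest = line.partition(":")
        if name.strip() == iface:
            # first column after the colon is the rx bytes counter
            return int(rest.split()[0])
    raise ValueError(f"interface {iface!r} not found")

if __name__ == "__main__":
    iface = "eth0"  # placeholder: substitute your Jetson's interface name
    try:
        with open("/proc/net/dev") as f:
            before = rx_bytes(f.read(), iface)
        time.sleep(2)
        with open("/proc/net/dev") as f:
            after = rx_bytes(f.read(), iface)
        print(f"{iface}: received {after - before} bytes in 2 s")
    except (OSError, ValueError) as err:
        print(f"could not sample counters: {err}")
```

If the RTP sender is running and the delta stays near zero, the packets are not reaching the Jetson at all (firewall, routing, or wrong host in udpsink).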

No, it is an IP cam, sorry for the confusion.
It only offers the CGI MJPEG interface. And yes, the GStreamer pipeline does work if I set up an autovideosink. I even used it to read the stream on the Jetson, let omxh264enc re-encode it, and send it to the machine I am sitting in front of. Might it be that I need to tune some bitrate parameters etc. when sending to detectnet?

OK, gotcha. Can you try running this video-viewer command and post the console output?

video-viewer --input-codec=h264 rtp://@:1234 dump.mp4

Sometimes it can take a few seconds for the RTP stream to start being received, but my guess is you have already waited longer than that.

I will paste it, as it is not too long:

[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstDecoder -- creating decoder for 127.0.0.1
[gstreamer] gstDecoder -- resource discovery not supported for RTP streams
[gstreamer] gstDecoder -- pipeline string:
[gstreamer] udpsrc port=1234 multicast-group=127.0.0.1 auto-multicast=true caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! rtph264depay ! h264parse ! omxh264dec ! video/x-raw ! appsink name=mysink
[video]  created gstDecoder from rtp://@:1234
------------------------------------------------
gstDecoder video options:
------------------------------------------------
  -- URI: rtp://@:1234
     - protocol:  rtp
     - location:  127.0.0.1
     - port:      1234
  -- deviceType: ip
  -- ioType:     input
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! omxh264enc bitrate=4000000 ! video/x-h264 !  h264parse ! qtmux ! filesink location=dump.mp4
[video]  created gstEncoder from file:///home/elwood/dump.mp4
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: file:///home/elwood/dump.mp4
     - extension: mp4
  -- deviceType: file
  -- ioType:     output
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  30.000000
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
------------------------------------------------
[OpenGL] failed to open X11 server connection.
[OpenGL] failed to create X11 Window.
[gstreamer] opening gstDecoder for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay0
[gstreamer] gstreamer changed state from NULL to READY ==> udpsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsrc0
[gstreamer] gstreamer message stream-start ==> pipeline0
video-viewer:  failed to capture video frame
video-viewer:  failed to capture video frame
video-viewer:  failed to capture video frame

The 'failed to capture' messages are continuing.

OK, thanks. Can you try running this other test pipeline from your PC, and check whether your Jetson can receive/display it OK through the video-viewer app again?

# run from PC (after disabling your other PC pipeline)
$ gst-launch-1.0 -v videotestsrc ! video/x-raw,width=300,height=300,framerate=30/1 ! x264enc ! rtph264pay ! udpsink host=jetson port=1234

If that test pipeline works, I would try changing a couple things in your camera pipeline to see if they make a difference:

  1. Change the resolution from 1280x960 to 1280x720
  2. Remove tune=zerolatency bitrate=90000 speed-preset=superfast
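For reference, a small Python sketch that assembles the sender command with both changes applied (camera URL, credentials, and host are taken from the original command; whether the camera supports 1280x720 is an assumption):

```python
def build_sender_pipeline(host="jetson", port=1234):
    """Assemble the simplified gst-launch sender command as a string."""
    elements = [
        "souphttpsrc location=http://192.168.1.1/snapshot.cgi "
        "user-id=user user-pw=password do-timestamp=true",
        "multipartdemux",
        "image/jpeg,width=1280,height=720",  # change 1: height 960 -> 720
        "jpegdec",
        "videoconvert",
        "x264enc",                           # change 2: tuning options removed
        "rtph264pay",
        f"udpsink host={host} port={port}",
    ]
    # gst-launch separates pipeline elements with " ! "
    return "gst-launch-1.0 -v " + " ! ".join(elements)

print(build_sender_pipeline())
```

Running the printed command on the PC should be equivalent to the original pipeline minus the two suspect settings.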

How should I receive it on the Jetson, with video-viewer? If so, it is not getting any better.
I also tried to send the stream to a gst udpsrc receiver on the same host; also nothing.
I have another sender that kind of works:
gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=localhost port=1234

It is received with: gst-launch-1.0 -vvv udpsrc port=1234 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink

video-viewer ignores it.

Playing with GST_DEBUG, I do think the reason is somewhere in the rtph264depay module.

Maybe try changing to port 5000 in case there is some system permission/access issue on either end with port 1234.

Try with this simpler pipeline from your PC first, and then video-viewer to view it:

# from your PC
gst-launch-1.0 -v videotestsrc ! video/x-raw,width=300,height=300,framerate=30/1 ! x264enc ! rtph264pay ! udpsink host=jetson port=5000

# from your Jetson
$ video-viewer --input-codec=h264 rtp://@:5000

Same result, no reaction on the Jetson side. In the final version the Jetson should actually not be the receiver via RTP anyway; the Jetson can collect the data via GStreamer from the IP cam itself. The question is: what is the best way to get this into detectnet (preferably from Python, since in the end I want to create a script that sends an email if detectnet discovers a person)?

Regards,

Konstantin

To use a custom GStreamer pipeline such as yours, it might be easiest to use the cv2.VideoCapture interface with your GStreamer pipeline, and then convert the OpenCV image for use with detectnet.py, like in this sample that uses the jetson.utils.cudaFromNumpy() function:

The first step would be to test that cv2.VideoCapture can get frames from your custom GStreamer pipeline from your IP camera.
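A minimal sketch of that approach (the pipeline string mirrors the udpsrc decode pipeline logged earlier in this thread; the appsink/BGR caps are an assumption needed for OpenCV, and the person-check stands in for the eventual email step):

```python
def build_capture_pipeline(port=1234):
    """GStreamer pipeline string for cv2.VideoCapture, ending in an appsink."""
    return (
        f'udpsrc port={port} caps="application/x-rtp,media=(string)video,'
        'clock-rate=(int)90000,encoding-name=(string)H264" ! '
        "rtph264depay ! h264parse ! omxh264dec ! "
        "videoconvert ! video/x-raw,format=BGR ! appsink"
    )

if __name__ == "__main__":
    try:
        # assumes OpenCV built with GStreamer support and jetson-inference installed
        import cv2
        import jetson.inference
        import jetson.utils
    except ImportError:
        print("cv2/jetson-inference not available; pipeline string only:")
        print(build_capture_pipeline())
    else:
        net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
        cap = cv2.VideoCapture(build_capture_pipeline(), cv2.CAP_GSTREAMER)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # OpenCV delivers BGR; convert to RGB before uploading to CUDA
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            cuda_img = jetson.utils.cudaFromNumpy(rgb)
            for d in net.Detect(cuda_img):
                if net.GetClassDesc(d.ClassID) == "person":
                    print("person detected, confidence", d.Confidence)
```

The pipeline string could equally be swapped for one that pulls MJPEG from the camera's CGI interface directly (souphttpsrc ! multipartdemux ! jpegdec ! ...), which would remove the RTP hop entirely.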

How do I put the custom GStreamer pipeline into this example? Is there a Python equivalent of the command-line call I am doing?

Hi,

I managed to get it working the way I thought it should. The trick was to add:
! 'video/x-h264, stream-format=(string)byte-stream' !
after the omxh264enc.
Now I have a final issue: the quality is very low (the source does deliver higher resolution). How do I tell omxh264enc to produce better-quality pictures? I tried converting other videos using hardware-accelerated GStreamer as well, and either there was nearly no compression or the quality was severely downgraded. Compared to converting video sources with ffmpeg, I am wondering what the missing parameters are.

However, it would be great if I could use the GStreamer pipeline directly as input in my Python script instead of running two processes.

Hi @konstantin3, you can set the bitrate attribute of the omxh264enc element (for more info, run gst-inspect-1.0 omxh264enc):

 control-rate        : Bitrate control method
                        flags: readable, writable, changeable only in NULL or READY state
                        Enum "GstOMXVideoEncControlRate" Default: 1, "variable"
                           (0): disable          - Disable
                           (1): variable         - Variable
                           (2): constant         - Constant
                           (3): variable-skip-frames - Variable Skip Frames
                           (4): constant-skip-frames - Constant Skip Frames
  bitrate             : Target bitrate
                        flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
                        Unsigned Integer. Range: 0 - 4294967295 Default: 4000000
  peak-bitrate        : Peak bitrate in variable control-rate
                         The value must be >= bitrate
                         (1.2*bitrate) is set by default(Default: 0)
                        flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
                        Unsigned Integer. Range: 0 - 4294967295 Default: 0

I believe the default is 4 Mbps VBR.
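Putting those properties together, a sketch of how the encoder element string might be configured (the numbers are illustrative examples, not recommendations; control-rate=2 is constant bitrate per the enum above, and the 1.2x peak default comes from the same gst-inspect output):

```python
def omxh264enc_settings(bitrate=8_000_000, control_rate=2, peak_bitrate=0):
    """Build an omxh264enc element string with explicit rate control.

    A peak_bitrate of 0 keeps the encoder default of 1.2 * bitrate
    (per the gst-inspect output above); peak-bitrate only applies to
    variable control-rate and must be >= bitrate.
    """
    props = [f"control-rate={control_rate}", f"bitrate={bitrate}"]
    if peak_bitrate:
        if peak_bitrate < bitrate:
            raise ValueError("peak-bitrate must be >= bitrate")
        props.append(f"peak-bitrate={peak_bitrate}")
    return "omxh264enc " + " ".join(props)

# splice the result into the pipeline in place of a bare omxh264enc
print(omxh264enc_settings())
```

Raising bitrate (and, for VBR, peak-bitrate) is the main lever for quality here; omxh264enc does not expose x264-style quality presets.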