I want to stream a live camera feed to an RTMP server using GStreamer on my TX2. I also need to receive video from an RTMP server and use it as input to an app (darknet) via appsink with GStreamer. Both with the lowest possible latency.
Sending video to RTMP
This pipeline works, but there is a delay of multiple seconds:
gst-launch-1.0 -e nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, format=(string)I420' ! queue ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://<host adress>/live live=true'
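To reduce that delay, my plan is to try the encoder's low-latency options and disable clock synchronisation on the sink. This is only a sketch of what I have in mind; the tune=zerolatency, speed-preset, key-int-max and sync=false settings are my own assumptions and I have not verified them on the TX2 yet:

# untested low-latency variant (encoder/sink settings are guesses)
gst-launch-1.0 -e nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, format=(string)I420' ! queue ! x264enc tune=zerolatency speed-preset=ultrafast key-int-max=30 ! flvmux streamable=true ! queue ! rtmpsink sync=false location='rtmp://<host address>/live live=true'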
From here I tried to build on the previous pipeline: split the video with a tee, run one branch through darknet, and send the other branch to an RTMP server. I came up with the following pipeline, but it does not work:
./darknet detector demo data/obj.data cfg/yolov3-tiny-obj.cfg yolov3-tiny-obj.weights "nvcamerasrc ! video/x-raw(memory:NVMM), width=640, height=480, format=(string)I420, framerate=24/1 ! nvvidconv ! video/x-raw,format=(string)I420 ! tee name=t t. ! videoconvert ! video/x-raw,format=BGR ! appsink t. ! queue ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://<host adress>/live live=true'"
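From what I have read, each tee branch needs its own queue, and the appsink branch should come last in the string when OpenCV/darknet opens it. A reworked sketch along those lines (still unverified; the queue placement, zerolatency and sync=false settings are assumptions on my part):

# untested: queue on both tee branches, appsink branch last
./darknet detector demo data/obj.data cfg/yolov3-tiny-obj.cfg yolov3-tiny-obj.weights "nvcamerasrc ! video/x-raw(memory:NVMM), width=640, height=480, format=(string)I420, framerate=24/1 ! nvvidconv ! video/x-raw, format=(string)I420 ! tee name=t t. ! queue ! x264enc tune=zerolatency ! flvmux streamable=true ! queue ! rtmpsink sync=false location='rtmp://<host address>/live live=true' t. ! queue ! videoconvert ! video/x-raw, format=BGR ! appsink"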
Receive video from RTMP
Using the following command I can feed video from RTMP into darknet. The problem is that I get a continuously increasing delay on the input stream.
./darknet detector demo data/obj.data cfg/yolov3-tiny-obj.cfg yolov3-tiny-obj.weights rtmp://<host adress>/live/stream
I also tried the following, but it does not work either:
./darknet detector demo data/obj.data cfg/yolov3-tiny-obj.cfg yolov3-tiny-obj.weights "rtmpsrc location='rtmp://<host adress>/live live=true' ! flvdemux ! video/x-raw(memory:NVMM) ! omxh264dec ! 'video/x-raw, format=(string)I420' ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink"
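If I understand the caps correctly, flvdemux outputs video/x-h264 rather than video/x-raw(memory:NVMM), the stream needs an h264parse before omxh264dec, the decoder output has to go through nvvidconv to get out of NVMM memory, and the inner quotes around the caps filters should be dropped inside the string. A corrected sketch I intend to try (unverified; the appsink drop/max-buffers/sync settings are my guess at curing the growing delay as well):

# untested: parse + decode + copy out of NVMM, drop stale frames at the appsink
./darknet detector demo data/obj.data cfg/yolov3-tiny-obj.cfg yolov3-tiny-obj.weights "rtmpsrc location='rtmp://<host address>/live/stream' ! flvdemux ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink drop=true max-buffers=1 sync=false"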
If anybody knows any other working low-latency solution, that would also be deeply appreciated.