ffmpeg rtmp streaming low fps

Hi all,

I have a USB camera connected to my Jetson TX2, and I'm using OpenCV to read frames from it and then push them to a remote server with ffmpeg over the RTMP protocol. The code works and I can read the RTMP stream on another desktop. But when reading the stream there is a big lag and the fps is very low (the metadata says 5 fps). I ran the same code on my desktop and checked the result: high fps with negligible lag. When pushing the stream, the TX2 uses only about 1/4 of the bitrate compared to the desktop.

This is the code for pushing the stream:

import cv2
import subprocess as sp

rtmpUrl = 'rtmp://xxxxxxxxxxxxxxxxxx'

cap = cv2.VideoCapture(1)
if cap.isOpened():
    print('camera opened')
else:
    print('Fail to open camera')

cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

sizeStr = '640x480'
fps = int(cap.get(cv2.CAP_PROP_FPS))
if fps <= 0:
    fps = 30  # some USB cameras report 0; fall back to a default

command = ['ffmpeg',
    '-y',
    '-an',
    '-f', 'rawvideo',
    '-vcodec','rawvideo',
    '-pix_fmt', 'bgr24',
    '-s', sizeStr,
    '-r', str(fps),
    '-i', '-',
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',
    '-preset', 'ultrafast',
    '-f', 'flv',
    rtmpUrl]

pipe = sp.Popen(command, stdin=sp.PIPE)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    pipe.stdin.write(frame.tobytes())  # tostring() is deprecated in NumPy

And the terminal command for viewing the stream:

ffplay -fflags nobuffer rtmp://xxxxxxxxxxxxxxxxxxxxxxxx

Any suggestions?

Hi,
ffmpeg here is doing software encoding. You may execute 'sudo jetson_clocks' to run the CPU at max clocks.

Hardware acceleration is supported in gstreamer and tegra_multimedia_api. I suggest you switch to hardware encoding.
https://developer.nvidia.com/embedded/dlc/l4t-multimedia-api-reference-32-1

You may refer to these posts:
https://devtalk.nvidia.com/default/topic/1048291/jetson-tx2/gstreamer-pipeline-framerate-degrading-over-time/post/5320275/#5320275
https://devtalk.nvidia.com/default/topic/1023943/
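If OpenCV on your TX2 is built with GStreamer support, you can keep grabbing frames in Python and hand them to the hardware encoder through a cv2.VideoWriter. A rough sketch below; the element names (omxh264enc, flvmux) are assumptions and vary by L4T release, and the RTMP URL is the placeholder from your post:

```python
def gst_rtmp_pipeline(url, width=640, height=480, fps=30):
    """Hypothetical GStreamer pipeline string: appsrc receives BGR frames
    from OpenCV, which are converted to I420, encoded by the Jetson
    hardware encoder, and muxed to FLV for RTMP."""
    return (
        'appsrc ! video/x-raw,format=BGR ! videoconvert ! '
        'video/x-raw,format=I420 ! '
        'omxh264enc ! h264parse ! flvmux ! '
        'rtmpsink location=%s' % url
    )

if __name__ == '__main__':
    import cv2
    url = 'rtmp://xxxxxxxxxxxxxxxxxx'  # placeholder from the post
    cap = cv2.VideoCapture(1)
    # fourcc 0 + CAP_GSTREAMER tells OpenCV to treat the string as a pipeline
    writer = cv2.VideoWriter(gst_rtmp_pipeline(url),
                             cv2.CAP_GSTREAMER, 0, 30.0, (640, 480))
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        writer.write(frame)  # encoding happens in the HW encoder, not the CPU
```

This keeps the OpenCV capture loop (and your deep learning models) unchanged while moving H.264 encoding off the CPU.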

Thank you for your quick reply, I'll take a look.

Hi DaneLLL,

I have read about gstreamer and the multimedia API, and could not find a way to first grab frames with OpenCV and then stream over RTMP. My CPU usage is around 60%, which is not very high. GPU usage is always high because two deep learning models are running. So I'd still prefer to use ffmpeg.

I can currently stream over RTMP at 30fps using the following bit of code:

def show(self):
        global frame
        cap = cv2.VideoCapture(1)
        cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
        cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

        command = ['ffmpeg',
                   '-an',
                   '-f', 'rawvideo',
                   '-vcodec', 'rawvideo',
                   '-pix_fmt', 'bgr24',
                   '-s', '300x300',
                   '-i', '-',
                   '-preset', 'ultrafast',
                   '-f', 'flv',
                   '-q', '0',
                   self.rtmpUrl]

        pipe = sp.Popen(command, stdin=sp.PIPE)
        while True:
            rett, img = cap.read()
            if not rett:
                print('camera failed')
                break
            frame = img[90:390, 170:470, :]  # 300x300 crop, matches -s 300x300
            pipe.stdin.write(frame.tobytes())  # tostring() is deprecated

This works when it is the only thread running. I also tried streaming from two cameras simultaneously with this code, and both streams hold steady at 30fps without any problem. However, when I run the other functionalities alongside it, the stream starts at 30fps, quickly drops to around 15fps, and holds at that rate. While streaming at 15fps, I found out that

pipe.stdin.write()

is taking up extra time. Running

sudo /usr/bin/jetson_clocks

does not improve the fps.

I first suspected that the use of the global variable limited the fps, but then I tested the exact same code on my desktop and it streams at a steady 30fps with all the other functionalities running. When running everything on the TX2, CPU usage is around 60%, so I would assume the CPU is not limiting the performance.
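One thing I'm considering is decoupling capture from the blocking pipe write, so that a slow ffmpeg encoder can't stall the capture loop. A rough sketch with a background writer thread; the queue size and the drop-oldest policy are my own guesses, not something I've measured:

```python
import queue
import threading


class FrameWriter:
    """Write frames to ffmpeg's stdin from a background thread so a slow
    pipe.stdin.write() cannot block the capture/inference loop. When the
    queue is full, the oldest frame is dropped (an assumed policy)."""

    def __init__(self, pipe, maxsize=4):
        self.pipe = pipe
        self.q = queue.Queue(maxsize=maxsize)
        self.t = threading.Thread(target=self._run, daemon=True)
        self.t.start()

    def _run(self):
        while True:
            data = self.q.get()
            if data is None:  # sentinel: shut down
                break
            self.pipe.stdin.write(data)

    def write(self, frame_bytes):
        try:
            self.q.put_nowait(frame_bytes)
        except queue.Full:
            try:
                self.q.get_nowait()  # drop the oldest frame
            except queue.Empty:
                pass
            self.q.put_nowait(frame_bytes)

    def close(self):
        self.q.put(None)
        self.t.join()
```

The capture loop would then call `writer.write(frame.tobytes())` instead of `pipe.stdin.write(...)`, trading occasional dropped frames for a capture loop that never stalls.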

Any idea what might cause this weird behavior?

Many thanks!