Problems with the image resolution of an RTSP stream

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 6.1.1
Hi, I am trying to use an RTSP stream as the input source of DeepStream, but the image seems a little bit strange, with some mosaic:


The image looks fine when I use cv2.imshow to display it:

As you can see, it's clearer.
By the way, the RTSP stream comes from the ffmpeg tool, and the color_image displayed by cv2.imshow is the input image of ffmpeg.
So maybe there is some image quality loss when I transform it into an RTSP stream? Or is there any way to improve this situation?

Which sample are you testing? What is the media pipeline? Could you use the latest DS7.1 to reproduce this issue? Thanks! If the issue remains on DS7.1, could you share the detailed reproducing steps?

I am testing the deepstream-app sample. What do you mean by media pipeline?
I may need to spend more time familiarizing myself with the DS7.1 code, so if possible, I wish it could be solved in this version.

If using deepstream-app, could you share the configuration file of deepstream-app? Wondering what plugins are used in your application.
To reproduce this issue on my side, could you share how you set up the RTSP server? And please share the source video. Thanks!

The config file:
deepstream_app_config.txt (1.1 KB)
The rtsp server is based on ffmpeg:

ffmpeg_cmd = [
    'ffmpeg',
    '-y',  # overwrite output files automatically
    '-f', 'rawvideo',
    '-pix_fmt', 'bgr24',
    '-s', '640x480',
    '-r', '30',  # frame rate
    '-i', '-',  # input comes from stdin
    '-f', 'rtsp',
    'rtsp://127.0.0.1:8554/test.stream'  # RTSP server address
]
# start the FFmpeg subprocess
ffmpeg_process = subprocess.Popen(ffmpeg_cmd, stdin=subprocess.PIPE)
try:
    while not rospy.is_shutdown():
        # wait for a coherent pair of frames: depth and color
        intr, color_image, depth_image, aligned_depth_frame, fps_value = get_aligned_images()  # get the aligned images and camera intrinsics
        # camera intrinsics, color image, depth image, depth frame of the aligned pair, fps value
        if not depth_image.any() or not color_image.any():
            continue
        ffmpeg_process.stdin.write(color_image.tobytes())

The main code is above. The color_image comes from the camera's SDK, so I can't share the source video; it is just the image from the camera.

What is the RTSP server software? ffserver? Based on your cfg, after testing with a local file, I can't reproduce this issue on Orin with DS7.1. Please refer to the screenshot in vlc.zip.
deepstream_app_config_cus.txt (1.6 KB) vlc.zip (364.0 KB)

  1. Could you share the media information of the rtsp stream with this cmd or another cmd?
ffprobe -i rtsp://127.0.0.1:8554/test.stream
  2. Could you record some stream data with this cmd? Wondering if the data DeepStream got is fine.
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! mux. mpegtsmux name=mux ! filesink location=output.ts
  3. Could you share the b.nv12 after running the following cmd? Wondering if it is an nvvideoconvert issue.
    c.jpeg.zip (195.2 KB)
gst-launch-1.0 filesrc location=c.jpeg ! jpegdec ! nvvideoconvert ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080' ! nvvideoconvert ! 'video/x-raw(memory:NVMM), format=NV12, width=640, height=480' ! nvvideoconvert ! 'video/x-raw' ! filesink location=b.nv12

I am using mediamtx as the RTSP server.
1. For command 1:

ffprobe version 4.2.7-0ubuntu0.1 Copyright (c) 2007-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --arch=arm64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
Input #0, rtsp, from 'rtsp://127.0.0.1:8554/test.stream':
Metadata:
title : No Name
Duration: N/A, start: 0.097389, bitrate: N/A
Stream #0:0: Video: mpeg4 (Simple Profile), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 30 tbr, 90k tbn, 30 tbc

For command 2, I added .stream to the command:
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test.stream ! rtph264depay ! h264parse ! mux. mpegtsmux name=mux ! filesink location=output.ts
The output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test.stream
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
WARNING: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Delayed linking failed.
Additional debug info:
./grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
failed delayed linking some pad of GstRTSPSrc named rtspsrc0 to some pad of GstRtpH264Depay named rtph264depay0
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-linked (-1)
Execution ended after 0:00:00.232187214
Setting pipeline to NULL ...
Freeing pipeline ...

And output.ts seems to contain nothing but blank space.
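The not-linked failure above is consistent with the earlier ffprobe output, which reports the stream as mpeg4 (MPEG-4 Part 2) rather than H.264, so rtph264depay has no compatible pad to link to. A recording pipeline that matches that codec might look like the following sketch (assuming the rtpmp4vdepay and mpeg4videoparse elements from gst-plugins-good/bad are installed):

```shell
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test.stream ! \
  rtpmp4vdepay ! mpeg4videoparse ! mpegtsmux ! filesink location=output.ts
```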
For command 3:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
nvbufsurface: NvBufSurfaceCopy: mem copy failed
nvbufsurface: NvBufSurfaceCopy: failed to copy
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.000792309
Setting pipeline to NULL ...
Freeing pipeline ...

The file:
b.nv12.zip (632 Bytes)
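The size of b.nv12 already hints at the NvBufSurfaceCopy failure above: one 640x480 NV12 frame should be width * height * 3/2 bytes, far larger than the 632-byte file. A quick sanity check of the expected size (frame dimensions taken from the pipeline caps):

```python
def nv12_frame_size(width: int, height: int) -> int:
    """NV12 stores a full-resolution Y plane plus a half-resolution
    interleaved UV plane, i.e. 1.5 bytes per pixel."""
    return width * height * 3 // 2

expected = nv12_frame_size(640, 480)
print(expected)  # 460800 bytes per frame

# A dump much smaller than one frame means no frame was actually written:
# the 632-byte b.nv12 above cannot contain even a single 640x480 frame.
```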

From the log, the video format of the rtsp source is mpeg4. What I understand is: the rospy interface gets raw data, then ffmpeg sends the raw data to mediamtx, then mediamtx encodes the raw data to an mpeg4 stream. Right?

  1. Are you using cv2.imshow to play the raw data, which is sent by ffmpeg to mediamtx? AYK, mpeg4 is lossy encoding; the image quality of the encoded stream is worse than the raw data. I suggest using h264 encoding, which has better encoding performance.
  2. Could you share the output.ts after running the following cmd? Wondering if the data DeepStream got is fine.
ffmpeg -i rtsp://127.0.0.1:8554/test.stream -c copy output.ts

Exactly, it has nothing to do with rospy; it's just a loop condition.

Yes, I use cv2.imshow to play the raw data, which is sent by ffmpeg to mediamtx. I will try to use h264 encoding, thanks.

ffmpeg -i rtsp://127.0.0.1:8554/test.stream  -c copy output.ts

for your command:
output.ts.zip (174.0 KB)

Thanks for the update! I checked output.ts.zip, but I can't see any video. Maybe it is too short. Can you provide a longer one? Is there mosaic after running the following cmd? Please share a screenshot. Thanks!

ffplay rtsp://127.0.0.1:8554/test.stream 

I made it longer:
output.ts.zip (1.1 MB)
For the screenshot:


There is mosaic.

Yes, the input data DeepStream received has bad image quality. It is an issue of the rtsp encoding. You can use the H264 encode type with a high bitrate.
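Applied to the ffmpeg_cmd from earlier in the thread, that suggestion amounts to adding an explicit encoder and bitrate on the output side of the command; a sketch (the 4M bitrate is an assumed value, not from the thread):

```python
ffmpeg_cmd = [
    'ffmpeg',
    '-y',
    '-f', 'rawvideo',
    '-pix_fmt', 'bgr24',   # input side: raw BGR frames piped in on stdin
    '-s', '640x480',
    '-r', '30',
    '-i', '-',
    '-c:v', 'libx264',     # explicit H.264 encoder instead of the mpeg4 default
    '-b:v', '4M',          # assumed bitrate; tune for your quality target
    '-f', 'rtsp',
    'rtsp://127.0.0.1:8554/test.stream'
]
```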

Thanks, I will try that later.

Hi, I used the config:
'-c:v', 'libx264'
but when I put the stream into deepstream, there is an error:

NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
Stream format not found, dropping the frame
Stream format not found, dropping the frame
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
ERROR from nvv4l2decoder0: Failed to process frame.
Debug info: /dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2videodec.c(1815): gst_v4l2_video_dec_handle_frame (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0:
Maybe be due to not enough memory or failing driver
ERROR from nvv4l2decoder0: Failed to process frame.
Debug info: /dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2videodec.c(1815): gst_v4l2_video_dec_handle_frame (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0:
Maybe be due to not enough memory or failing driver
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
Quitting
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
ERROR from nvv4l2decoder0: Failed to process frame.
Debug info: /dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2videodec.c(1815): gst_v4l2_video_dec_handle_frame (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0:
Maybe be due to not enough memory or failing driver
ERROR from nvv4l2decoder0: Failed to process frame.
Debug info: /dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2videodec.c(1815): gst_v4l2_video_dec_handle_frame (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0:
Maybe be due to not enough memory or failing driver
ERROR from nvv4l2decoder0: Failed to process frame.
Debug info: /dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2videodec.c(1815): gst_v4l2_video_dec_handle_frame (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem/GstDecodeBin:decodebin0/nvv4l2decoder:nvv4l2decoder0:
Maybe be due to not enough memory or failing driver
NVMEDIA: NvMMLiteNVMEDIADecDoWork: 2228: NVMEDIA Video Dec Unsupported Stream 
NVMEDIA: NvMMLiteNVMEDIAProcessVES: 1824: NvMediaParserParse Unsupported Codec 
App run failed

From the error, it is because decoding failed.

  1. Can you share the result of "ffprobe -i rtsp://127.0.0.1:8554/test.stream" again? Wondering what the current encoding type is.
  2. Can you see normal video after running "ffplay rtsp://127.0.0.1:8554/test.stream"?
  3. Noticing you set h264, could you share output1.ts by running "gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! mux. mpegtsmux name=mux ! filesink location=output1.ts"? Let me check if it can be decoded.
1. For command 1:

Input #0, rtsp, from 'rtsp://127.0.0.1:8554/test.stream':
Metadata:
title : No Name
Duration: N/A, start: -0.033333, bitrate: N/A
Stream #0:0: Video: h264 (High 4:4:4 Predictive), yuv444p(progressive), 640x480, 30 fps, 30 tbr, 90k tbn, 60 tbc

2. For command 2:
Yes, but with great latency in the video.

3. For command 3:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test.stream
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

output1.ts.zip (33.3 MB)

After playing output1.ts with ffplay, there is no mosaic. But playing it with the following cmd, without any NV plugin, also failed.

gst-launch-1.0 filesrc location=output1.ts ! h264parse ! avdec_h264 ! fakesink

It should be related to the encoding parameters. If using libx264, please adapt the parameters.
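One parameter worth adapting: the ffprobe output above shows High 4:4:4 Predictive / yuv444p, which libx264 tends to choose when fed bgr24 input, and which hardware H.264 decoders such as nvv4l2decoder generally do not support. Forcing 4:2:0 chroma on the output side is a plausible fix; a sketch (the bitrate value is an assumption):

```python
ffmpeg_cmd = [
    'ffmpeg',
    '-y',
    '-f', 'rawvideo',
    '-pix_fmt', 'bgr24',    # input side: raw BGR frames from stdin
    '-s', '640x480',
    '-r', '30',
    '-i', '-',
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',  # output side: force 4:2:0 so the profile
                            # becomes High instead of High 4:4:4
    '-b:v', '4M',           # assumed bitrate; tune as needed
    '-f', 'rtsp',
    'rtsp://127.0.0.1:8554/test.stream'
]
```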

Thanks, I will try to adapt it.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.