DeepStream can't decode certain H265 streams properly

Hello,

I am using DeepStream 6.2 with a Tesla T4 and the official DeepStream 6.2 docker container, and I found a few issues when trying to decode H265 streams.

In particular, I am testing 3 different ways to decode the H265 streams:

  • avdec_h265 (CPU)
  • uridecodebin (GPU)
  • nvv4l2decoder (GPU)

I am testing them on two streams. None of the decoders above works on both streams: each of them shows corrupted images on at least one of the two. The corrupted images sometimes have green artifacts on them, and sometimes they are just pixelated.

To help you reproduce the issue, I am attaching a zip file that contains everything you need:

  • Dockerfile
  • Python code

Once you unpack the zip file, simply run make build and make start to build and start the docker image. At the end of the execution, you will find a folder called “shared” in your current directory. The folder will contain the images saved while decoding each of the two videos with each of the decoders.
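
For reference, the make targets are thin wrappers; they roughly run commands like the following (the image tag and mount path here are placeholders, the actual commands are in the zip):

docker build -t deepstream-h265-repro .
docker run --rm --gpus all \
    -v "$(pwd)/shared:/shared" \
    deepstream-h265-repro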

I have also made two video streams available that you can use to reproduce the issue; I will be happy to share their URLs with you via private message. I will also send you examples of the script’s output so that you can see the corrupted images.

The docker image is based on DeepStream 6.2. When the docker container is started, the entrypoint start.sh runs the script main.py. This script iterates over the two streams: it tries to decode each stream with the 3 video decoders (uridecodebin, nvv4l2decoder, avdec_h265) and saves some frames in the shared folder.

The two video streams that I will share with you are:

  • h265_test: this stream has issues with avdec_h265 (green artifacts or pixelation) and uridecodebin (green artifacts or pixelation)
  • sample_720_h265: this stream has issues with avdec_h265 (green artifacts or pixelation) and nvv4l2decoder (severe pixelation)

Note that the results are not deterministic, so you may have to run the docker container a few times. However, I almost always get either green artifacts or pixelation on the images.

That said, I am able to view both video streams correctly using VLC.

sample_nvidia.zip (14.1 KB)

Let’s narrow down the scope of this issue. Could you attach the pipeline as a gst-launch-1.0 command and share the video sources with us?

Hello, I am working on the same project as @mfoglio. Below are the gst pipelines for each decoding method we use, along with a short description of the issues we are seeing with each decoder and which stream is affected. They are written as a shell script.

As for the source URLs, I will DM them to you, as we would like to keep the number of people with access to them as small as possible.

CPU

Causes half of the screen in h265_test to show green corruption (solid green). Causes sample_720_h265 to show severe pixelated corruption as well as green color corruption.

gst-launch-1.0 rtspsrc location="RTSP_URL" protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! \
rtph265depay ! \
h265parse ! \
avdec_h265 output-corrupt=0 ! \
queue leaky=2 max-size-buffers=1 ! \
nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! \
capsfilter caps="video/x-raw(memory:NVMM),format=RGBA" ! \
nvvideoconvert ! \
capsfilter ! \
jpegenc ! \
multifilesink location=images_test/image_%06d.jpg

GPU CUDA

Works with h265_test. Corrupts sample_720_h265 with pixelation.

gst-launch-1.0 rtspsrc location="RTSP_URL" protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! \
rtph265depay ! \
h265parse ! \
nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! \
queue leaky=2 max-size-buffers=1 ! \
nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! \
capsfilter caps="video/x-raw(memory:NVMM),format=RGBA" ! \
nvvideoconvert ! \
capsfilter ! \
jpegenc ! \
multifilesink location=images_test/image_%06d.jpg

URIDECODEBIN

Works with sample_720_h265; shows green corruption with h265_test.

gst-launch-1.0 uridecodebin uri="RTSP_URL" ! \
queue leaky=2 max-size-buffers=1 ! \
nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! \
capsfilter caps="video/x-raw(memory:NVMM),format=RGBA" ! \
nvvideoconvert ! \
capsfilter ! \
jpegenc ! \
multifilesink location=images_test/image_%06d.jpg

Thank you for your help!

From the description, this may be more closely related to GStreamer’s RTSP handling: packets may be dropped due to network factors. Are the scenarios you described unavoidable?
You can simply save the H265 stream to a file to verify that:

gst-launch-1.0 <your different source> ! \
rtph265depay ! \
h265parse ! 'video/x-h265,stream-format=byte-stream' !  \
filesink location=test.h265
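
You can then decode the saved file locally to check whether the corruption is already in the stream itself, for example with the software decoder (just a sketch, adjust the sink and paths as needed):

gst-launch-1.0 filesrc location=test.h265 ! \
h265parse ! \
avdec_h265 ! \
videoconvert ! \
jpegenc ! \
multifilesink location=dump_check_%06d.jpg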

Thank you for your reply,

I do not think this is an issue with the connection. My main reasoning is that different decoders work perfectly fine on the same live streams we provide. As shown in our sample above, nvv4l2decoder works fine with the live stream labeled h265_test but corrupts the live stream labeled sample_720_h265, while uridecodebin works with sample_720_h265 but corrupts h265_test.

As for the scenarios being unavoidable: yes, they are. We are building this project to work with live streams and will not be using it to analyze prerecorded videos or files.

It may be an issue with IDR frames or with the I-frame interval. Could you tell us how your server is set up?
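
If you have a dump of the stream (e.g. the test.h265 from the pipeline above) and ffmpeg is installed, you can check the frame types and how often I frames appear with something like:

# prints one pict_type entry (I/P/B) per frame; the spacing of the I entries shows the I-frame interval
ffprobe -v error -select_streams v:0 -show_frames -show_entries frame=pict_type -of csv test.h265 | head -200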

Hey @yuweiw , we don’t have information about how the server is set up. But I can tell you that we have the same issue when connecting directly to the camera.

I also tried using the same pipelines in DeepStream 6.4.
Interestingly, results were different for the nvv4l2decoder and uridecodebin pipelines, while results were the same for avdec_h265.

Here’s a table that summarizes my findings:

Screenshot 2023-12-19 at 6.22.30 PM

I tried that with DeepStream 6.4. uridecodebin with h265_test works well, but rtspsrc with h265_test results in a pipeline freeze. I downloaded the h265 stream with the pipeline I attached and used filesrc instead; both work well that way.
So this may be because your RTSP source and the rtspsrc plugin don’t match well. You can adjust your rtspsrc pipeline based on the pipeline I attached below.
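
The filesrc check was along these lines (a sketch; the exact properties may differ slightly from what I ran):

gst-launch-1.0 filesrc location=test.h265 ! \
h265parse ! \
nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! \
nvvideoconvert ! \
fpsdisplaysink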

Hello @yuweiw , thank you for your reply.
Could you attach a pdf of the pipeline or a higher resolution image? I can’t read the text in the image. Also, what is that pipeline exactly?

Please note that with DeepStream 6.4 and uridecodebin, h265_test can be played, but you’ll notice that most (if not all) of the frames suffer from bad pixelation. This does not happen when the video is decoded correctly (for instance, DeepStream 6.2 with nvv4l2decoder). I repeated this test multiple times and always got pixelated images. Could you double check?

The pipeline is similar to yours:

gst-launch-1.0 uridecodebin uri=xxx !  \
queue leaky=2 max-size-buffers=1 ! \ 
nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! \
capsfilter caps="video/x-raw(memory:NVMM),format=RGBA" ! \
nvvideoconvert ! nvv4l2h265enc ! h265parse ! \
'video/x-h265,stream-format=byte-stream' ! filesink location=test.h265

You can refer to the FAQ to generate the higher-resolution pipeline graph yourself. Judging from the generated H265 file, there are no pixelated images in the video.
test.h265 (3.5 MB)

@mfoglio @bkh1
There are “corrupted” and “freeze” issues when testing nvv4l2decoder in DS 6.4, while testing in DS 6.2 is OK. After checking: DS 6.4 uses GStreamer 1.20.3 while DS 6.2 uses 1.16.3, and there are big modifications in the newer h265parse plugin, so nvv4l2decoder receives different packets. When I rebuilt the .so by porting the 1.16.3 code to 1.20.3, the issue was fixed.

Here is my new .so: libgstvideoparsersbad.so (1.2 MB). You can also build it yourself (copy the h265parse code to 1.20.3, then rebuild gst-plugins-bad), roughly as sketched below.
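
Roughly, the rebuild looks like this (a sketch; the tarball URLs and paths are assumptions, and the copied 1.16.3 sources may need small adaptations to compile against 1.20):

# grab both gst-plugins-bad releases: 1.16.3 for the old h265parse, 1.20.3 to build against
wget https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.16.3.tar.xz
wget https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.20.3.tar.xz
tar xf gst-plugins-bad-1.16.3.tar.xz && tar xf gst-plugins-bad-1.20.3.tar.xz

# copy the 1.16.3 h265parse sources over the 1.20.3 tree
cp gst-plugins-bad-1.16.3/gst/videoparsers/gsth265parse.c \
   gst-plugins-bad-1.16.3/gst/videoparsers/gsth265parse.h \
   gst-plugins-bad-1.20.3/gst/videoparsers/

# rebuild with meson/ninja; the plugin ends up under build/gst/videoparsers/
cd gst-plugins-bad-1.20.3
meson setup build
ninja -C build
ls build/gst/videoparsers/libgstvideoparsersbad.so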

Here is the test method:

mv /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstvideoparsersbad.so /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstvideoparsersbad.so_ori
mv libgstvideoparsersbad.so /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstvideoparsersbad.so
gst-launch-1.0 rtspsrc location=rtsp://xxx protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! rtph265depay ! h265parse ! nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! nvvideoconvert ! fpsdisplaysink

Hi @fanzh , thank you for your help.
Let’s stick to DeepStream 6.2 for now.
Your pipeline only works with one of the two video streams; in fact, it decodes the video using the same pipeline that I am using. The only difference between my pipeline and yours is the way the video is saved to disk.
As I wrote in a previous post here, the uridecodebin pipeline works only with the sample_720_h265 stream and not with h265_test in DeepStream 6.2. The same is true of your pipeline; you can verify this by running it on the h265_test stream.
You can find the output attached.
test.h265.zip (2.6 MB)

About the DeepStream 6.4 libgstvideoparsersbad.so fix: I can’t get it to run.
I start the container like this:

sudo docker run \
    -it \
    --rm \
    --net=host \
    --ipc=host \
    --gpus all \
    --shm-size 10G \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix/:/tmp/.X11-unix \
    -v /home/ubuntu/pycharm/projects/fususcore-ai-detector-deepstream/shared:/shared \
    --cap-add=SYS_PTRACE \
    --security-opt seccomp=unconfined \
    --device /dev/video0 \
    --privileged \
    --expose 8554 \
    nvcr.io/nvidia/deepstream:6.4-gc-triton-devel \
    bash

Then I replace the libgstvideoparsersbad.so with yours:

cd /usr/lib/x86_64-linux-gnu/gstreamer-1.0/
rm libgstvideoparsersbad.so
wget https://forums.developer.nvidia.com/uploads/short-url/z7mMwapBKqmlmibUZWW989DKWsZ.so -o libgstvideoparsersbad.so

When I run the pipeline, I get the following error:

root@ip-172-31-9-127:/usr/lib/x86_64-linux-gnu/gstreamer-1.0#     gst-launch-1.0 rtspsrc location=${rtsp_url} protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! \
      rtph265depay ! \
      h265parse ! \
      nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! \
      queue leaky=2 max-size-buffers=1 ! \
      nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! \
      capsfilter caps="video/x-raw(memory:NVMM),format=RGBA" ! \
      nvvideoconvert ! \
      capsfilter ! \
      jpegenc ! \
      multifilesink location=$output_folder/$video_name/nvv4l2decoder/image_%06d.jpg
      

(gst-plugin-scanner:114): GStreamer-WARNING **: 16:52:20.416: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstvideoparsersbad.so': /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstvideoparsersbad.so: invalid ELF header

(gst-plugin-scanner:114): GStreamer-WARNING **: 16:52:20.440: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.1: cannot open shared object file: No such file or directory

(gst-plugin-scanner:118): GStreamer-WARNING **: 16:52:20.691: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstchromaprint.so': libavcodec.so.58: cannot open shared object file: No such file or directory

(gst-plugin-scanner:118): GStreamer-WARNING **: 16:52:20.693: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstmpeg2enc.so': libmpeg2encpp-2.1.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:118): GStreamer-WARNING **: 16:52:20.699: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstmpg123.so': libmpg123.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:118): GStreamer-WARNING **: 16:52:20.721: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstopenmpt.so': libmpg123.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:118): GStreamer-WARNING **: 16:52:20.722: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstmpeg2dec.so': libmpeg2.so.0: cannot open shared object file: No such file or directory
WARNING: erroneous pipeline: no element "h265parse"

Do I need to compile something? If so, how? Can you share the code that you used to rebuild libgstvideoparsersbad.so?

Please note that I am not interested in running the pipeline in DeepStream 6.4 for its own sake; if we find a fix for DeepStream 6.2, that is fine as well for now. My goal is to decode the H265 video streams above. As explained in my previous post, the pipeline you provided here does not work with the sample_720_h265 video stream in DeepStream 6.2. Did you have a chance to try both video streams in DeepStream 6.4 with your fix?

Thank you for your help

This command line is not correct; please download the file by clicking the link instead. I validated it in the nvcr.io/nvidia/deepstream:6.4-triton-multiarch docker.
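
If you want to fetch it from the command line instead, note that wget’s lowercase -o only redirects the log to that file (which is why the saved file had an invalid ELF header); uppercase -O writes the downloaded content, assuming the forum serves the attachment directly:

wget -O libgstvideoparsersbad.so \
  https://forums.developer.nvidia.com/uploads/short-url/z7mMwapBKqmlmibUZWW989DKWsZ.so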

OK, let’s stick to DeepStream 6.2 for now. We will check and then get back to you.

In the DS 6.2 (nvcr.io/nvidia/deepstream:6.2-devel) container, I tested the RTSP streams sample_720_h265 and h265_test with the following pipeline, which was mentioned in my last comment.

gst-launch-1.0 rtspsrc location=rtsp://xxx protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! rtph265depay ! h265parse ! nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! nvvideoconvert ! fpsdisplaysink

The pipeline worked very well, with no green corruption at all; please refer to the attached screenshot.zip (562.0 KB).
Can you double check? uridecodebin is an open-source GStreamer plugin; you can dump uridecodebin’s pipeline and compare the two pipelines to narrow down this issue.
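
For example, assuming Graphviz is installed in the container, you can let GStreamer dump the pipeline graphs and render them:

# dump .dot graphs of the pipeline at each state change into /tmp
export GST_DEBUG_DUMP_DOT_DIR=/tmp
gst-launch-1.0 uridecodebin uri=rtsp://xxx ! nvvideoconvert ! fpsdisplaysink
# render the dumped graphs to PNG (one output per .dot file)
dot -Tpng -O /tmp/*.dot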

Hi @fanzh,

Could you try repeating your test on sample_720_h265 and saving a video?
I edited your last pipeline to save a video instead of displaying it, since I am working on a remote machine and cannot display video:

gst-launch-1.0 rtspsrc location=${rtsp_url} protocols="tcp" latency=1000 drop-on-latency=1 timeout=5000000 debug=1 ! \
  rtph265depay ! \
  h265parse ! \
  nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! \
  nvvideoconvert ! \
  capsfilter ! \
  nvv4l2h265enc ! \
  h265parse ! \
  'video/x-h265,stream-format=byte-stream' ! \
  filesink location=$output_folder/$video_name/nvv4l2decoder/video.mp4

This is the file that was generated:
nvidia_cleaned_container.mp4.zip (29.7 MB)

At second 13 there is some heavy pixelation.

I ran this pipeline a few times. Every time I get a different output, in the sense that the pixelation appears at different timestamps. However, if you record at least 60 seconds of video, you should always be able to see some pixelation, according to my tests.

Thank you

Please note that the pixelation described in my last post also happens when running DeepStream on the same machine that generates the video stream. Based on this, we can rule out connection/bandwidth issues.
This is also confirmed by the fact that I can play the remote stream on my own laptop, without pixelation issues, using VLC.

  1. In DS 6.2, I made a video recording with a third-party tool while testing the pipeline on Dec 29: two-record.zip (46.1 MB). At least we can confirm there is no decoding issue.
  2. After encoding to h265, there is some pixelation/mosaic in the output video; it seems to be an encoding issue. We will continue to check. If you use jpegenc to encode to JPG, does the pixelation still appear?

Hi @fanzh , thank you for your support.
Yes, when using jpegenc the pixelation still appears.
As a matter of fact, I originally noticed the pixelation for the first time when using Python and retrieving the frames with pyds.get_nvds_buf_surface(...). Both facts make me think it is not an encoding issue.
Let me know if you need any other information to debug the issue.
Thank you