Unable to start YOLOv8 DeepStream with MJPEG AVI

Thanks for your support!

As you might have noticed, and as I tried to show, I recreated the video from JPEGs using:

ffmpeg -framerate 30 -pattern_type glob -i './images/*.jpg' -c:v libx264 -pix_fmt yuvj420p output_video.mp4
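To double-check what actually ended up in the container before blaming DeepStream, a plain ffprobe call is enough (nothing here is specific to my setup):

ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,pix_fmt,width,height,avg_frame_rate -of default=noprint_wrappers=1 output_video.mp4

Since the file was encoded with libx264, it should report codec_name=h264.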

While waiting, I tried to create the playable pipeline below and saw that the avdec_h264 decoder wasn't there. I also took the time to learn more about GStreamer pipelines. So I tried this:

gst-launch-1.0 filesrc location=output_video.mp4 ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! avdec_h264 ! nveglglessink -e
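As a side note, the quickest way to see whether that decoder is registered at all is a plain gst-inspect call; if the libav plugin did not load, it simply reports that no such element or plugin exists:

gst-inspect-1.0 avdec_h264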

I got it to work after finding out that gst-inspect could not find the avdec_h264 decoder. After some digging I found the fix below. So I did this:

export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1

and reloaded my bash configuration:

source ~/.bashrc
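An alternative, if you would rather not keep that in ~/.bashrc, is to set the variable only for the one command, e.g. for the same playback pipeline as above:

LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1 gst-launch-1.0 filesrc location=output_video.mp4 ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! avdec_h264 ! nveglglessink -e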

then I could run the video in console with gst-launch. Problem still remains regarding the deepstream-test1.py, or I have not had time to test. I will see if I can modify the pipeline so that the input is parsed properly as describe by you.! If you make it work though with three streams I would be happy as a penguin on ice.
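In the meantime, this is roughly the shape of the pipeline I plan to prototype from the console before touching deepstream-test1.py. Treat it as a sketch: the nvstreammux width/height/batch-size values are arbitrary, config-file-path is a placeholder that has to point at your own YOLOv8 nvinfer config, and nvegltransform is there because this is a Jetson (on dGPU, or depending on the DeepStream version, it may not be needed):

# your_yolov8_pgie_config.txt is a placeholder for your own nvinfer config file
gst-launch-1.0 filesrc location=output_video.mp4 ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! nvv4l2decoder ! mux.sink_0 \
  nvstreammux name=mux batch-size=1 width=1280 height=720 batched-push-timeout=40000 ! \
  nvinfer config-file-path=your_yolov8_pgie_config.txt ! nvvideoconvert ! nvdsosd ! nvegltransform ! nveglglessink -e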

If I get this working I will write two tutorials on the subject: first, MP4 streaming for evaluating models, and second, how to take in three USB cameras using v4l2src and run inference on them (a first single-camera check is sketched at the end of this post).
A reason for going with DeepStream is the hope that it is more stable and faster than OpenCV for running inference and saving the bounding-box output.
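That single-camera check is nothing more than confirming each camera plays on its own before wiring anything into nvstreammux. The device node and caps below are assumptions for my cameras; v4l2-ctl --list-devices and v4l2-ctl --list-formats-ext (from v4l-utils) show what each camera actually offers:

# /dev/video0 and the caps are assumptions; adjust to what v4l2-ctl reports
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! nveglglessink -e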