I’m given a stream of JPEG file locations. Inside the appsrc need-data callback I decode each file, copy the pixel data into a GStreamer buffer, and push it into the pipeline. This works fine, but I wanted to experiment with nvjpegdec to see if it could speed up the decode step a bit.
Now, inside the appsrc callback, I load the raw JPEG bytes into memory, wrap them in a buffer, and push that instead, making sure to update the appsrc caps to "image/jpeg". The pipeline runs fine, but only one image is ever sent: if I process 20 different JPEGs, all the results are identical to the first one, which is really weird. I can’t share the code here, but I wanted to ask whether this approach should even work. I dug up one previous discussion about it, and the conclusion there was to use my first approach (Nvjpegdec lost buffer nvds meta - #12 by Fiona.Chen), but then why does it kind of work? That’s the confusing part.