I’m given a stream of JPEG file locations. Inside the appsrc callback I decode each file, copy the pixel data into a GStreamer buffer, and push it into the pipeline. This works fine, but I wanted to experiment with nvjpegdec to see if it could speed up the decode step a bit.
Now, inside the appsrc callback, I load the raw JPEG bytes into a buffer and push that instead, making sure to update the appsrc caps to “image/jpeg”. The pipeline runs fine, but only one image is ever decoded: if I process 20 different JPEGs, every output frame is identical to the first one, which is really weird. I can’t share the code here, but I wanted to ask whether this approach should even work. I dug up one previous discussion about it, and the conclusion there was to use my first approach (Nvjpegdec lost buffer nvds meta - #12 by Fiona.Chen), but why does this one partially work? That’s the confusing part.
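To make the two setups concrete, here’s a rough sketch of the pipeline descriptions I’m comparing. This is simplified from the real code (which feeds appsrc from a need-data callback); the caps, resolutions, and sink element are assumptions for illustration, not my exact configuration:

```python
# Sketch of the two appsrc variants being compared. Element names and caps
# are assumptions for illustration; the real code pushes buffers into
# appsrc from a need-data callback rather than using a static source.

def raw_pipeline() -> str:
    # Approach 1 (works): decode each JPEG myself in the callback and push
    # raw frames, so appsrc advertises raw video caps downstream.
    return (
        "appsrc name=src is-live=true format=time "
        "caps=video/x-raw,format=RGB,width=1920,height=1080,framerate=30/1 "
        "! videoconvert ! autovideosink"
    )

def nvjpegdec_pipeline() -> str:
    # Approach 2 (only ever shows the first image): push the encoded JPEG
    # bytes untouched and let nvjpegdec handle the decode.
    return (
        "appsrc name=src is-live=true format=time "
        "caps=image/jpeg,framerate=30/1 "
        "! nvjpegdec ! autovideosink"
    )

if __name__ == "__main__":
    print(raw_pipeline())
    print(nvjpegdec_pipeline())
```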
Update: nvjpegdec is indeed returning the same image when used to decode a series of images. A fix is available, but upgrading the JetPack version I’m using isn’t an option for now.
It turns out MJPEG support needs to be explicitly enabled for this use case. I didn’t know what MJPEG was and didn’t think it applied to me, but it does: a stream of independent JPEG images pushed through the decoder is effectively motion JPEG. With that enabled, I’m able to use nvv4l2decoder to decode the JPEG images.
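For anyone landing here later, this is roughly the shape of the pipeline that works for me now. It’s a sketch only: the `mjpeg` property on `nvv4l2decoder` and the surrounding elements can vary by JetPack version, so check `gst-inspect-1.0 nvv4l2decoder` on your own board before relying on it:

```python
# Sketch: decoding a stream of independent JPEGs with the Jetson hardware
# decoder. The mjpeg property and element names below are what I believe
# apply on my JetPack; verify with gst-inspect-1.0 on your platform.

def nvv4l2_jpeg_pipeline() -> str:
    return (
        "appsrc name=src is-live=true format=time "
        "caps=image/jpeg,framerate=30/1 "
        "! jpegparse "               # frame the JPEG data for the decoder
        "! nvv4l2decoder mjpeg=1 "   # explicitly enable MJPEG support
        "! nvvidconv ! autovideosink"
    )

if __name__ == "__main__":
    print(nvv4l2_jpeg_pipeline())
```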