Nvvidconv not handling image stride

Hello, I noticed that nvvidconv doesn’t seem to handle image stride properly in GStreamer pipelines. This can be verified by running the following pipeline, where GStreamer will automatically add a stride to the buffer to make its width a multiple of 4:

gst-launch-1.0 videotestsrc ! video/x-raw,width=321,height=240 ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! nvoverlaysink

In GStreamer, the stride of each GstBuffer plane, when it differs from the width, is communicated to the processing element through a GstVideoMeta structure. An element should retrieve this meta from the buffer, parse the per-plane strides, and use them accordingly. This isn’t the case with nvvidconv, which seems to ignore the stride metadata and instead uses the plain width to perform the processing. The result is a highly distorted output, since the beginnings of the image rows are no longer where they are supposed to be. Besides the stride, the meta carries other fields (such as per-plane offsets) that could improve performance as well.
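For reference, this is roughly what honouring the meta looks like on the consuming side. A minimal sketch of my own (not nvvidconv source); when no GstVideoMeta is attached, the default strides derived from the caps apply:

/* Sketch: read per-plane strides from the GstVideoMeta if present,
 * falling back to the caps-derived defaults otherwise. */
#include <gst/gst.h>
#include <gst/video/video.h>

static void
read_plane_strides (GstBuffer *buf, const GstVideoInfo *info)
{
  GstVideoMeta *meta = gst_buffer_get_video_meta (buf);
  guint n_planes = meta ? meta->n_planes : GST_VIDEO_INFO_N_PLANES (info);
  guint i;

  for (i = 0; i < n_planes; i++) {
    /* Prefer the stride advertised by the meta over width-derived values */
    gint stride = meta ? meta->stride[i] : GST_VIDEO_INFO_PLANE_STRIDE (info, i);
    g_print ("plane %u: stride %d bytes\n", i, stride);
  }
}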

Is this a known limitation? Is there any plan to address this in the near future?

Not sure about your case, but I think that:

  • Odd sizes (e.g. width=321) are not supported.
  • You didn’t set a format, but the stride may depend on it (for example, NV12 may use a 256-pixel stride, but RGBA wouldn’t); see the sketch below.
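As an aside (my own illustration, not part of the original reply), the default stride GStreamer derives for a given format and size can be checked with GstVideoInfo, which makes the format dependence visible:

#include <gst/video/video.h>

static void
print_default_strides (GstVideoFormat fmt, guint width, guint height)
{
  GstVideoInfo info;
  guint i;

  gst_video_info_set_format (&info, fmt, width, height);
  for (i = 0; i < GST_VIDEO_INFO_N_PLANES (&info); i++)
    g_print ("plane %u stride: %d\n", i, GST_VIDEO_INFO_PLANE_STRIDE (&info, i));
}

/* e.g. print_default_strides (GST_VIDEO_FORMAT_NV12, 321, 240);
 *      print_default_strides (GST_VIDEO_FORMAT_RGBA, 321, 240); */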

Thank you for your response; I didn’t know odd values were not supported! But, if I’m not mistaken, the “actual” line length is not odd, since GStreamer will pad to an even stride. I couldn’t think of another way to reproduce the problem.

In my actual use case, I’m writing the element that sends the stride in the GstVideoMeta, so I’m certain it’s being ignored.
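For context, this is roughly what “sending the stride in the GstVideoMeta” looks like on the producing side. A minimal sketch under assumed example values (RGBA, rows padded to 256 bytes); the point is that downstream elements should read the meta instead of assuming stride equals width times bytes-per-pixel:

#include <gst/gst.h>
#include <gst/video/video.h>

static GstBuffer *
make_padded_rgba_buffer (guint width, guint height)
{
  /* Example padding only: round each row up to a 256-byte boundary */
  gsize stride = ((gsize) width * 4 + 255) & ~(gsize) 255;
  GstBuffer *buf = gst_buffer_new_allocate (NULL, stride * height, NULL);

  gsize offset[GST_VIDEO_MAX_PLANES] = { 0 };
  gint strides[GST_VIDEO_MAX_PLANES] = { (gint) stride };

  /* Attach the meta describing the real per-plane offsets and strides */
  gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE,
      GST_VIDEO_FORMAT_RGBA, width, height, 1, offset, strides);

  return buf;
}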

Hi,
Due to hardware alignment limitations, we cannot allocate video/x-raw(memory:NVMM) buffers at 321x240 (or any odd width), so the value will be aligned in the nvvidconv plugin.

Thanks for your response @DaneLLL, that’s actually useful to know.

However, my question is more about nvvidconv ignoring GstVideoMetas with stride information. I realize the pipeline I posted is not the best example. Let me prepare a simple app.