I'm trying to decode an interlaced video stream, but the decoder processes frames slowly. How do I send interlaced frames to the decoder correctly? Are any special flags needed? How do I enable interlaced decoding through MMAPI?
There is a deinterlace element in GStreamer, but it depends on the video stream having
the proper flags set, which in turn depends on the video capture driver. I don't know
about MMAPI. Note that this deinterlace element has several methods of post-processing
the images, some of which may be less CPU-intensive than others.
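As an illustrative command line (not from this thread; the source element, caps, and sink are placeholders for whatever your setup uses), the post-processing method can be selected explicitly on the element, and `linear` is one of the cheaper options:

```
gst-launch-1.0 v4l2src ! video/x-raw,interlace-mode=interleaved ! deinterlace method=linear ! autovideosink
```

Other `method` values such as `vfir` or `greedyh` trade more CPU for better quality; the element's property documentation lists them all.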
Exactly which flags are needed?
The video source would have to indicate in the caps that the data is interlaced.
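For illustration, the caps advertising interlaced content might look like the following sketch (the format, size, and framerate values here are placeholders; `interlace-mode=interleaved` is the caps field GStreamer uses for frames with both fields woven into one buffer):

```
video/x-raw,format=UYVY,width=720,height=480,framerate=30000/1001,interlace-mode=interleaved
```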
Somehow the video source would have to set flags in the video buffer
// Mark the video characteristics
GST_BUFFER_FLAG_SET (self->stored_frame, GST_VIDEO_BUFFER_FLAG_INTERLACED);
GST_BUFFER_FLAG_SET (self->stored_frame, GST_VIDEO_BUFFER_FLAG_TFF);
At which point the standard "deinterlace" GStreamer element could be used.
There is a little more in the GStreamer documentation here:
I'm trying to set flags on the v4l2_buffer:

v4l2_buf.field |= V4L2_FIELD_INTERLACED_TB;
v4l2_buf.field |= V4L2_FIELD_INTERLACED;
You will have to make sure the caps for the stream contain interlace information.
You will probably have to enable debug on the deinterlace element to figure out
what is going on. Note that I believe the deinterlace element expects the
lines to be re-assembled in the proper order and just tries to adjust the
jagged edges, but it may work in other modes too. Consult the GStreamer documentation
and source code for details. Note also that the latest release of GStreamer appears
to have improvements in its deinterlacing capabilities.
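One way to enable debug output for just that element is GStreamer's standard `GST_DEBUG` variable, scoped to the deinterlace category (level 5 corresponds to DEBUG); the pipeline itself is a placeholder here:

```
GST_DEBUG=deinterlace:5 gst-launch-1.0 <your pipeline>
```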