I have an interlaced MIPI video source. So far I am able to capture both the even and the odd fields as if it were progressive video, i.e. without deinterlacing.
Now I would like to deinterlace it. As far as I know (maybe I'm wrong), this cannot be done in hardware and has to be done in software. Would someone be able to share an example GStreamer pipeline that does this?
Yes, this is the only thing I know. Let me make my question more specific.
What does the device tree for interlaced mode have to look like? Is there anything that has to be specified in the device tree node for interlaced mode? I found nothing about an interlaced format here.
How am I supposed to specify which field is the top one and which is the bottom one?
I tried to run the following pipeline, but it ended with a segmentation fault (SIGSEGV).
Hi,
Are you able to capture through v4l2src? There is no complete deinterlacing path when using nvv4l2camerasrc. You need to use v4l2src instead and run a pipeline along the lines of the sketch below.
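A minimal sketch of such a check, assuming the sensor shows up as /dev/video0 and outputs 768x576 UYVY (both are assumptions, adjust to your setup):

# Sketch only: /dev/video0, 768x576 and UYVY are assumptions.
# First confirm that frames flow through v4l2src at all, with no conversion or display.
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=100 ! \
    'video/x-raw, format=UYVY, width=768, height=576' ! fakesink

Once that runs without errors, a conversion element and a display sink (or a software deinterlacer) can be appended.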
When I try to capture video using the v4l2src plugin, I end up with an 'Internal data stream error'.
How am I supposed to make my CSI source work with this plugin? Is there something specific that has to be written in the device tree or the camera driver? According to this, v4l2src is not available (or preferred) when using a CSI interface.
Hi,
Are you able to capture frame data through v4l2-ctl commands? You need to make sure the sensor driver and device tree are ready by running v4l2-ctl commands successfully. We have a sensor driver programming guide in the developer guide, but it mainly covers progressive frame data.
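For example (assuming the sensor is /dev/video0 and outputs 768x576 UYVY; adjust to match your driver):

# Assumptions: /dev/video0, 768x576, UYVY.
# List the formats and frame sizes the driver advertises.
$ v4l2-ctl -d /dev/video0 --list-formats-ext
# Capture 100 frames straight to a file, bypassing GStreamer entirely.
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=768,height=576,pixelformat=UYVY \
    --stream-mmap --stream-count=100 --stream-to=test.raw

If this fails, the problem is in the sensor driver or device tree rather than in any GStreamer pipeline.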
Hi,
Do you see 768x576 UYVY at 60 fps listed in $ v4l2-ctl --list-formats-ext?
Generally, most use cases are progressive inputs. We don't have much experience with interlaced inputs, so this would need other users to provide guidance. From searching for information, it seems a pipeline along the following lines may work:
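As a rough sketch using GStreamer's software deinterlace element (not a verified configuration; the device node, caps and sink are assumptions):

# Sketch only: device node, caps and sink are assumptions.
# mode=interlaced forces deinterlacing even if the driver tags the buffers as progressive;
# method selects the deinterlacing algorithm.
$ gst-launch-1.0 v4l2src device=/dev/video0 ! \
    'video/x-raw, format=UYVY, width=768, height=576' ! \
    deinterlace mode=interlaced method=linear ! videoconvert ! xvimagesink

Note that deinterlace runs entirely on the CPU, i.e. this is software deinterlacing.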