4K Performance Low Using e-CAM130 13MP Camera

I’ve got a Jetson TK1 running in text mode, pulling video frames from an e-CAM130 from e-con and storing them to the eMMC using gstreamer. The docs from the camera manufacturer indicate that the camera can deliver 4K video at 22 fps, but I’m getting 6 fps max. When pulling frames at 1080p, I get the full 30 fps. In both situations, top shows only about 75% of one CPU core being utilized, which suggests my pipeline setup is using the HW encoder properly. Has anyone been successful encoding 4K video from an x-raw-yuv source? Here are the commands I’m using:

1080p encode / store to eMMC at 30fps:

sudo nice --1 gst-launch-0.10 v4l2src device=/dev/video0 queue-size=5 always-copy=false ! 'video/x-raw-yuv, format=(fourcc)YUY2, width=(int)1920, height=(int)1080, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1' ! nvvidconv ! 'video/x-nv-yuv, format=(fourcc)I420, width=(int)1920, height=(int)1080' ! nv_omx_h264enc ! matroskamux ! queue ! filesink location=test.mkv

4K encode / store to eMMC (the manufacturer rates the camera at 22 fps, but I’m only getting 6 fps):

sudo nice --1 gst-launch-0.10 v4l2src device=/dev/video0 queue-size=5 always-copy=false ! 'video/x-raw-yuv, format=(fourcc)YUY2, width=(int)3840, height=(int)2160, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1' ! nvvidconv ! 'video/x-nv-yuv, format=(fourcc)I420, width=(int)3840, height=(int)2160' ! nv_omx_h264enc ! matroskamux ! queue ! filesink location=test.mkv

I’m not familiar with “nice --1”; would you try:

sudo nice -n -1 ...

…does this change anything?

Nope, no change. nice --1 is equivalent to nice -n -1.
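For what it’s worth, the effective niceness of a running pipeline can be double-checked with a standard ps query, assuming the process shows up under the name gst-launch-0.10:

ps -o ni,pid,comm -C gst-launch-0.10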

We have checked the gstreamer pipelines and are able to get 4K@24 fps from the sensor:

gst-launch-0.10 v4l2src device=/dev/video0 queue-size=5 always-copy=false ! "video/x-raw-yuv, format=(fourcc)YUY2, width=(int)3840, height=(int)2160, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1" ! fakesink -v

The encoder is capable of 4K@30 fps:

gst-launch-0.10 videotestsrc ! "video/x-raw-yuv, format=(fourcc)I420, width=(int)3840, height=(int)2160" ! nv_omx_vp8enc ! fakesink -v

(We are able to get 30 fps encoded frames with this pipeline.)
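For completeness, the same standalone check could be run with the H.264 encoder used in the recording pipelines above. This is only a sketch mirroring the VP8 test, and it assumes nv_omx_h264enc accepts plain video/x-raw-yuv I420 input the way nv_omx_vp8enc does here:

gst-launch-0.10 videotestsrc ! "video/x-raw-yuv, format=(fourcc)I420, width=(int)3840, height=(int)2160" ! nv_omx_h264enc ! fakesink -v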

We have a problem with the nvvidconv block, which drops the frame rate from 24 fps to 7 fps in the encoding process (since the encoder only accepts 4:2:0 input, a color conversion block is needed to convert YUV422 to YUV420).
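For reference, nvvidconv can be benchmarked in isolation with a conversion-only pipeline along these lines. This is just a sketch reusing the caps from the pipelines above, with videotestsrc standing in for the sensor, so it measures the converter itself rather than the live capture path:

gst-launch-0.10 videotestsrc ! "video/x-raw-yuv, format=(fourcc)YUY2, width=(int)3840, height=(int)2160" ! nvvidconv ! "video/x-nv-yuv, format=(fourcc)I420, width=(int)3840, height=(int)2160" ! fakesink -v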

My Questions

  1. What is the maximum rate at which the nvvidconv block can perform color conversion from YUV422 to YUV420?
  2. The sensor can also output YUV420 with the following line layout (for a 640x480 image):
    YYYYYYYY… (640 Y samples)
    YUYVYUYV
    YUYVYUYV… (640 Y + 320 U + 320 V samples)
    Do we have any MIPI identifiers in Tegra to receive data in this layout and convert it into NV12/YUV420 planar format in memory? (For context, the formats the current capture driver exposes can be dumped with the command below.)
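Dumping those driver-exposed formats is just the standard v4l2-ctl listing, assuming the v4l-utils package is installed on the TK1:

v4l2-ctl -d /dev/video0 --list-formats-ext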

Thanks and regards,
Ananth