How to H.264-encode YUV422 pixel format input data and put the compressed data into an MP4 file container using the NVIDIA Jetson TX2 board


I am developing a simple application where video data captured from a camera in YUV422 pixel format is to be compressed to H.264 and put into an MP4 file container without audio. Development is to take place on Ubuntu.
Can you suggest an encoder API, and a way to put the H.264 data output from that encoder API into an MP4 file container?

Also, can I have a config file where all the video encoder parameters, like frame size, bit rate, QP (constant or variable), profile (e.g. baseline), etc., can be set, or default values chosen? If default values already exist, what are they?
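For illustration, such a config file is something the application itself would have to parse and translate into the corresponding encoder setter calls; the keys and values below are a hypothetical sketch, not the encoder's actual defaults:

```ini
# Hypothetical encoder config -- parsed by the application, not by the MMAPI
width=1920
height=1080
bitrate=4000000        ; bits per second
rate_control=cbr       ; cbr | vbr
qp=28                  ; used only in constant-QP mode
profile=baseline       ; baseline | main | high
fps=30
```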

The L4T Multimedia API Reference doc is very confusing. For example, NvVideoEncoder::setOutputPlaneFormat(uint32_t pixfmt, uint32_t width, uint32_t height) says pixfmt is "one of the raw V4L2 pixel formats".
My question here: what are the raw V4L2 pixel formats? They are not listed there.

I want an encoder function to which I can provide an input YUV422 buffer (or an input file in YUV422) and an encoder config file.
Is that possible?
As a user I don't care about anything else.
Please guide me, as this application is very important and needs to be built quickly.

Also, can you suggest a YUV player for Ubuntu 16.04?

Hi amaji,
The HW encoder does not support YUV422 as input. You need to convert it to YUV420 or NV12.
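A minimal sketch of such a conversion, assuming packed YUYV (YUY2) input; if your camera delivers UYVY instead, the byte offsets change. Vertical chroma subsampling here simply drops every other chroma line, which is the simplest (not the highest-quality) approach:

```cpp
#include <cstdint>

// Convert packed YUV422 (YUYV / YUY2 byte order: Y0 U Y1 V) to planar
// YUV420 (I420: full Y plane, then U plane, then V plane).
// width and height must both be even.
void yuyv_to_i420(const uint8_t* src, uint8_t* dst, int width, int height)
{
    uint8_t* y = dst;
    uint8_t* u = dst + width * height;
    uint8_t* v = u + (width / 2) * (height / 2);

    for (int row = 0; row < height; ++row) {
        const uint8_t* line = src + row * width * 2; // 2 bytes per pixel
        for (int col = 0; col < width; col += 2) {
            *y++ = line[col * 2];             // Y0
            *y++ = line[col * 2 + 2];         // Y1
            if (row % 2 == 0) {               // keep chroma of even rows only
                *u++ = line[col * 2 + 1];     // U
                *v++ = line[col * 2 + 3];     // V
            }
        }
    }
}
```

The destination buffer must be width * height * 3 / 2 bytes. For better quality you could average the chroma of each pair of lines instead of dropping one.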

Besides the MM APIs, we also support GStreamer. Please also refer to

What is your camera source?

Hi Dane,
Thanks for your reply.
I am using a pylon camera.
I will convert from YUV422 to YUV420; that is not a big deal.
We don't want to use GStreamer.
You have not answered my basic question.

  1. Can you suggest an H.264 encode function using the GPU for the output data of the Basler USB camera?

You have not answered my other question, so I repeat it here:

NvVideoEncoder::setOutputPlaneFormat(uint32_t pixfmt, uint32_t width, uint32_t height)

  2. What are the raw V4L2 pixel formats?
  3. What is the output plane?
  4. What is the capture plane?
  5. Can you suggest a YUV player for Ubuntu 16.04?

MMAPI is basically an extension of the low-level V4L2 API that accelerates the Jetson multimedia decode, encode, and format-conversion functions. For each main function, such as the encoder, MMAPI implements a class library. For instance, the NvVideoEncoder class has quite a lot of member functions; setOutputPlaneFormat and setCapturePlaneFormat are two of them. The former sets the format of the encoder's output plane and the latter that of its capture plane. You can treat them as the input and output buffers, depending on which element they relate to.

Raw pixel format usually refers to the original image output from the camera. If your camera has a raw sensor, the output is in raw format; in that case further processing is required before the image data is viewable by human eyes. If your camera has a YUV sensor, the output is in YUV format.

You could google for a YUV player under Linux. Dane can also check whether we have a commonly used one.
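To make the two planes concrete, the encoder data flow in the MMAPI encode sample roughly follows this shape (pseudocode sketch only; consult the sample for the real call sequence and error handling):

```
create NvVideoEncoder
setCapturePlaneFormat(V4L2_PIX_FMT_H264, width, height, ...)   // compressed side
setOutputPlaneFormat(V4L2_PIX_FMT_YUV420M, width, height)      // raw YUV side
set bitrate / profile / rate control
allocate and queue buffers on both planes
loop:
    fill an output-plane buffer with one YUV420M frame and queue it
    dequeue a capture-plane buffer -> one encoded H.264 access unit
    write the encoded bytes to the elementary-stream file
```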

Are you referencing our MM API documentation? If not, you definitely should; it's available after JetPack installation. From there you can get a deep view of our class implementations and headers, besides the sample programs and API reference. It is also helpful to have a copy of the standard Linux V4L2 spec handy. Good luck!

Hi Chijen,

Thanks for your answer.
But my second question is not answered.
Let me put it this way:

NvVideoEncoder::setOutputPlaneFormat(uint32_t pixfmt, uint32_t width, uint32_t height)

where pixfmt is one of the raw V4L2 pixel formats.

  1. I want to know the list of the raw V4L2 pixel formats, or a reference to where they are listed in the documentation.
  2. For MP4, do I need to get it done with the FFmpeg libs?

Hi amaji,
Please install the tegra_multimedia_api samples and refer to tegra_multimedia_api/samples/12_camera_v4l2_cuda

Also a sample with NvVideoEncoder @

The output is a raw H.264 (or H.265) elementary stream. You need to implement a muxer to mux it into MP4.

The V4L2 pixel format given to the encoder must be V4L2_PIX_FMT_YUV420M.
We don't support HW acceleration in FFmpeg.
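For reference, every V4L2_PIX_FMT_* name is a FourCC code defined in the Linux header linux/videodev2.h, and the "Image Formats" chapter of the V4L2 spec documents them all. A small sketch of a few formats mentioned in this thread; the v4l2_fourcc macro is copied inline so the snippet compiles without the kernel header, and the character codes match videodev2.h:

```cpp
#include <cstdint>

// Same definition as in linux/videodev2.h
#define v4l2_fourcc(a, b, c, d) \
    ((uint32_t)(a) | ((uint32_t)(b) << 8) | ((uint32_t)(c) << 16) | ((uint32_t)(d) << 24))

// A few of the raw pixel formats relevant to this thread
// (characters match the V4L2_PIX_FMT_* constants in videodev2.h):
constexpr uint32_t PIX_FMT_YUYV    = v4l2_fourcc('Y', 'U', 'Y', 'V'); // packed YUV 4:2:2
constexpr uint32_t PIX_FMT_UYVY    = v4l2_fourcc('U', 'Y', 'V', 'Y'); // packed YUV 4:2:2
constexpr uint32_t PIX_FMT_NV12M   = v4l2_fourcc('N', 'M', '1', '2'); // multi-planar NV12
constexpr uint32_t PIX_FMT_YUV420M = v4l2_fourcc('Y', 'M', '1', '2'); // multi-planar YUV 4:2:0
```

The "M" suffix marks the multi-planar variants, where each plane sits in its own buffer; that is what the Jetson encoder expects on its output plane.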