H264 decoding using Tegra Multimedia API

I have H264 NALs. I want to feed them to the decoder and get back raw frames in a buffer. I was using the ffmpeg decoder for this task, but since the TX2 does not support ffmpeg I need to switch to the Tegra Multimedia API. Can anybody give me, or direct me to, some C++ samples for the above task?

Thanks in advance.

BTW, for my above task, which one would be most suitable for the TX2: GStreamer or the Tegra Multimedia API (nvcodec)? My guess was the Tegra Multimedia API.

Please install the tegra_multimedia_api samples via JetPack and try to run 00_video_decode.

Thanks for your reply.
What is the difference between 00_video_decode and 02_video_dec_cuda?

Hi,
02_video_dec_cuda demonstrates how to do post-processing via CUDA programming.
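For context, the heart of that sample's CUDA path is wrapping each decoded buffer's dmabuf fd in an EGLImage, which a CUDA kernel can then map and modify in place. A condensed sketch of that flow (not a drop-in snippet; egl_display and buffer come from the sample's surrounding context, and HandleEGLImage() is the sample's own helper that launches the kernel):

#include "nvbuf_utils.h"  // NvEGLImageFromFd / NvDestroyEGLImage

// Wrap the decoded frame's dmabuf fd so CUDA can access the surface.
EGLImageKHR egl_image = NvEGLImageFromFd(egl_display, buffer->planes[0].fd);

// Sample helper: registers the EGLImage with CUDA and runs a kernel on it.
HandleEGLImage(&egl_image);

NvDestroyEGLImage(egl_display, egl_image);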

Thanks. I'm having a bit of an issue creating a test decoding project in Qt and need help. I installed the Multimedia API. Now I'm just trying to create a decoder after initializing the context, following the backend sample code, and I am getting these errors:

error: undefined reference to `NvVideoDecoder::createVideoDecoder(char const*, int)'
error: undefined reference to `NvV4l2Element::subscribeEvent(unsigned int, unsigned int, unsigned int)'
error: undefined reference to `NvV4l2ElementPlane::stopDQThread()'
error: undefined reference to `NvV4l2ElementPlane::stopDQThread()'

I believe these are linking issues.

I have this in my .pro project file.

LIBS += -L/usr/lib/aarch64-linux-gnu/tegra \
        -lnvosd -lv4l2 -lEGL -lGLESv2 -lX11 -lpthread \
        -lnvbuf_utils -lnvjpeg -ldrm -lnvinfer -lnvparsers

What am I missing?

Looks like you did not compile tegra_multimedia_api/samples/common.

If you install the tegra_multimedia_api samples via JetPack, it automatically compiles the samples. You may refer to that build.

What do you mean by “refer to it”? Yes, I installed via JetPack and I see it is compiled and all the .o files are there.
Do I need to include the corresponding .cpp files from common into the Qt project?

I included the .cpp files from common. Now I am getting these two errors:

tegra_multimedia_api/samples/common/classes/NvJpegDecoder.cpp:95: error: ‘struct jpeg_decompress_struct’ has no member named ‘IsVendorbuf’
cinfo.IsVendorbuf = TRUE;
^

tegra_multimedia_api/samples/common/classes/NvJpegDecoder.cpp:127: error: ‘struct jpeg_decompress_struct’ has no member named ‘fd’
fd = cinfo.fd;
^
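For reference, one way to make this compile in a Qt project is to add the common classes as sources in the .pro file and point INCLUDEPATH at the API's bundled, Tegra-patched libjpeg headers, which declare the vendor members (IsVendorbuf, fd) that a stock jpeglib.h lacks. A sketch, assuming the default JetPack install location under $HOME (the MMAPI variable name is just illustrative):

MMAPI = $$(HOME)/tegra_multimedia_api

# Nv* class headers, plus the patched jpeglib.h used by NvJpegDecoder.cpp
INCLUDEPATH += $$MMAPI/include \
               $$MMAPI/include/libjpeg-8b

# Compile the common helper classes directly into the project
SOURCES += $$files($$MMAPI/samples/common/classes/*.cpp)

Listing the patched header directory ahead of any system include path is what should make the IsVendorbuf/fd errors above go away.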

Hi,
If you install the whole system via JetPack, the tegra_multimedia_api samples compile without any issue, since the build environment is configured correctly for them. You can compare against it to see what is wrong in your build environment.

Okay, I got a bit of progress.
I can compile my project in Qt now. This is what I am doing: I am grabbing H264-encoded frames from an IP camera and decomposing the encoded buffer myself to get the SPS, PPS, and frame buffers. Then I am prefixing each of these with the start code 0x00,0x00,0x00,0x01 and feeding them to the decoder as NAL units. I am initializing the output_plane as

ctx[channel].dec->output_plane.setupPlane(V4L2_MEMORY_MMAP, 10, true, false);
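
The per-NAL feeding logic is modeled on 00_video_decode and looks roughly like this (a sketch; nalu/nalu_size are placeholders for one start-code-prefixed unit from my parser, error handling omitted):

struct v4l2_buffer v4l2_buf;
struct v4l2_plane planes[MAX_PLANES];
NvBuffer *nvbuf;
NvVideoDecoder *dec = ctx[channel].dec;

memset(&v4l2_buf, 0, sizeof(v4l2_buf));
memset(planes, 0, sizeof(planes));
v4l2_buf.m.planes = planes;

if (dec->output_plane.getNumQueuedBuffers() < dec->output_plane.getNumBuffers())
{
    // Not all buffers are queued yet: grab the next free one by index.
    v4l2_buf.index = dec->output_plane.getNumQueuedBuffers();
    nvbuf = dec->output_plane.getNthBuffer(v4l2_buf.index);
}
else
{
    // All 10 are with the driver: block until one is returned.
    dec->output_plane.dqBuffer(v4l2_buf, &nvbuf, NULL, -1);
}

// Copy one NAL unit (with its 0x00 0x00 0x00 0x01 prefix) into the buffer.
memcpy(nvbuf->planes[0].data, nalu, nalu_size);
nvbuf->planes[0].bytesused = nalu_size;
v4l2_buf.m.planes[0].bytesused = nvbuf->planes[0].bytesused;

dec->output_plane.qBuffer(v4l2_buf, NULL);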

But when all 10 output_plane buffers are enqueued and the program starts to dequeue them before putting more frames into the queue, it gets stuck in

ret = v4l2_ioctl(fd, VIDIOC_DQBUF, &v4l2_buf);

of

int
NvV4l2ElementPlane::dqBuffer(struct v4l2_buffer &v4l2_buf, NvBuffer **buffer,
                             NvBuffer **shared_buffer, uint32_t num_retries)

function.

I tried both with and without

ret = ctx[channel].dec->disableCompleteFrameInputBuffer();

but no luck; it always gets stuck. The difference is that with disableCompleteFrameInputBuffer() the v4l2_ioctl() call succeeds up to the 3rd buffer (i.e., SPS, PPS, I-frame), while without it, it gets stuck after dequeuing only the SPS and PPS.

Need help.

BTW, for your info, my program at the moment only feeds H264-encoded frames into the decoder. There is no code yet to read and use the decoded buffers.
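
I suspect this could be related: 00_video_decode sets up and services the capture plane once the decoder raises its resolution-change event, and I have not implemented that part yet. Condensed from the sample (error handling omitted; the buffer count and setupPlane flags are illustrative):

struct v4l2_event ev;

// Subscribe before feeding any data (done right after creating the decoder).
dec->subscribeEvent(V4L2_EVENT_RESOLUTION_CHANGE, 0, 0);

// Wait until the decoder has parsed the SPS/PPS and reports the resolution.
do
{
    dec->dqEvent(ev, 50000);
} while (ev.type != V4L2_EVENT_RESOLUTION_CHANGE);

// Query the format the decoder negotiated, then allocate and start the
// capture plane.
struct v4l2_format format;
dec->capture_plane.getFormat(format);
dec->capture_plane.setupPlane(V4L2_MEMORY_MMAP, 10, true, false);
dec->capture_plane.setStreamStatus(true);

// Queue every empty capture buffer so the decoder can write frames into them.
for (uint32_t i = 0; i < dec->capture_plane.getNumBuffers(); i++)
{
    struct v4l2_buffer v4l2_buf;
    struct v4l2_plane planes[MAX_PLANES];
    memset(&v4l2_buf, 0, sizeof(v4l2_buf));
    memset(planes, 0, sizeof(planes));
    v4l2_buf.index = i;
    v4l2_buf.m.planes = planes;
    dec->capture_plane.qBuffer(v4l2_buf, NULL);
}

Without queued capture buffers the decoder has nowhere to write decoded frames, so its internal queues filling up could be what makes the output_plane dqBuffer() call block.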

More info:
This is what I get in my Qt application output window, if it helps:

NvMMLiteOpen : Block : BlockType = 261
TVMR: NvMMLiteTVMRDecBlockOpen: 7647: NvMMLiteBlockOpen
NvMMLiteBlockCreate : Block : BlockType = 261
TVMR: cbBeginSequence: 1179: BeginSequence 1280x720, bVPR = 0
TVMR: LowCorner Frequency = 0
TVMR: cbBeginSequence: 1529: DecodeBuffers = 6, pnvsi->eCodec = 4, codec = 0
TVMR: cbBeginSequence: 1600: Display Resolution : (1280x720)
TVMR: cbBeginSequence: 1601: Display Aspect Ratio : (1280x720)
TVMR: cbBeginSequence: 1669: ColorFormat : 5
TVMR: cbBeginSequence:1680 ColorSpace = NvColorSpace_YCbCr709
TVMR: cbBeginSequence: 1809: SurfaceLayout = 3
TVMR: cbBeginSequence: 1902: NumOfSurfaces = 13, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
TVMR: cbBeginSequence: 1904: BeginSequence ColorPrimaries = 1, TransferCharacteristics = 1, MatrixCoefficients = 1

Hi,
If you have confirmed it is an issue in the NVIDIA BSP release, please share with us how to reproduce it with the reference samples. For your case, you can attach the h264 stream so that we can reproduce your issue by running 00_video_decode.

Hi,
Can you share how you compiled your project with the Tegra Multimedia API? I am getting the same errors as you.
Thanks!

Hi,
Hope other users can share their experience. You may also compare your environment with the one installed via SDK Manager; the samples are compiled during that installation.