Encode Sample not Producing Valid Output

Hi all,

I’m trying to get one of the basic samples in the Video Codec SDK working: AppEncD3D11. The sample project compiles without error, but the resulting application isn’t producing valid output in my testing, so I’m wondering where the breakdown lies.

From the help text, the input file must be in raw BGRA format, so I’ve converted a regular video with FFmpeg for testing using the following command:

ffmpeg.exe -i "C:\sample-video.mp4" -c:v rawvideo -pix_fmt bgra -an "C:\sample-video.avi"

I then run the sample application using the command:

AppEncD3D11.exe -i "C:\sample-video.avi" -o "C:\result.mp4" -s 320x240 -fps 15

which completes again without error, reporting:

Total frames encoded: 150
Saved in file C:\result.mp4

However, I can’t play this file in any video player, nor read any metadata from it. I’ve attached the output video as well as the intermediate BGRA input file for reference, but I’m hoping I’ve just missed something about the process.

Thanks!

Media: Videos.zip (20.5 MB)
Additional Information:

GPU: Quadro RTX 6000
Driver Version: 461.92
Video Codec SDK: 11.0.10

Hi.
Input to AppEncD3D11.exe is expected to be raw RGB data.
Here is what the source code says:

/**
 * This sample application illustrates encoding of frames in ID3D11Texture2D textures.
 * There are 2 modes of operation demonstrated in this application.
 * In the default mode application reads RGB data from file and copies it to D3D11 textures
 * obtained from the encoder using NvEncoder::GetNextInputFrame() and the RGB texture is
 * submitted to NVENC for encoding. In the second case ("-nv12" option) the application converts
 * RGB textures to NV12 textures using DXVA's VideoProcessBlt API call and the NV12 texture is
 * submitted for encoding.
 * This sample application also illustrates the use of video memory buffer allocated
 * by the application to get the NVENC hardware output. This feature can be used
 * for H264 ME-only mode, H264 encode and HEVC encode.
 */

Specifically, refer to ReadInputFrame() regarding how the frame is constructed and passed to encoder.
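As a sanity check on the input side (independent of the SDK; Python used here purely for illustration), note that a raw BGRA file has no container: each frame is exactly width × height × 4 bytes, so the file size must be an exact multiple of the frame size for the dimensions you pass with -s. The function name below is hypothetical, not from the sample:

```python
import os

def bgra_frame_count(path, width, height):
    """Return the number of raw BGRA frames in a headerless file.

    Raw BGRA stores 4 bytes per pixel (B, G, R, A) and nothing else,
    so the file size must divide evenly by width*height*4.
    """
    frame_size = width * height * 4
    total = os.path.getsize(path)
    if total % frame_size != 0:
        raise ValueError(
            "file size is not a multiple of the frame size; "
            "wrong -s dimensions or wrong pixel format?"
        )
    return total // frame_size
```

For example, a 320x240 clip with 150 frames should be exactly 320 * 240 * 4 * 150 = 46,080,000 bytes; anything else means the dimensions or pixel format don't match the file.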

Thanks.

Thanks Mandar,

That aspect was clear enough to me. Here’s the relevant part of the help output from AppEncD3D11.exe -h, which gives another reference point as well:

-i               Input file (must be in BGRA format) path

So is your suggestion that my FFmpeg command above isn’t producing BGRA frames in the correct format? Can you point to another command, or even attach a valid file to use as input? I’m really just trying to test the provided samples, and the only raw video provided by NVIDIA on the SDK page is in YUV format.

Thanks again for the quick response!

Hi @ben49

You can use the following ffmpeg command to decode compressed video and convert it to BGRA:

ffmpeg.exe -i big_buck_bunny_1080p_h264.mov -c:v rawvideo -pix_fmt bgra -y BigBuckBunny.rgb

Never mind that the BGRA file has a .rgb extension - that’s just one of ffmpeg’s quirks: it tries to guess the output format from the file extension.

Then you can encode this BGRA file with D3D11 encode sample like follows:

AppEncD3D11.exe -i BigBuckBunny.rgb -o BigBuckBunny.h264 -s 1920x1080 -fps 15

Please note that the encoding sample doesn’t put the video into a container; the output is an H.264 / H.265 Annex B elementary bitstream. You can open this type of file with VLC player.
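As an aside, an Annex B elementary stream is easy to recognize without a player: NAL units are delimited by 3- or 4-byte start codes (0x000001 / 0x00000001), beginning at the very first byte of the file, whereas an MP4 starts with a box length followed by "ftyp". A small sketch (Python, just an illustration; the function name is my own):

```python
def looks_like_annex_b(data: bytes) -> bool:
    """Heuristic: Annex B elementary streams begin with a NAL start code,
    either 00 00 01 (3 bytes) or 00 00 00 01 (4 bytes)."""
    return data.startswith(b"\x00\x00\x01") or data.startswith(b"\x00\x00\x00\x01")

# Usage sketch (file name is a placeholder):
# with open("result.h264", "rb") as f:
#     print(looks_like_annex_b(f.read(4)))
```

This also explains the original symptom: a raw bitstream written with a .mp4 extension has no container metadata, so players that key off the extension can’t open it.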

Thank you!
I swear I had tried that FFmpeg command (with the .rgb extension) previously and got an error, but it does work now. Pointing out the .h264 output file extension is also very helpful.
I’m now able to run the samples with successful output, so I’ll mark this closed :)