The documentation in nvEncodeApi.h says the following about encodeConfig:
Specifies the advanced codec specific structure. If client has
sent a valid codec config structure, it will override parameters
set by the NV_ENC_INITIALIZE_PARAMS::presetGUID parameter. If set
to NULL the NvEncodeAPI interface will use the NV_ENC_INITIALIZE_PARAMS::presetGUID
to set the codec specific parameters. Client can also optionally
query the NvEncodeAPI interface to get codec specific parameters
for a presetGUID using ::NvEncGetEncodePresetConfig() API. It can
then modify (if required) some of the codec config parameters and
send down a custom config structure as part of
::_NV_ENC_INITIALIZE_PARAMS. Even in this case client is
recommended to pass the same preset guid it has used in
::NvEncGetEncodePresetConfig() API to query the config structure;
as NV_ENC_INITIALIZE_PARAMS::presetGUID. This will not override
the custom config structure but will be used to determine other
Encoder HW specific parameters not exposed in the API.
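My reading of that paragraph is that the following minimal setup should be enough to initialize an encoder, with the codec-specific parameters derived entirely from the preset. This is only a sketch: `session` and `fn` are placeholders for an opened session handle and the loaded function list, and required fields such as encodeWidth/encodeHeight are omitted for brevity.

NV_ENC_INITIALIZE_PARAMS params = {};
params.version = NV_ENC_INITIALIZE_PARAMS_VER;
params.encodeGUID = NV_ENC_CODEC_H264_GUID;
params.presetGUID = NV_ENC_PRESET_P3_GUID;
params.encodeConfig = NULL; /* NULL: let the driver derive the codec config from presetGUID */
fn.nvEncInitializeEncoder(session, &params);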
However, when I actually set the encodeConfig member to NULL, nvEncInitializeEncoder() fails with NV_ENC_ERR_INVALID_DEVICE instead of falling back to the presetGUID. I’ve pasted the relevant parts of my test below:
NV_ENC_CONFIG enc_cfg = {};
NV_ENC_PRESET_CONFIG preset_cfg = {};
preset_cfg.version = NV_ENC_PRESET_CONFIG_VER;
preset_cfg.presetCfg.version = NV_ENC_CONFIG_VER;

status = ctx->funcs.nvEncGetEncodePresetConfigEx(
    ctx->session,
    NV_ENC_CODEC_H264_GUID,
    NV_ENC_PRESET_P3_GUID,
    NV_ENC_TUNING_INFO_ULTRA_LOW_LATENCY,
    &preset_cfg
);
if (NV_ENC_SUCCESS != status) {
    RXE("Failed to get the encoder preset config.");
    return -1;
}
/* Setup the initialization settings, based on `NvEncoder::CreateDefaultEncoderParams()`. */
NV_ENC_INITIALIZE_PARAMS init_cfg = {};
init_cfg.version = NV_ENC_INITIALIZE_PARAMS_VER;
init_cfg.encodeGUID = NV_ENC_CODEC_H264_GUID;
init_cfg.presetGUID = NV_ENC_PRESET_P3_GUID;
init_cfg.encodeWidth = 1280;
init_cfg.encodeHeight = 720;
init_cfg.darWidth = 1280;
init_cfg.darHeight = 720;
init_cfg.frameRateNum = 25;
init_cfg.frameRateDen = 1;
init_cfg.enableEncodeAsync = 0;
init_cfg.enablePTD = 1;
init_cfg.reportSliceOffsets = 0;
init_cfg.enableSubFrameWrite = 0;
init_cfg.enableExternalMEHints = 0;
init_cfg.enableMEOnlyMode = 0;
init_cfg.enableWeightedPrediction = 0;
init_cfg.enableOutputInVidmem = 0;
init_cfg.reservedBitFields = 0;
init_cfg.privDataSize = 0;
init_cfg.privData = NULL;
init_cfg.maxEncodeWidth = 1280;
init_cfg.maxEncodeHeight = 720;
init_cfg.bufferFormat = NV_ENC_BUFFER_FORMAT_UNDEFINED;
init_cfg.tuningInfo = NV_ENC_TUNING_INFO_ULTRA_LOW_LATENCY;
/*
When `init_cfg.encodeConfig` points at the preset config we queried
above, the call to `nvEncInitializeEncoder()` below works fine. But
when we set `init_cfg.encodeConfig` to NULL, it fails with
`NV_ENC_ERR_INVALID_DEVICE`.
*/
init_cfg.encodeConfig = NULL;                  /* using NULL results in NV_ENC_ERR_INVALID_DEVICE */
init_cfg.encodeConfig = &preset_cfg.presetCfg; /* this works */
status = ctx->funcs.nvEncInitializeEncoder(ctx->session, &init_cfg);
if (NV_ENC_SUCCESS != status) {
    RXE("Failed to initialize the encoder: %s", nvenc_status_to_string(status));
    r = -9;
    goto error;
}
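If it helps with diagnosing this, I can also dump the driver's detailed error text right after the failing call. A sketch of that, assuming the loaded function list exposes nvEncGetLastErrorString (declared in recent SDK headers):

if (NV_ENC_SUCCESS != status && NULL != ctx->funcs.nvEncGetLastErrorString) {
    /* The driver often reports a more specific message than the raw status code. */
    const char* detail = ctx->funcs.nvEncGetLastErrorString(ctx->session);
    RXE("Driver error detail: %s", detail ? detail : "(none)");
}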
I’m testing this on Arch Linux:
65:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1) (prog-if 00 [VGA controller])
Kernel driver in use: nvidia
Kernel modules: nouveau, nvidia_drm, nvidia
65:00.1 Audio device: NVIDIA Corporation TU104 HD Audio Controller (rev a1)
65:00.2 USB controller: NVIDIA Corporation TU104 USB 3.1 Host Controller (rev a1) (prog-if 30 [XHCI])
65:00.3 Serial bus controller: NVIDIA Corporation TU104 USB Type-C UCSI Controller (rev a1)
Kernel driver in use: nvidia-gpu
Kernel modules: i2c_nvidia_gpu
$ nvidia-smi
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 510.54 Driver Version: 510.54 CUDA Version: 11.6 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... Off | 00000000:65:00.0 On | N/A |
| 30% 30C P8 30W / 250W | 1791MiB / 8192MiB | 11% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
Is the documentation wrong? Is this expected behavior?
Thanks!