You have configured NVENC-accelerated encoding, but your device doesn't support NVENC for codec 'h264'

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): RTX 4060 Laptop GPU
• DeepStream Version: 6.4
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only): 535 (also tried 550 and 560)
• Issue Type (questions, new requirements, bugs): Bugs
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
I am using an Ubuntu 22.04 Hyper-V VM (hosted on Windows 11 Home) to run a Docker container that uses the DeepStream libraries. First, I passed my GPU through to the VM using Microsoft’s Dxgkrnl module, which made nvidia-smi work inside the VM. Then I followed the steps in this doc (Configure The Runtime Environment — Savant 0.4.0 documentation). However, when I run the docker compose command, I see this error message:

always-on-sink-1  | RuntimeError: Received error "gst-resource-error-quark: Could not open device '/dev/v4l2-nvenc' for reading and writing. (7)" from nvv4l2h264enc0. Debug info: "v4l2_calls.c(713): gst_v4l2_open (): /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0:

always-on-sink-1  | system error: No such file or directory".

always-on-sink-1  | ERROR insight::savant::deepstream::encoding > You have configured NVENC-accelerated encoding, but your device doesn't support NVENC for codec 'h264'.
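For context on the first error: later in this thread it turns out that a discrete GPU setup normally has no /dev/v4l2-nvenc node at all; the encoder element loads NVENC through the NVIDIA driver's user-space libraries instead. A small diagnostic sketch (my own, not from the thread) that checks both the device node the error mentions and the driver library NVENC actually needs:

```shell
# Hypothetical diagnostic: check for the device node named in the error and for
# libnvidia-encode, the NVIDIA driver library that provides NVENC on dGPU.
if [ -e /dev/v4l2-nvenc ]; then
  node=present
else
  node=absent
fi
echo "/dev/v4l2-nvenc: $node"

# If libnvidia-encode is not visible where the pipeline runs, NVENC cannot
# work regardless of the device node.
if ldconfig -p 2>/dev/null | grep -q libnvidia-encode; then
  nvenc_lib=present
else
  nvenc_lib=missing
fi
echo "libnvidia-encode: $nvenc_lib"
```

Running this inside the container would show whether the problem is the driver version on the host or the GPU/driver libraries simply not being exposed to the container.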

I tried different driver versions (nvidia-driver-535, 550, and 560). I also found this GitHub issue (Standalone NVENC encoder (0.21.0+) broke NVENC on Hyper-V with GPU passthrough · Issue #2141 · LizardByte/Sunshine · GitHub), which suggests there may be GeForce limitations on NVENC inside VMs, specifically around the NVENC library. However, that issue only covers Windows-to-Windows passthrough, not Windows-to-Ubuntu. Furthermore, those users are on Studio drivers, not the proprietary Linux drivers (like nvidia-driver-535).

How do I get NVENC to work inside the Docker container? Is there a specific NVIDIA driver version that fixes the issue? Do I need to install the CUDA Toolkit? If so, do I install the Ubuntu or Ubuntu-WSL version?

I would like to avoid dual-booting Windows and Ubuntu due to security concerns. Also, since this project does not yet support WSL, I cannot use it.

DeepStream 7.0 already supports WSL; please refer to the doc.

The Savant project doesn’t support DeepStream 7.0 yet, and it will not support WSL even when it adds DeepStream 7.0 support. The latest version of Savant supports DeepStream 6.4. Furthermore, I am running native Ubuntu in a Hyper-V virtual machine, not WSL.

What did this step do? If you want to run the DeepStream Docker container, please execute the steps in the DeepStream Docker doc; if you want to run DeepStream without Docker, please refer to the guide.

I followed the setup steps in this link - (Configure The Runtime Environment — Savant 0.4.0 documentation).

Following the steps in the link above, I ran this command: sudo docker compose -f samples/opencv_cuda_bg_remover_mog2/docker-compose.x86.yml up.

Here is the link to the docker compose file: Savant/samples/opencv_cuda_bg_remover_mog2/docker-compose.x86.yml at bwsw-patch-1 · insight-platform/Savant (github.com).
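Before debugging the compose file itself, it may help to confirm that the NVIDIA container runtime works at all in this VM. A minimal sketch (my own sanity check; the CUDA image tag is an assumption, any CUDA base image would do):

```shell
# Hypothetical sanity check: if this fails, the problem is GPU passthrough or
# nvidia-container-toolkit configuration, not DeepStream or Savant.
if docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi 2>/dev/null; then
  gpu_in_container=yes
else
  gpu_in_container=no
fi
echo "GPU visible in containers: $gpu_in_container"
```

On a working setup this should print the nvidia-smi table listing the RTX 4060 Laptop GPU before the final "yes".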

Savant is an abstraction over DeepStream, but the error above comes from DeepStream, not from Savant. I’ve tried mounting directories and passing my GPU device (/dev/dxg) to the Docker container, but neither seemed to work. Which Ubuntu NVIDIA driver has a working NVENC library?
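One thing worth ruling out before blaming the driver: the NVIDIA Container Toolkit only mounts the NVENC driver libraries (libnvidia-encode, libnvcuvid) into a container when the `video` driver capability is requested. A hedged sketch of what the service entry would need (the service name and image tag below are placeholders, not Savant’s actual compose content):

```yaml
# Hypothetical compose fragment: 'video' in NVIDIA_DRIVER_CAPABILITIES is what
# makes nvidia-container-toolkit mount libnvidia-encode/libnvcuvid.
services:
  always-on-sink:
    image: ghcr.io/insight-platform/savant-adapters-deepstream:latest  # placeholder tag
    environment:
      - NVIDIA_DRIVER_CAPABILITIES=compute,utility,video
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

If the capability is already set, this can be confirmed from inside the running container by looking for libnvidia-encode on the library path.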

Could you share the whole log? Which code or command line causes this error? Could you also share the gst-inspect log for “/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libgstnvvideo4linux2.so”, which provides the nvv4l2h264enc plugin?
Why does it need to find /dev/v4l2-nvenc on a dGPU? In a dGPU Docker container I also can’t find /dev/v4l2-nvenc, yet nvv4l2h264enc works.
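To produce the log asked for above, something like the following can be run inside the container (a sketch; the plugin path is the standard DeepStream 6.4 location quoted in this thread):

```shell
# Hypothetical: inspect the plugin that provides nvv4l2h264enc with GStreamer
# debug output enabled, saving everything to a file to attach to the thread.
plugin=/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libgstnvvideo4linux2.so
GST_DEBUG=3 gst-inspect-1.0 "$plugin" >nvv4l2-inspect.log 2>&1 || true
echo "log written to nvv4l2-inspect.log"
```

The resulting file shows whether the plugin loads at all and, if it fails, which shared library or device it could not open.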

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.