Nvv4l2decoder WSL2 custom image

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
GPU
• DeepStream Version
7.0
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
555.99
• Issue Type (questions, new requirements, bugs)
questions, bugs

Hello,

As of DS 7.0, we are able to use the nvv4l2decoder element in WSL2, where we would previously get an error. This works as intended when using one of the DeepStream images from NGC and is a great addition.
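For reference, the working baseline is one of the official DeepStream 7.0 containers from NGC started with GPU access on WSL2. The exact image tag below is only an example; adjust it to whichever variant you normally pull:

```
# Assumed working baseline on WSL2: an official DeepStream 7.0 image from NGC,
# started with GPU access via the NVIDIA Container Toolkit.
# The tag is an example; substitute the variant you actually use.
docker run --gpus all -it --rm nvcr.io/nvidia/deepstream:7.0-triton-multiarch
```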

I am unable to get the element to work in other containers based on the CUDA Docker images, even though DeepStream and all dependencies are correctly installed and working as expected. Can you elaborate on how to get nvv4l2decoder to work in custom-built containers on WSL2?

Error getting capabilities for device ‘/dev/nvidia0’: It isn’t a v4l2 driver. Check if it is a v4l1 driver.
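For context, a decode test of roughly this form is enough to hit the error (a minimal sketch; the clip path assumes the stock DeepStream samples are installed at the usual location):

```
# Minimal decode test for nvv4l2decoder (sketch; assumes the stock DeepStream
# sample clip exists at the path below).
gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! \
  h264parse ! nvv4l2decoder ! fakesink
```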

Thank you!

/M

1. Which image is your custom Docker image based on? You can refer to the following project

2. Execute the install.sh script at /opt/nvidia/deepstream/deepstream/
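Putting those two steps together, a minimal sketch of the setup inside a custom CUDA-based container might look like this (assuming DeepStream 7.0 is already installed under /opt/nvidia/deepstream):

```
# Sketch only: run inside the custom container after installing DeepStream 7.0.
cd /opt/nvidia/deepstream/deepstream/
./install.sh                      # set up the DeepStream libraries and GStreamer plugins
ldconfig                          # refresh the dynamic linker cache
rm -rf ~/.cache/gstreamer-1.0     # clear the GStreamer registry cache so plugins are re-scanned
gst-inspect-1.0 nvv4l2decoder     # verify the decoder element is now found
```

If gst-inspect-1.0 still fails to find the element, comparing the custom image against the official DeepStream image should show which packages or environment settings are missing.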

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks
