Inference using camera with deepstream

Hello everyone,

I’m using the DeepStream Docker container and I want to run inference with one of the TAO purpose-built (pretrained) models,
so I start the container with the following command:

docker run --device /dev/video0 --gpus '"device=0"' -it --rm -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY -w /opt/nvidia/deepstream/deepstream-6.0

Once inside the container, I go to the following directory:

cd samples/configs/tao_pretrained_models

After following the README step by step, I run the following command as a sanity check:

deepstream-app -c deepstream_app_source1_dashcamnet_vehiclemakenet_vehicletypenet.txt

It runs the model on the sample video file, and everything works fine up to this point.
The problem is that I want to try this model (or any model in general) with my camera/webcam,
so I changed the “[source0]” group of the config file (deepstream_app_source1_dashcamnet_vehiclemakenet_vehicletypenet.txt) as follows:

#Type - 1=CameraV4L2 2=URI 3=MultiURI
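For reference, a minimal [source0] group for a V4L2 camera might look like the sketch below. This assumes a webcam at /dev/video0 that supports 640x480 at 30 fps; the resolution and frame rate should be adjusted to match one of the formats your camera actually reports.

```ini
[source0]
enable=1
# Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
# must match a mode the camera supports (check with v4l2-ctl)
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
# device node index: 0 maps to /dev/video0
camera-v4l2-dev-node=0
```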

but I get this error:

After running the following command to list the formats my camera supports:

v4l2-ctl --list-formats-ext

this is the result:

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs; include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for a new requirement; include the module name, i.e. which plugin or which sample application, and the function description.)

• Hardware Platform: Jetson Nano
• DeepStream Version: 6.0
• JetPack Version: 4.6
• TensorRT Version: 8.0.1

configs.txt (2.2 KB)
primary_configs.txt (929 Bytes)

It seems DeepStream can’t support the video formats that your camera outputs.
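One way to narrow this down is to test the camera outside of deepstream-app with a plain GStreamer pipeline. This is a diagnostic sketch, assuming the webcam at /dev/video0 outputs YUYV (YUY2 in GStreamer caps) at 640x480/30fps; substitute a format and resolution that v4l2-ctl actually reported for your camera.

```shell
# Grab frames from the webcam, convert the pixel format,
# and render them on screen (requires a camera and a display).
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! 'video/x-raw,format=YUY2,width=640,height=480,framerate=30/1' \
  ! videoconvert \
  ! autovideosink
```

If this pipeline plays, the camera format itself is usable by GStreamer, and the [source0] settings (width, height, fps) in the deepstream-app config just need to match it.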
