DeepStream: incorrect camera parameters provided, please provide supported resolution and frame rate

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Nano
• DeepStream Version
DeepStream 5.0
• JetPack Version (valid for Jetson only)
JetPack 4.4
• TensorRT Version
TensorRT Version: 7.1
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
Bugs
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
In directory:
/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app

Type command:
deepstream-app -c source1_usb_dec_infer_resnet_int8.txt
Error message:

Using winsys: x11 
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine open error
0:00:05.022825538 22685     0x3c397810 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed
0:00:05.023455236 22685     0x3c397810 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed, try rebuild
0:00:05.023738522 22685     0x3c397810 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
WARNING: INT8 not supported by platform. Trying FP16 mode.
INFO: [TRT]: Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
INFO: [TRT]: Detected 1 inputs and 2 output network tensors.
ERROR: Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_fp16.engine opened error
0:00:52.507089628 22685     0x3c397810 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1743> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_fp16.engine
INFO: [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:53.022514160 22685     0x3c397810 INFO                 nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.


**PERF:  FPS 0 (Avg)	
**PERF:  0.00 (0.00)	
** INFO: <bus_callback:181>: Pipeline ready

** INFO: <bus_callback:167>: Pipeline running

ERROR from src_elem: Internal data stream error.
Debug info: gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstV4l2Src:src_elem:
streaming stopped, reason not-negotiated (-4)
** INFO: <bus_callback:147>: incorrect camera parameters provided, please provide supported resolution and frame rate

Quitting
App run failed

$ deepstream-app --version-all
deepstream-app version 5.0.0
DeepStreamSDK 5.0.0
CUDA Driver Version: 10.2
CUDA Runtime Version: 10.2
TensorRT Version: 7.1
cuDNN Version: 8.0
libNVWarp360 Version: 2.0.1d3

Camera info (the device exists at /dev/video0):

v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'YV12'
	Name        : Planar YVU 4:2:0
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)

source1_usb_dec_infer_resnet_int8.txt file:

cat source1_usb_dec_infer_resnet_int8.txt 
################################################################################
# Copyright (c) 2018-2020, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
camera-width=640
camera-height=480
camera-fps-n=20
camera-fps-d=1
camera-v4l2-dev-node=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=Overlay
type=2
sync=0
display-id=0
offset-x=0
offset-y=0
width=0
height=0
overlay-id=1
source-id=0

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265 3=mpeg4
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=2000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
output-file=out.mp4
source-id=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=Overlay
type=4
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=4000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400


[osd]
enable=1
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1280
height=720
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
# attach-sys-ts-as-ntp=1

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
#Required to display the PGIE labels, should be added even when using config-file
#property
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
#Required by the app for SGIE, when used along with config-file property
gie-unique-id=1
config-file=config_infer_primary.txt

[tests]
file-loop=0

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Thank you!

In addition, I can run this pipeline (which I found on the Internet):
gst-launch-1.0 v4l2src device="/dev/video0" name=e ! 'video/x-raw, width=640, height=480' ! videoconvert ! 'video/x-raw, width=640, height=480, format=(string)YUY2' ! xvimagesink
and it pops up a window showing the camera image.

Also, I found something that looks related (How to change source pixel format from YUYV to MJPG - #2 by DaneLLL). I tried the pipeline

$gst-launch-1.0 v4l2src ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! xvimagesink 
WARNING: erroneous pipeline: could not link nvvideoconvert0 to xvimagesink0, xvimagesink0 can't handle caps video/x-raw(memory:NVMM), format=(string)NV12

If I delete the (memory:NVMM) part, as in gst-launch-1.0 v4l2src ! videoconvert ! nvvideoconvert ! 'video/x-raw,format=NV12' ! xvimagesink, I get this error:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_mini_object_copy: assertion 'mini_object != NULL' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_structure_copy: assertion 'structure != NULL' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_caps_append_structure_full: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_structure_copy: assertion 'structure != NULL' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_caps_append_structure_full: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_mini_object_unref: assertion 'mini_object != NULL' failed

(gst-launch-1.0:24714): GStreamer-CRITICAL **: 10:31:54.621: gst_mini_object_ref: assertion 'mini_object != NULL' failed
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.003078642
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

If I change the NV12 to YUY2:

gst-launch-1.0 v4l2src ! videoconvert ! nvvideoconvert ! 'video/x-raw,format=YUY2' ! xvimagesink 

The window pops up again and shows the camera frames.

To be honest, I do not know much about the meaning of the pipeline parameters I typed above, as I just copied and tried whatever I saw… I hope this also helps to solve the problem.

Thanks a lot!!

The source code is in /opt/nvidia/deepstream/deepstream/sources/apps

The DeepStream SDK uses nvegltransform and nveglglessink to display GPU-processed data. See: DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums

The deepstream-app pipeline for v4l2src has no “videoconvert” for Jetson. Please refer to the source code.

Your camera only supports a 30 FPS frame rate, so why do you set 20 FPS in the deepstream-app config file?
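For reference, since the v4l2-ctl output above only advertises 640x480 YV12 at 30 fps, the matching [source0] settings would look something like the sketch below. Note that this alone may not fix the negotiation failure, since the pixel format also has to be accepted downstream.

```ini
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
# Match the only mode the camera advertises: 640x480 @ 30/1 fps
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=0
```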

You may check whether the following pipeline works or not with the camera.

gst-launch-1.0 v4l2src ! 'video/x-raw, framerate=30/1,width=640,height=480' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvegltransform ! nveglglessink

Thanks @Fiona.Chen.
I tried the pipeline you provided, but it fails:

$ gst-launch-1.0 v4l2src ! 'video/x-raw, framerate=30/1,width=640,height=480' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvegltransform ! nveglglessink
Setting pipeline to PAUSED ...

Using winsys: x11 
Pipeline is live and does not need PREROLL ...
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000153753
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

I have tried both 20 and 30 FPS in the deepstream-app config before, and both return the same error.

The pipeline I listed is the same as deepstream-app's, which is why deepstream-app fails.

The source code is in /opt/nvidia/deepstream/deepstream/sources/apps
The deepstream-app pipeline for v4l2src has no “videoconvert” for Jetson. You may need to add ‘videoconvert’ before nvvideoconvert to make the pipeline work.
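As a rough illustration only (the exact variable and element names in the camera source bin of deepstream_source_bin.c may differ from your DeepStream version, so treat this as a sketch rather than a literal patch), the change amounts to creating a videoconvert element inside the camera source bin and linking it between the v4l2src caps filter and nvvideoconvert:

```c
/* Sketch only: the variable names below (bin->cap_filter, nvvidconv2)
 * are assumptions about the camera source bin in deepstream_source_bin.c,
 * not verified against a specific line of the 5.0 sources. */
GstElement *sw_conv;

sw_conv = gst_element_factory_make ("videoconvert", "sw_conv");
if (!sw_conv) {
  NVGSTDS_ERR_MSG_V ("Failed to create 'sw_conv'");
  goto done;
}
gst_bin_add (GST_BIN (bin->bin), sw_conv);

/* Previously the caps filter linked straight to nvvideoconvert;
 * route it through the software videoconvert instead, so the
 * camera's YV12 output is converted before reaching NVMM memory. */
NVGSTDS_LINK_ELEMENT (bin->cap_filter, sw_conv);
NVGSTDS_LINK_ELEMENT (sw_conv, nvvidconv2);
```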

Thanks @Fiona.Chen,
I tried a new pipeline with videoconvert added:

gst-launch-1.0 v4l2src ! 'video/x-raw, framerate=30/1,width=640,height=480' ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvegltransform ! nveglglessink

A window pops up and I can see my camera.

Within the source code /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c, I do not have much idea where I can add videoconvert… I cannot find nvvideoconvert anywhere in the code, but I can find one videoconvert at line 129:

#ifndef IS_TEGRA
    GstElement *nvvidconv1;
    nvvidconv1 = gst_element_factory_make ("videoconvert", "nvvidconv1");
    if (!nvvidconv1) {
      NVGSTDS_ERR_MSG_V ("Failed to create 'nvvidconv1'");
      goto done;
    }
#endif

Can you suggest which line I should modify?
Again, thank you very much for your kind help!

I don’t think it is a good idea to use the second videoconvert.

If I’m correct, your pipeline moves frames from the CPU to the GPU using nvvideoconvert, but your second videoconvert moves them back to the CPU, and then the second nvvideoconvert moves them to the GPU again.

Please remove the second videoconvert:

gst-launch-1.0 v4l2src ! 'video/x-raw, framerate=30/1,width=640,height=480' ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvegltransform ! nveglglessink

After removing the second videoconvert, the window shows up.

However, back to deepstream-app: which line of /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c should I modify? I still do not have any idea where to add the videoconvert in this file…
Thanks for all of your help!

Hi all,

To add videoconvert to the source code, do I just need to modify the file /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c, or is there any other file I should modify?
Also, where should I add it in the file? I cannot find any line like gst-launch-1.0 v4l2src ! 'video/x-raw, framerate=30/1,width=640,height=480' ! videoconvert ! nvvideoconvert !.
Thanks a lot for all of your help!

Do you have any experience with GStreamer coding? If not, please learn basic GStreamer knowledge and coding skills first: GStreamer: open source multimedia framework