External USB 3.0 camera encoding latency is high: about 50 ms to encode one 2560x1920 frame

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson
• DeepStream Version
6.2
• JetPack Version (valid for Jetson only)
5.1.1
• TensorRT Version
8.5.2.2
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)


The screenshot above shows the measured latency.
How can I modify the configuration to reduce the per-frame encoding latency?

Please use English, thanks! The forum is public to the world.
Which sample are you testing or referring to? If you are using custom code, what is the whole media pipeline?
How did you measure the latency? Please refer to this FAQ for how to do latency measurement.

I am using the deepstream-app program to stream from a USB 3.0 camera, utilizing Hardware H264 Baseline encoding. After that, I am using nvrtspoutsinkbin to send the encoded data to an RTSP server. My configuration for latency measurement is as follows:
export NVDS_ENABLE_LATENCY_MEASUREMENT=1 in the env.
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl
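For completeness, a minimal sketch of how the measurement is enabled before launching the app (the component-level variable is optional and assumed from the DeepStream latency FAQ; the config file name is just a placeholder):

export NVDS_ENABLE_LATENCY_MEASUREMENT=1
# optional, assumed from the latency FAQ: also print per-plugin latency
export NVDS_ENABLE_COMPONENT_LATENCY_MEASUREMENT=1
# run with the configuration file shown further below
deepstream-app -c <my_usb_rtsp_config.txt>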

And I found that when I use the v4l2-ctl command, the output is as follows:

[work]$ v4l2-ctl --device=/dev/video0 --set-fmt-video=width=2560,height=1920,pixelformat=YUYV --stream-mmap --stream-count=300
<<<<<<<<<< 14.94 fps, dropped buffers: 6
<<<<<<<<<<<< 14.94 fps, dropped buffers: 3
<<<<<<<<< 14.95 fps, dropped buffers: 7
<<<<<<<<<< 14.95 fps, dropped buffers: 4

  1. Please refer to this FAQ: How to connect a USB camera in DeepStream. Can v4l2src output video stably?
  2. Could you share the deepstream-app configuration file?
  3. To narrow down this issue, what are the Encode Latency and encoder utilization of this pipeline? You can use "sudo tegrastats" to get the encoder utilization (see the sketch after the command below).
    gst-launch-1.0 -v videotestsrc ! video/x-raw,format=YUY2,width=2560,height=1920 ! videoconvert ! nvvideoconvert ! nvv4l2h264enc ! fakesink
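As a rough sketch of capturing the encoder utilization while the test pipeline above runs (the NVENC field only appears in the tegrastats output when the encoder is active, and its exact format varies by platform, so treat the grep pattern as an assumption):

# log tegrastats once per second to a file while the gst-launch test is running
sudo tegrastats --interval 1000 --logfile enc_util.log &
# after the run, look for the NVENC entries in the log
grep NVENC enc_util.log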

Sorry for the late reply. Right now, I just want to read raw data from the USB 3.0 camera on the Orin NX. The power mode is set to MAXN, and I need to run sudo jetson_clocks to avoid frame drops. The log for reading frames is as follows:

unix_ai@nvidia-desktop:~$ sudo jetson_clocks

unix_ai@nvidia-desktop:~$ v4l2-ctl --device=/dev/video0 --set-fmt-video=width=2560,height=1920,pixelformat=YUYV --stream-mmap --stream-count=300
<<<<<<<<<<<<<<<< 14.94 fps
<<<<<<<<<<<<<<< 14.94 fps
<<<<<<<<<<<<<<< 14.94 fps
<<<<<<<<<<<<<<< 14.94 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<<< 14.95 fps
<<<<<<<<<<<<<< 14.95 fps

unix_ai@nvidia-desktop:~$
Is this normal?

Below is my configuration file:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0
rows=1
columns=1
width=640
height=480

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
camera-width=2560
camera-height=1920
camera-fps-n=15
camera-fps-d=1
camera-v4l2-dev-node=0
#video-format=GRAY8

[source10]
enable=0
#Type - 8=alsa
type=8
alsa-device=hw:0,0
num-sources=1

[sink0]
enable=0
#Type - 1=FakeSink 2=EglSink/nv3dsink(Jetson only) 3=File 4=RTSPStreaming 5=nvdrmvideosink
type=2
sync=0
conn-id=0
width=0
height=0
plane-id=1
source-id=0

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=nvdrmvideosink
type=4
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=1600000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
iframeinterval=15

#set below properties in case of RTSPStreaming

rtsp-port=9555
udp-port=19407

[osd]
enable=0
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=240
clock-y-offset=240
clock-text-size=12
clock-color=1;0;0;0
display-text=1

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=10000
#set muxer output width and height
width=2560
height=1920
##If set to TRUE, system timestamp will be attached as ntp timestamp
##If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
attach-sys-ts-as-ntp=0

#config-file property is mandatory for any gie section.
#Other properties are optional and if set will override the properties set in
#the infer config file.

[tests]
file-loop=0

Below is the log:

*** DeepStream: Launched RTSP Streaming at rtsp://localhost:9555/ds-test ***

Opening in BLOCKING MODE
0:00:00.175439306 4425 0xaaaafaf9b100 WARN v4l2 gstv4l2object.c:2398:gst_v4l2_object_add_interlace_mode:0xaaaafaf8c040 Failed to determine interlace mode
0:00:00.175484778 4425 0xaaaafaf9b100 WARN v4l2 gstv4l2object.c:2398:gst_v4l2_object_add_interlace_mode:0xaaaafaf8c040 Failed to determine interlace mode
0:00:00.175502954 4425 0xaaaafaf9b100 WARN v4l2 gstv4l2object.c:2398:gst_v4l2_object_add_interlace_mode:0xaaaafaf8c040 Failed to determine interlace mode
0:00:00.175517738 4425 0xaaaafaf9b100 WARN v4l2 gstv4l2object.c:2398:gst_v4l2_object_add_interlace_mode:0xaaaafaf8c040 Failed to determine interlace mode
0:00:00.175570411 4425 0xaaaafaf9b100 WARN v4l2 gstv4l2object.c:4512:gst_v4l2_object_probe_caps:<sink_sub_bin_encoder1:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: Unknown error -1
0:00:00.254116113 4425 0xaaaafaf9b100 WARN v4l2src gstv4l2src.c:695:gst_v4l2src_query:<src_elem> Can't give latency since framerate isn't fixated !

Runtime commands:
    h: Print this help
    q: Quit

    p: Pause
    r: Resume

** INFO: <bus_callback:239>: Pipeline ready

** INFO: <bus_callback:225>: Pipeline running

NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
0:00:00.299559415 4425 0xaaaafaf9c1e0 WARN v4l2bufferpool gstv4l2bufferpool.c:1114:gst_v4l2_buffer_pool_start:<sink_sub_bin_encoder1:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:00.346550571 4425 0xaaaafaf9c360 WARN v4l2bufferpool gstv4l2bufferpool.c:809:gst_v4l2_buffer_pool_start:<src_elem:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:01.217993739 4425 0xaaaafaf9c460 WARN nvstreammux_ntp gstnvstreammux_ntp.cpp:108:check_if_sys_rtcp_time_is_ntp_sync:<src_bin_muxer> warning: Either host or Source 0 seems to be out of NTP sync SYS TIME = 2024-08-16T09:23:38.971Z CALCULATED NTP TIME = 1970-01-01T00:00:00.000Z
WARNING from src_bin_muxer: Either host or Source 0 seems to be out of NTP sync SYS TIME = 2024-08-16T09:23:38.971Z CALCULATED NTP TIME = 1970-01-01T00:00:00.000Z
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmultistream2/gstnvstreammux_ntp.cpp(108): check_if_sys_rtcp_time_is_ntp_sync (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstNvStreamMux:src_bin_muxer
H264: Profile = 66, Level = 0
NVMEDIA: Need to set EMC bandwidth : 1436000
NVMEDIA_ENC: bBlitMode is set to TRUE
0:00:01.292414570 4425 0xffff240132a0 WARN v4l2bufferpool gstv4l2bufferpool.c:1565:gst_v4l2_buffer_pool_dqbuf:<sink_sub_bin_encoder1:pool:src> Driver should never set v4l2_buffer.field to ANY
Encode Latency = 31.042969

BATCH-NUM = 0**
Batch meta not found for buffer 0xffff18003b40
Number of sources in batch = 0

BATCH-NUM = 1**
Batch meta not found for buffer 0xffff20032d80
Number of sources in batch = 0
Encode Latency = 42.096924
Encode Latency = 17.205078
Encode Latency = 18.483154
Encode Latency = 19.746826
Encode Latency = 20.439209
Encode Latency = 21.261963
Encode Latency = 22.332031
Encode Latency = 23.270020
Encode Latency = 24.483154
Encode Latency = 25.879883
Encode Latency = 27.155029

Encode Latency = 28.602051
Encode Latency = 30.265869
Encode Latency = 32.333984
Encode Latency = 38.183105
Encode Latency = 36.671143
Encode Latency = 40.893066
Encode Latency = 43.392090
Encode Latency = 46.370850
Encode Latency = 45.782959
Encode Latency = 45.770996
Encode Latency = 45.906982
Encode Latency = 45.805908
Encode Latency = 45.755127
Encode Latency = 45.762939
Encode Latency = 45.766846
Encode Latency = 45.773926
Encode Latency = 45.758057
Encode Latency = 45.784912
Encode Latency = 50.416992

BATCH-NUM = 2**
Batch meta not found for buffer 0xffff18003000
Number of sources in batch = 0

BATCH-NUM = 3**
Batch meta not found for buffer 0xffff18022900
Number of sources in batch = 0
Encode Latency = 45.828125
Encode Latency = 45.942139
Encode Latency = 45.903076
Encode Latency = 45.856934
Encode Latency = 45.754883
Encode Latency = 45.779053

To narrow down this issue, could you do some tests?

  1. Could you share 1.log generated by the following command? We would like to check the camera's supported formats.
v4l2-ctl -d /dev/video0 --list-formats-ext >1.log
  2. deepstream-app uses the v4l2src plugin to capture. Please use gst-launch to check whether the fps is stable first, and share 2.log generated by the following command.
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=2560,height=1920,framerate=15/1 ! fpsdisplaysink text-overlay=false video-sink=fakesink >2.log
  3. If the fps is stable, we need to check whether it is a performance issue. Please execute "sudo tegrastats > 3.log", then execute the following command. What is the "Encode Latency" value? Please also share 3.log, which is the performance log.
export NVDS_ENABLE_LATENCY_MEASUREMENT=1 && gst-launch-1.0 -v videotestsrc ! video/x-raw,format=YUY2,width=2560,height=1920,framerate=15/1 ! videoconvert ! nvvideoconvert ! nvv4l2h264enc ! fakesink

Do you need to encode 2560x1920 video? You can lower the resolution in [streammux].
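For example, a minimal sketch of that change in the configuration above (1280x960 is only an illustrative value; keep the other [streammux] settings as they are):

[streammux]
# only lower the muxer output resolution so the encoder receives smaller frames
width=1280
height=960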