Jetson Nano Full HD camera: 60 fps has more latency than 30 fps

Hi there,

We developed a driver for a Full HD (1920x1080) Bayer sensor. The sensor's maximum frame rate is 120 fps. We use the Jetson's ISP and stream over GStreamer using nvarguscamerasrc. Everything works fine, but:

We did some latency measurements by filming an LED and the video screen with a 240 fps camera. We measure the latency by counting the frames between the moment the LED lights up in reality and the moment it appears on the screen.

With 30 fps we get a latency of ~ 60 ms.
With 60 fps we get a latency of ~ 115 ms.
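Each frame of the 240 fps reference camera corresponds to 1000/240 ≈ 4.17 ms, so the latency is simply the counted frame gap multiplied by that interval. A small helper makes the conversion explicit (the function name is ours, for illustration only):

```shell
# Convert a frame count from the 240 fps reference camera into
# milliseconds of glass-to-glass latency (1 frame = 1000/240 ms).
frames_to_ms() {
  awk -v f="$1" 'BEGIN { printf "%.1f", f * 1000 / 240 }'
}

frames_to_ms 24   # 24 reference frames -> 100.0 ms
```

A gap of roughly 14 reference frames would therefore correspond to the ~58 ms we see at 30 fps.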

Why does the latency increase with the frame rate? Shouldn't it be the other way around?

GStreamer commands:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! nvoverlaysink sync=false async=false -e

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=60/1' ! nvoverlaysink sync=false async=false -e

hello reifenrath.michel,

may I know whether both your 30 fps and 60 fps sensor modes use the same resolution, i.e. 1920x1080?
You may also check the messages reported by nvarguscamerasrc; they indicate which sensor mode it chooses for sensor streaming.
thanks

Yes, it's exactly the same mode; only the fps is changed. We can see from our driver's debug prints that the right settings are written to the sensor. It seems to be something within the ISP or GStreamer. Is the Jetson Nano not capable of such high data rates?

hello reifenrath.michel,

let's narrow down the issue.
Please use the command below to disable preview rendering and report only the camera output frame rate,
for example:

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

since you're rendering camera frames to a display monitor for checking,
may I also know the refresh rate of your display monitor?
thanks

I work for the same company. The output is:

30fps:

/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 459, dropped: 0, current: 30,00, average: 30,11
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 475, dropped: 0, current: 29,99, average: 30,11
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 491, dropped: 0, current: 30,02, average: 30,10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 507, dropped: 0, current: 29,98, average: 30,10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 523, dropped: 0, current: 30,01, average: 30,10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 539, dropped: 0, current: 30,01, average: 30,09

60fps:

/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 578, dropped: 0, current: 59,59, average: 60,16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 609, dropped: 0, current: 60,29, average: 60,17
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 640, dropped: 0, current: 60,39, average: 60,18
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 670, dropped: 0, current: 59,78, average: 60,16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 700, dropped: 0, current: 59,57, average: 60,14
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 731, dropped: 0, current: 60,60, average: 60,16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 761, dropped: 0, current: 59,63, average: 60,13
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 791, dropped: 0, current: 59,75, average: 60,12

90fps:

/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 641, dropped: 0, current: 90,08, average: 90,32
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 686, dropped: 0, current: 90,00, average: 90,30
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 732, dropped: 0, current: 90,11, average: 90,29
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 778, dropped: 0, current: 91,12, average: 90,34
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 823, dropped: 0, current: 89,79, average: 90,31
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 868, dropped: 0, current: 89,30, average: 90,26

Our display monitor refresh rate is 60Hz.
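If it helps, the current display mode and refresh rate can be confirmed on the Nano's X11 desktop with xrandr (the active mode is marked with an asterisk):

```shell
# Show the currently active display mode; the '*' marks the mode
# in use, with the refresh rate in Hz next to the resolution.
xrandr | grep '\*'
```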

Best regards,
jb

hello busch.johannes,

please specify the format explicitly in the GStreamer pipeline caps and gather the results,
for example:

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -ev

30fps:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -ev

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 1920 x 1080 FR = 90,000001 fps Duration = 11111111 ; Analog Gain range min 1,000000, max 15,937500; Exposure Range min 100000, max 1000000000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 0
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 90,000001
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

and for 60fps:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=60/1, format=NV12' ! nvoverlaysink -ev

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)60/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 1920 x 1080 FR = 90,000001 fps Duration = 11111111 ; Analog Gain range min 1,000000, max 15,937500; Exposure Range min 100000, max 1000000000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 0
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 90,000001
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
^Chandling interrupt.
Interrupt: Stopping pipeline …
EOS on shutdown enabled – Forcing EOS on the pipeline
Waiting for EOS…
Got EOS from element “pipeline0”.
EOS received - stopping pipeline…
Execution ended after 0:00:08.017939097
Setting pipeline to PAUSED …
Setting pipeline to READY …
GST_ARGUS: Cleaning up
CONSUMER: Done Success
GST_ARGUS: Done Success
Setting pipeline to NULL …
Freeing pipeline …

hello busch.johannes,

please evaluate the capture-to-display latency with the gst commands above.
thanks

Hi busch.johannes,

Any update? Is this still an issue that needs support?

We will test it within the next few days. Sorry for the delay, but we have some other work to finish first.

Hi,

I did the test, and with the commands mentioned above we get the same results:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -ev :
56.3 ms

and

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=60/1, format=NV12' ! nvoverlaysink -ev :

114.6 ms

so it's pretty close to our first tests.
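A complementary, software-only check (our suggestion, not from the thread) is GStreamer's built-in latency tracer, which logs per-element latency to the debug output while the pipeline runs; it cannot see display scan-out, but it can show whether the extra latency accumulates inside the pipeline:

```shell
# Run the 60 fps pipeline with GStreamer's latency tracer enabled;
# per-element latency figures appear in the debug log.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" \
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=60/1' ! \
  nvoverlaysink sync=false -e
```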

hello reifenrath.michel,

could you please also profile with the MMAPI samples.
You may profile glass-to-glass latency with 09_camera_jpeg_capture;
for example, there are options you may enable for evaluation:

$ ~/tegra_multimedia_api/samples/09_camera_jpeg_capture -s --cap-time 60 --fps 60 --disable-jpg --sensor-mode 2 --pre-res 1280x720

please put your Nano platform into performance mode for testing.
You might also check a similar discussion thread, Topic 55327, for reference.
thanks
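For reference, performance mode on the Nano is typically set with the standard L4T tools (run as root; -m 0 assumes the maximum-power profile index, which is the default MAXN mapping on the Nano):

```shell
sudo nvpmodel -m 0    # select the maximum-power profile (MAXN)
sudo jetson_clocks    # lock CPU/GPU/EMC clocks at their maximum
```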