External Trigger works but camera cuts FPS in half

Hello everyone, I am having an issue getting an external sync signal to trigger my IMX568 camera. I have confirmed with an oscilloscope that the sync signal I am sending to the camera is correct; I measure 60 Hz on the nose. When I run this GStreamer pipeline:

gst-launch-1.0 nvarguscamerasrc sensor-id=1 saturation=0 aelock=true awblock=true num-buffers=300 ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

I see the following results:

/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 12, dropped: 0, current: 22.32, average: 22.32
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 23, dropped: 0, current: 20.02, average: 21.16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 34, dropped: 0, current: 20.01, average: 20.77
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 45, dropped: 0, current: 19.99, average: 20.58
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 55, dropped: 0, current: 19.92, average: 20.45
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 66, dropped: 0, current: 20.09, average: 20.39
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 76, dropped: 0, current: 19.98, average: 20.34
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 87, dropped: 0, current: 20.01, average: 20.30
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 97, dropped: 0, current: 19.98, average: 20.26
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 108, dropped: 0, current: 20.02, average: 20.24
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 119, dropped: 0, current: 19.99, average: 20.21
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 130, dropped: 0, current: 19.99, average: 20.20
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 141, dropped: 0, current: 20.03, average: 20.18
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 151, dropped: 0, current: 19.99, average: 20.17
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 162, dropped: 0, current: 20.03, average: 20.16
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 172, dropped: 0, current: 19.97, average: 20.15
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 182, dropped: 0, current: 20.00, average: 20.14
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 192, dropped: 0, current: 20.00, average: 20.13
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 202, dropped: 0, current: 19.99, average: 20.13
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 213, dropped: 0, current: 20.02, average: 20.12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 223, dropped: 0, current: 19.79, average: 20.10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 234, dropped: 0, current: 20.17, average: 20.11
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 245, dropped: 0, current: 20.03, average: 20.10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 255, dropped: 0, current: 19.94, average: 20.10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 266, dropped: 0, current: 20.05, average: 20.10
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 277, dropped: 0, current: 20.00, average: 20.09
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 287, dropped: 0, current: 19.97, average: 20.09
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 298, dropped: 0, current: 20.03, average: 20.09

With this pipeline I am only getting 20 FPS. If I add "framerate=60/1" to my caps I get about 30 FPS.
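For reference, the caps with the framerate added look roughly like this (the same pipeline as above, only the caps string changed):

gst-launch-1.0 nvarguscamerasrc sensor-id=1 saturation=0 aelock=true awblock=true num-buffers=300 ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12, framerate=60/1' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v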

I am not sure why I cannot get 60 FPS when I have verified that the external sync is running at 60 Hz on the nose. I am testing this on the following hardware:

  • Jetson NX running JetPack 4.5.1 and L4T 32.5.2
  • Auvidea JNX30M carrier board
  • Vision Components IMX568 monochrome camera
  • Vision Components MIPI repeater board
  • Proprietary sync and timecode device (this is what I am using to generate the external sync)

I have also tried this on NVIDIA's dev kit carrier board with no luck. A few weeks ago I was able to get this working on the Jetson dev kit with the same camera, but only at 30 Hz and 24 Hz. We need to use 4 lanes to achieve 60 Hz, which is why we purchased the Auvidea board.

I know for a fact that the board and camera can do 60 Hz because I was able to achieve that framerate with the camera in free-running mode (no trigger). Does anyone have any ideas what could be causing the camera to cut the FPS in half?

hello frank.delacruz,

are you able to achieve 60 fps with the v4l2-ctl utility? please refer to Applications Using V4L2 IOCTL Directly,
for example,
$ v4l2-ctl --set-fmt-video=width=2472,height=2048,pixelformat=RG10 --stream-mmap --stream-count=300 -d /dev/video0

Hello Jerry,

I have to add "trigger_mode=0" to the v4l2-ctl command to tell the camera not to use the external trigger. This is the result without the external trigger:

v4l2-ctl --set-fmt-video=width=2472,height=2048,pixelformat=RG10 --stream-mmap --stream-count=300 -d /dev/video0 -c trigger_mode=0
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 82.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 80.50 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 79.66 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

When I use the external trigger ("trigger_mode=1"), this is the result:

v4l2-ctl --set-fmt-video=width=2472,height=2048,pixelformat=RG10 --stream-mmap --stream-count=300 -d /dev/video0 -c trigger_mode=1
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 33.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 31.52 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 31.02 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.76 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.61 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.51 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.44 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.38 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.34 fps
<<<<<<<<<<<<<<<<<<<<<<<<<

I am feeding the camera 60 Hz from an external sync source, and I have confirmed with an oscilloscope that the sync source is running at 60 Hz.

UPDATE

I had not noticed that my exposure was set too long, about 1 ms. The exposure needs to be set between 1 µs and 314 µs for the camera to do 60 FPS, so I used the following v4l2-ctl command to change it:

v4l2-ctl --set-fmt-video=width=2472,height=2048,pixelformat=RG10 --stream-mmap --stream-count=300 -d /dev/video1 -c trigger_mode=1 -c exposure=100
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 62.37 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 61.19 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 60.79 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 60.59 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

As you can see, I am now getting 60 fps. However, I am still running into an issue getting GStreamer to actually record at 60 FPS; v4l2-ctl does not seem to persist the camera settings when I set them. fpsdisplaysink seems to work just fine:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 aelock=true ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=1 -v 
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2472, height=(int)2048, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 2472 x 2048 FR = 78.800002 fps Duration = 12690355 ; Analog Gain range min 0.000000, max 48.000000; Exposure Range min 1000, max 314000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 0 
   Output Stream W = 2472 H = 2048 
   seconds to Run    = 0 
   Frame Rate = 78.800002 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 32, dropped: 0, current: 63.22, average: 63.22
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 62, dropped: 0, current: 59.48, average: 61.35
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 93, dropped: 0, current: 60.46, average: 61.05
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 123, dropped: 0, current: 59.93, average: 60.77
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 153, dropped: 0, current: 60.00, average: 60.62
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 184, dropped: 0, current: 59.63, average: 60.45
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 214, dropped: 0, current: 59.95, average: 60.38
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 245, dropped: 0, current: 60.08, average: 60.34

What does not work is when I try to use nvjpegenc with a filesink:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 aelock=true ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12' ! nvjpegenc ! qtmux name=mux ! filesink location=test_0.mov -e

As you can see, the video did not record at 60 FPS. Is there anything I need to do to my GStreamer pipeline to make this work, or do I need to do something else with v4l2-ctl?

hello frank.delacruz,

please try specifying the framerate in the pipeline; you may also try the H.265 encoder to record the video stream,
for example,
$ gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 saturation=0 aelock=true num-buffers=300 ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12, framerate=60/1' ! tee name=streams streams. ! queue ! nvv4l2h265enc bitrate=8000000 ! h265parse ! qtmux ! filesink location=video0.mp4 streams. ! queue ! nvoverlaysink -e

Hey Jerry,

I have tried your pipeline, but all I get is a red screen, so I changed nvoverlaysink to nveglglessink to get it to display. Unfortunately, it produced the same result.
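Roughly, the modified pipeline looked like this (your pipeline with only the display branch swapped; I put nvegltransform in front of nveglglessink since the buffers are NVMM):

gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 saturation=0 aelock=true num-buffers=300 ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12, framerate=60/1' ! tee name=streams streams. ! queue ! nvv4l2h265enc bitrate=8000000 ! h265parse ! qtmux ! filesink location=video0.mp4 streams. ! queue ! nvegltransform ! nveglglessink -e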

I did notice something interesting. I opened a live view of the camera using the following pipeline:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 aelock=true awblock=true exposuretimerange="1000 1000" ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12, framerate=60/1' ! nvegltransform ! nveglglessink -e

I get the following stream:

As you can see, the image did not pick up my exposure value of 100 µs from the v4l2-ctl command. I then sent the v4l2-ctl command from a second terminal window:

v4l2-ctl -d /dev/video0 -c exposure=100 -c black_level=100 -c gain=15

The stream then instantly changed to what 100 µs is supposed to look like.

Don't mind the dark image; we will be using the flash output of the camera to strobe LEDs to brighten it up. But you can see that after applying the v4l2-ctl command from a second window, the camera accepted the new settings. It does not keep the camera settings if I apply them before launching GStreamer. Could this be a device tree issue?
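For now my workaround looks roughly like this (a sketch only; the device node, control values, and the sleep duration are specific to my setup):

#!/bin/bash
# Rough sketch of the workaround: start the Argus pipeline first, then push the
# sensor controls with v4l2-ctl once nvarguscamerasrc has opened the sensor.
gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 aelock=true awblock=true exposuretimerange="1000 1000" ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12, framerate=60/1' ! nvegltransform ! nveglglessink -e &
sleep 3   # give nvarguscamerasrc time to start streaming (value is arbitrary)
v4l2-ctl -d /dev/video0 -c exposure=100 -c black_level=100 -c gain=15
wait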

hello frank.delacruz,

please examine the mode settings in the sensor device tree; you may check default_exp_time, which is configured as 33 ms by default.
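for example, you can dump the running device tree and search for the exposure-related mode properties (a sketch; this assumes the device-tree-compiler package is installed and that your sensor's mode entries use the property names from the guide):
$ dtc -I fs -O dts -o extracted.dts /proc/device-tree
$ grep -n "exposure_factor\|min_exp_time\|max_exp_time\|default_exp_time" extracted.dts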
you may also see the developer guide, Sensor Software Driver Programming, for reference, thanks

Hello Jerry!

Good news! I checked out the Sensor Software Driver Programming link you sent and noticed that the exposure values are in µs. I also noticed another variable named exposure_factor, which was set to 1000000. I changed it to 1, and I think that might have done the trick.

I then re-tested my GStreamer pipeline:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 aelock=true ! 'video/x-raw(memory:NVMM), width=2472, height=2048, format=NV12' ! nvjpegenc ! qtmux name=mux ! filesink location=test_0.mov -e

and I was able to achieve 60 FPS using my external trigger.
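For anyone who runs into the same thing: after updating the device tree, the exposure control range the driver reports can be double-checked with v4l2-ctl (assuming the camera is /dev/video0):

$ v4l2-ctl -d /dev/video0 --list-ctrls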
