Lost frame and metadata using nvarguscamerasrc

We are using a Sony mipi sensor (iMX568) with a Jetson Nano board (JP 4.6 LT 32.6.1) using GStreamer to provide the sensor frames to our (C++) application using a GStreamer appsink element.
I have modified the nvarguscamerasrc source in order to pass additional metadata to our application through the GStreamer pipeline, using GStreamer's qdata mechanism:
#1 Argus library metadata via ICaptureMetadata
#2 sensor metadata (embedded data line data) via ISensorPrivateMetadata

All this works fine except for two issues:
Issue #1
I can detect that sensor frames are being dropped.
Issue #2
I can detect that we can lose the synchronization between the frame buffer data and the frame sensor metadata,
i.e. the frame “image” is for frame “N” but the “sensor metadata” is for frame “N+1”.

The latter issue is a disaster for our application.
Please note - I am detecting these issues in nvarguscamerasrc StreamConsumer::threadExecute(), i.e. at the very front of the GStreamer pipeline - the very first place that I have access to in source code. (I am not detecting them after the appsink element in my application, i.e. this is not occurring in the GStreamer pipeline.)
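As an illustration of how such a mismatch can be caught defensively (this is a hypothetical sketch, not code from the modified source or from any NVIDIA API): carry the Argus frame number alongside both the image buffer and its sensor metadata, and pair them by frame number rather than by arrival order, so an N / N+1 mismatch is detected instead of silently consumed.

```cpp
#include <cassert>
#include <cstdint>
#include <map>

// Hypothetical: metadata keyed by the Argus frame number
// (iFrame->getNumber()); fields and names are illustrative.
struct SensorMeta { uint64_t frameNumber; /* embedded-data fields ... */ };

class MetaMatcher {
public:
    void addMeta(const SensorMeta& m) { pending_[m.frameNumber] = m; }

    // Returns true and fills 'out' only if metadata for exactly this
    // frame number has been seen; a mismatched or missing record is
    // reported as a failure rather than paired with the wrong image.
    bool match(uint64_t imageFrameNumber, SensorMeta* out) {
        auto it = pending_.find(imageFrameNumber);
        if (it == pending_.end()) return false;  // lost or not yet arrived
        *out = it->second;
        pending_.erase(it);
        return true;
    }
private:
    std::map<uint64_t, SensorMeta> pending_;
};
```

This does not stop the drops, but it converts a silent image/metadata mismatch into a detectable event the application can handle.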

I can tell that the frames are not being dropped by the sensor but are being dropped somewhere in the chain between the ISP / nvargus-daemon / ArgusLib / nvarguscamerasrc. I believe that the frames are at least being received by the ISP because, when a frame is dropped, the Argus frame count (obtained via iFrame->getNumber()) jumps by 2 in the subsequent frame.
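The detection just described can be sketched as a small helper (hypothetical, not the actual nvarguscamerasrc change): feed it successive iFrame->getNumber() values and count any jump greater than one as dropped frames.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical drop counter: a jump of k (> 1) in the Argus frame number
// means k-1 frames were dropped somewhere upstream (ISP / nvargus-daemon /
// ArgusLib) before reaching the consumer thread.
class DropCounter {
public:
    // Returns the number of frames missed immediately before this one.
    uint64_t onFrame(uint64_t frameNumber) {
        uint64_t missed = 0;
        if (seen_ && frameNumber > last_ + 1)
            missed = frameNumber - last_ - 1;
        last_ = frameNumber;
        seen_ = true;
        dropped_ += missed;
        return missed;
    }
    uint64_t totalDropped() const { return dropped_; }
private:
    uint64_t last_ = 0;
    uint64_t dropped_ = 0;
    bool seen_ = false;
};
```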

I have found that this issue is much worse if I use “busy” real-time threads in my application (i.e. threads created with pthread_create() and a thread priority greater than 0).
I managed to improve things by using schedtool to start nvargus-daemon at a higher priority (I used 85), but the problem persists.

I have now reverted to only using threads with priority 0 - the frequency of these issues has reduced, but they still occur often enough to make our application almost unusable.
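For context, the kind of thread setup being described can be sketched as follows (a hypothetical helper, not code from the application): request a SCHED_FIFO real-time priority at creation time, and fall back to the default SCHED_OTHER policy (priority 0, the configuration that reduced the drop rate above) when the process lacks real-time privileges.

```cpp
#include <cerrno>
#include <pthread.h>
#include <sched.h>

// Hypothetical: create a worker thread with an explicit SCHED_FIFO
// priority. Without CAP_SYS_NICE / root, pthread_create returns EPERM,
// in which case we retry with the default attributes (SCHED_OTHER,
// priority 0). Returns 0 on success, otherwise an errno value.
static int createWorker(pthread_t* tid, void* (*fn)(void*), void* arg,
                        int rtPrio) {
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED);
    pthread_attr_setschedpolicy(&attr, SCHED_FIFO);
    sched_param sp{};
    sp.sched_priority = rtPrio;
    pthread_attr_setschedparam(&attr, &sp);
    int rc = pthread_create(tid, &attr, fn, arg);
    if (rc == EPERM)                       // no real-time privileges
        rc = pthread_create(tid, nullptr, fn, arg);
    pthread_attr_destroy(&attr);
    return rc;
}
```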

Have you come across issues like this before with nvargus-daemon/ArgusLib/nvarguscamerasrc?

Are there any workarounds?

Many thanks in advance,
nvarguscamerasrc.zip (39.9 KB)

hello chris.wallett,

is it related to processing metadata? have you tried testing the stream stability with metadata disabled?

Hi Jerry,
By “processing the metadata” do you mean the calls to getMetadataSize() and getMetadata() ?

hello chris.wallett,

oh… I meant disable the embedded metadata output from the sensor driver side. you should also configure the device tree property embedded_metadata_height = "0"; this property sets the sensor embedded metadata height in units of rows.
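For reference, that property lives in the sensor mode node of the device tree; the node name below is illustrative, only the property itself is from this thread:

```dts
mode0 { /* illustrative mode node under the imx568 sensor node */
    /* height of the embedded metadata region in rows; "0" disables it */
    embedded_metadata_height = "0";
};
```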

Hi Jerry,
I have carried out the test and disabling the metadata makes no difference.
I tested with nvargus-daemon running at “Normal” priority (i.e. running as a service via systemctl) and my test apps using real-time priorities.
The version using sensor metadata had a 9.7% frame drop rate (over a 5 minute test); the version where no sensor metadata was being sent by the sensor still had an 8.8% frame drop rate.

Hi Jerry,
Any response?

hello chris.wallett,

may I also know what’s the IMX568 output data-rate?
please refer to the Jetson Nano Module Data Sheet; the max throughput of Jetson Nano is 1.5 Gbps.

Hi Jerry,
We are using the IMX568 in two-channel MIPI mode, 2488x2048 10-bit data @ 40 frames/sec.
I think that this is 204 Mpix/s (the Nano limit is 1400 Mpix/s) and ~1.0 Gbps per channel (the Nano limit is 1.5 Gbps).
So I think that we are inside the Nano limits.
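The arithmetic above can be checked directly (pure calculation using the figures from this post; the helper name is mine):

```cpp
#include <cassert>
#include <cstdint>

// Bandwidth check for the IMX568 settings quoted above:
// 2488 x 2048 pixels, 10 bits/pixel, 40 fps, over 2 MIPI CSI channels.
struct CsiLoad { double mpixPerSec; double gbpsPerChannel; };

static CsiLoad csiLoad(uint32_t w, uint32_t h, uint32_t bitsPerPix,
                       uint32_t fps, uint32_t channels) {
    double pixPerSec  = double(w) * h * fps;         // pixels per second
    double bitsPerSec = pixPerSec * bitsPerPix;      // total bit rate
    return { pixPerSec / 1e6, bitsPerSec / channels / 1e9 };
}
```

csiLoad(2488, 2048, 10, 40, 2) gives roughly 203.8 Mpix/s and ~1.02 Gbps per channel, consistent with the claim of being inside the Nano's 1400 Mpix/s and 1.5 Gbps limits.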
Chris W.

hello chris.wallett,

let’s narrow down the issue by testing with v4l2 IOCTL to check sensor basic functionality.
for instance,
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap

Hi Jerry,
This is what I get using the 2472x2048 image size supported by the driver.
QRO-Camera-Log.log (16.4 KB)
FPSData.zip (14.3 KB)
Is this what you would expect?

hello chris.wallett,

could you please execute argus_camera with the --kpi option enabled, and share the results.

the normal data type is separated from the embedded data type.
there's the capability to configure the PP (pixel parser); you may dig into the TRM, under [Embedded Data Capture Modes].
note, by default it uses Embedded Capture using Two PPs.

Hi Jerry,
Sorry but I’ve never used argus_camera.
Can you point me to some further info on how to install (or build) it, how to use it, and what sort of information you are looking for from it?

hello chris.wallett,

it is included in the MMAPI package, $ sudo apt install nvidia-l4t-jetson-multimedia-api
please follow /usr/src/jetson_multimedia_api/argus/README.TXT for the steps to build the argus_camera application.