Hi,
I have two GMSL cameras (ar0231-rccb-bae-sf3325) connected on Port A.
I am using a ROS driver to ingest the images. The ROS driver uses the DriveWorks API and follows the Camera workflow as described in the documentation. I get a consistent frequency of 30 Hz when only the driver is running. As soon as I start ingesting the images in a stereo-disparity ROS node, the frequency drops gradually and settles somewhere between 20 and 25 Hz.
I am not sure what is causing the drop. When I check tegrastats, I see that, of the 6 CPU cores, cores 1, 2, 3, and 6 are running at almost 100% while cores 4 and 5 are nearly idle. GPU utilization is around 50%.
The target application only needs around 10 Hz. Is it possible to reduce the camera frame rate from 30 Hz to 10 Hz? I saw many forum posts asking this question, but none reached a conclusion.
Yes. I meant frame rate.
The ROS node continuously runs the ‘Camera workflow’ in a while loop and publishes the images on ROS topics. The images are timestamped with dwImage_getTimestamp(). ROS provides tools to check the publishing frequency of topics.
I can drop frames before publishing, but I believe that will not solve the main issue. From what I understand, the frame rate is decided by the camera hardware and is not adjustable (at least I don't know how; please correct me if I am wrong).
From the documentation I see that:
Every frame acquisition stores the image in an internal, low-level (under DriveWorks) buffer. The low-level NvMedia Image Processing Pipeline (IPP) outputs such a buffer. You can adjust the size of this buffer with the fifo-size parameter. DriveWorks applications use dwSensorCamera_readFrame() to take the frames from the low-level buffer. This function removes frames from the beginning of the FIFO and stores them in Camera. If the FIFO is filled, any other reading drops a frame (ICP drop).
So I suspect ICP drops are occurring (due to the CPU load?) and I cannot control them. Unless the low-level buffer itself were filled at 10 FPS instead of 30 FPS, ICP drops mean I may not get consistently spaced images.
So, do you check with dwImage_getTimestamp() or with the ROS tools? Could you share the ROS tools?
The timestamp from dwImage_getTimestamp() is added to the image properties in the ROS message, which is published as a ROS topic.
The following tool is used to measure the rate: http://wiki.ros.org/rostopic#rostopic_hz
Regarding dropping frames, I meant that your ROS node can drop some frames and doesn't need to publish every camera frame. Won't that help with the issue?
Yes, I will try this out. It will avoid unnecessary image conversion and compression for every frame and should reduce the computational load.
Thanks
Just to confirm: in order to drop an image this way, can I do the following?
1) dwSensorCamera_readFrame()
2) dwSensorCamera_getImage()
3) dwImage_getTimestamp()
4) dwSensorCamera_returnFrame()
4b) Check whether the timestamp difference since the last published frame is at least the required period. If it is too soon, go back to 1); otherwise continue.
5) dwImage_create()
6) dwImage_copyConvert()
7) Image compression and send
I want to confirm that when step 4) is executed, it pops the image from the internal FIFO buffer, right? So that next time I read a new image and not the same one.