I wanted to create a C++ application to display and record a camera feed (CSI MIPI camera) and set its parameters. For the first iteration I wrote a short program that used OpenCV with a GStreamer pipeline to open the camera. It works fine, but when I tried to set the exposure or gain values it returned an error:
Exposure: [ WARN:0@0.630] global cap_gstreamer.cpp:1914 getProperty OpenCV | GStreamer warning: unhandled property: 15
0
GAIN : [ WARN:0@0.630] global cap_gstreamer.cpp:1914 getProperty OpenCV | GStreamer warning: unhandled property: 14
0
So I started to look for a solution, but all I found was about the Argus API. So I decided to start over and learn something new. I downloaded the required sources, followed the README.txt's guidance, and built and installed everything properly. Then I started trying out the samples, but it looks like none of them have access to the frames, even when I run them with sudo. I attach the messages from the OneShot and averageBayer examples:
sudo ./argus_oneshot
Executing Argus Sample: argus_oneshot
Argus Version: 0.98.3 (multi-process)
Capturing from device 0 using sensor mode 0 (2592x1944)
Failed to get IFrame interface
BayerAverageMap:
sudo ./argus_bayeraveragemap
Executing Argus Sample: argus_bayeraveragemap
Argus Version: 0.98.3 (multi-process)
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Creating context.
CONSUMER: Connecting to stream.
CONSUMER: Connected to stream.
CONSUMER: Waiting until producer is connected…
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer is connected; continuing.
(Argus) Error Timeout: (propagating from src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 137)
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
CONSUMER: No more frames. Cleaning up.
CONSUMER: Done.
(Argus) Error Timeout: (propagating from src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 137)
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
(Argus) Error InvalidState: Argus client is exiting with 2 outstanding client threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 357)
PRODUCER: Done – exiting.
If I check the camera with the OpenCV application, it shows the picture properly; also, if I open the camera with nvgstcapture-1.0, it works properly as well.
My goal was to set the exposure and gain values from a custom application while watching the camera feed. But as far as I know that is impossible with OpenCV (with a CSI camera), so that's why I started over with libargus, but now I am not able to open the camera even with the samples. I would be grateful for any help.
For nvarguscamerasrc, use the exposuretimerange and gainrange properties to set the exposure and gain. For the MMAPI sample problem, you may need to confirm that the MMAPI release matches the BSP version.
Hello, thank you for your fast response. I have some additional information since my first post, I have a Jetson Nano with a fresh install and the imx219 default csi camera. On this board the samples worked properly and this board has the following specification:
4.9.253-tegra and R32 revision 7.1 nvidia-jetpack 4.6.4-b39
The problematic board is also a Jetson Nano, but it has a custom driver for an ov5647 CSI camera. I thought the driver worked properly because I could open the camera with nvgstcapture-1.0 and also with a custom C++ application using OpenCV and GStreamer. But now I have doubts about the driver, so I compared it with the default kernel drivers. To be honest, the driver is based on the ov5693 and imx219 examples, and I did not find the possible source of the problem. This board has the following specifications:
I checked the NVIDIA Jetson Linux Driver Package documentation but did not find any clue about compatibility. Can you help me determine the source of the problem? Is it somehow the driver (but in that case I do not understand why the other applications are able to open the camera), or is it a version compatibility issue?
Based on the sample applications' messages, the apps were not able to acquire images from the camera.
But with the suggested lines I got the following result:
nvidia-l4t-jetson-multimedia-api/stable, now 32.7.4-20230608212426 arm64 [installed]
nvidia-l4t-jetson-multimedia-api/stable, now 32.7.3-20221122092935 arm64
nvidia-l4t-jetson-multimedia-api/stable, now 32.7.2-20220420143418 arm64
nvidia-l4t-jetson-multimedia-api/stable, now 32.7.1-20220219090432 arm64
and uname -r returns:
4.9.337-tegra
When I installed the Jetson I used the 32.7.4 package, so I think it should be this version. Which other software components should I check for version mismatches?
I tested it with the available resolutions (2592×1944, 1920×1080, 640×480), but it only works with the 640×480 resolution; the other two exit with this error message:
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
But with the 640×480 resolution it works properly.
Could this mean that the driver's higher-resolution settings are bad, and that the Argus sample tries to open the camera in a higher-resolution mode and is then unable to read the frames? As far as I know, the only differences between these resolutions are the register values. I have them from the Linux repository and they look fine to me.