I was trying to build an application where an image and a video have to be streamed simultaneously. After finding the nvgstcapture executable link in my JetPack API, I realized my work could be eased. But I was unable to find the source of this executable. Can I get the directory of the source, or a source archive of nvgstcapture, so that I can proceed with my work?
Please download it and give it a try. If you use another release, please go to the L4T Archive.
Thank you. That was really helpful.
Hello, now I'm familiar with the nvgstcapture source. I found that we can set the capture parameter values such as white balance, saturation, etc. But I'm looking to switch between different values of a specific parameter dynamically (at runtime). I tried a few methods, but was not really successful. Can I get any help with this?
Now I have found a solution for changing parameters dynamically. This can be achieved by adding a while(1) loop (or any endless loop, or even a switch-case) that reads a value from the user and updates the corresponding field of the parameter structure. This loop has to be at the end of the main function, since nvgstcapture works on a running GStreamer pipeline (source to sink).
The snippet would look like:

    int wb_val = 0;
    while (1) { /* only for white balance */
        printf("Enter the white balance value: ");
        if (scanf("%d", &wb_val) != 1)
            break;
        g_object_set (G_OBJECT (app->ele.vsrc), "wbmode", wb_val, NULL);
    }
    return ((app->return_value == -1) ? -1 : 0); /* return statement */
For tuning your camera parameters, you may find it easier to use the argus_camera application, which has a GUI with controls.
If not yet done, you may build and install with:
sudo su
cd /usr/src/jetson_multimedia_api/argus/
# See /usr/src/jetson_multimedia_api/argus/README.txt for details and additional dependencies.
mkdir build && cd build
cmake ..
make -j6
make install
exit
# Now run it:
/usr/local/bin/argus_camera &
In the source code provided, I found that there is a typedef struct AuxBufferData with a void pointer sensor_data.
When I tried to print the void pointer sensor_data, it printed 5 different addresses. Can you kindly confirm what it is pointing to?
Can you also explain how to find the size of sensor_data, and whether we can get a raw/Bayer image from the pointer sensor_data?
We need the raw/Bayer image that the camera provides.
The solution you provided worked well. Can you tell me where to find the source? If it is not here on the Jetson, can you please provide it?
Thanks in advance.
The AuxBufferData structure is deprecated, and it does not point to a raw image. Through Argus, the frames are processed in the ISP engine. To get a raw image, please run the v4l2-ctl command. Please refer to this topic:
[32.3.1] v4l2-ctl raw streaming NOT working with OV5693 after using argus_camera app
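As a concrete sketch, a raw Bayer frame can be captured directly over V4L2 with a command along these lines. The device node, resolution, and RG10 pixel format below are assumptions for an OV5693 sensor, so adjust them to your sensor's modes (listed by `v4l2-ctl --list-formats-ext`); setting bypass_mode=0 is what allows raw streaming after Argus has used the camera, as the linked topic explains:

```shell
# Capture one raw Bayer frame from the sensor, bypassing the ISP.
# Width, height, and pixelformat are sensor-specific (example: OV5693).
v4l2-ctl -d /dev/video0 \
  --set-fmt-video=width=2592,height=1944,pixelformat=RG10 \
  --set-ctrl bypass_mode=0 \
  --stream-mmap --stream-count=1 \
  --stream-to=frame.raw
```

The resulting frame.raw contains the unprocessed Bayer data, which you would then debayer offline yourself.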
Is there any other way through which I can get the raw frame using Argus (i.e., before the frames are processed in the ISP engine)?
No. To get raw frames, the only way is to run the v4l2-ctl command.