My bad,
I disabled the wrong prop in menuconfig.
Restarted with the defconfig, made my changes, and now it's working.
Thanks
The above command worked fine for me and I was able to capture the image data. I verified the image data using Irfanview. I obtained the yavta tool from here.
$ git clone git://git.ideasonboard.org/yavta.git
$ cd yavta
$ make
Many thanks, it seems the git repo for yavta I chose was a weird mod (do not use this at home: GitHub - fastr/yavta: fork of git://git.ideasonboard.org/yavta.git). It captures frames now with the command above, and the “MW_ACK_DONE” error is gone as well.
I chose the wrong repository as well - many thanks for saving my time!
Hello guys,
I was searching for a working Bayer image to RGB converter, since the ISP onboard still cannot be controlled, but wasn’t able to find one. Eventually, I intend to write some CUDA code to do the conversion on the fly and feed its output to the gstreamer compressor (that will not accept RAW data, as discussed somewhere in the topics).
Well for now, just as proof of concept I wrote a simple and crude converter in python:
[url]https://github.com/shondll/raw2rgb[/url]
Hope it can be of help to someone out there ;)
Thanks amitev. Your converter is very helpful.
By looking at the code written by amitev:
#include <cstdio>
#include <cstdint>
#include <opencv2/opencv.hpp>

#define PIC_WIDE 1920
#define PIC_HIGH 1080
#define getPx(x,y) buf[(y)*PIC_WIDE + (x)]

uint16_t buf[PIC_WIDE * PIC_HIGH];   /* 10-bit Bayer samples, one per pixel */
uint32_t image[PIC_WIDE * PIC_HIGH]; /* packed BGRA output */

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <raw-file>\n", argv[0]);
        return 1;
    }
    FILE *stream = fopen(argv[1], "rb");
    if (!stream) {
        perror("fopen");
        return 1;
    }
    fread(buf, 2, PIC_WIDE * PIC_HIGH, stream);
    fclose(stream);

    cv::Mat mat(PIC_HIGH, PIC_WIDE, CV_8UC4, image);
    uint16_t r = 0, g = 0, b = 0;
    /* Bilinear demosaic; skips the 1-pixel border so neighbor lookups stay in bounds. */
    for (int y = 1; y < PIC_HIGH - 1; y++) {
        for (int x = 1; x < PIC_WIDE - 1; x++) {
            if ((y % 2) == 0) {
                if ((x % 2) == 0) {       /* blue site */
                    b = getPx(x, y);
                    r = (getPx(x - 1, y - 1) + getPx(x - 1, y + 1) + getPx(x + 1, y - 1) + getPx(x + 1, y + 1)) / 4;
                    g = (getPx(x, y - 1) + getPx(x, y + 1) + getPx(x - 1, y) + getPx(x + 1, y)) / 4;
                } else {                  /* green site on a blue row */
                    g = getPx(x, y);
                    r = (getPx(x, y - 1) + getPx(x, y + 1)) / 2;
                    b = (getPx(x - 1, y) + getPx(x + 1, y)) / 2;
                }
            } else {
                if ((x % 2) == 0) {       /* green site on a red row */
                    g = getPx(x, y);
                    r = (getPx(x - 1, y) + getPx(x + 1, y)) / 2;
                    b = (getPx(x, y - 1) + getPx(x, y + 1)) / 2;
                } else {                  /* red site */
                    r = getPx(x, y);
                    b = (getPx(x - 1, y - 1) + getPx(x - 1, y + 1) + getPx(x + 1, y - 1) + getPx(x + 1, y + 1)) / 4;
                    g = (getPx(x, y - 1) + getPx(x, y + 1) + getPx(x - 1, y) + getPx(x + 1, y)) / 4;
                }
            }
            /* Scale 10-bit samples down to 8 bits and pack as BGRA. */
            b >>= 2;
            g >>= 2;
            r >>= 2;
            image[y * PIC_WIDE + x] = (b | (g << 8) | (r << 16));
        }
    }
    cv::imshow("hog-image.jpx", mat);
    cv::waitKey();
    return 0;
}
Hi all,
Was anybody successful in using the v4l2src plugin in gstreamer to stream the OV5693 camera? Any working pipelines available?
Hi,
We were able to capture using GStreamer. What happens is that the version distributed with JetPack only supports Bayer RAW8, not RAW10, so when the v4l2src plugin tries to find a suitable format it never succeeds. GStreamer needs to be patched in order to capture with v4l2src when the driver only supports RAW10:
Compile gstreamer on tegra X1 and X2 - RidgeRun Developer Connection
Then you should be able to run pipelines, for example:
gst-launch-1.0 -vvv v4l2src ! 'video/x-bayer, width=(int)1920, height=(int)1080, format=(string)rggb, framerate=(fraction)30/1' ! fakesink silent=false
gst-launch-1.0 -vvv v4l2src num-buffers=10 ! 'video/x-bayer, width=(int)1920, height=(int)1080, format=(string)rggb, framerate=(fraction)30/1' ! multifilesink location=test%d.raw
Good Luck :) ,
JJ