So *data points to where the raw image data for the YCbCr channels is stored. When I read values from that location, I see that the values change when I move the camera, so the location I am reading from should be correct. But I still cannot figure out how the channels are mapped into that memory.
According to the EGL library, the way raw images are saved in the buffer depends on the producer (LibArgus).
The pixel format I have selected is PIXEL_FMT_YCbCr_420_888.
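For what it's worth, the exact layout depends on the producer and on whether the buffer is pitch-linear or block-linear, so treat the following as a sketch under the assumption of a pitch-linear, semi-planar 4:2:0 buffer (NV12-style: plane 0 is full-resolution Y, plane 1 is interleaved Cb/Cr at half resolution). The pointer and pitch names are placeholders for whatever mapBuffer()/getStride() return on your setup:

    #include <cstdint>

    struct Ycbcr { uint8_t y, cb, cr; };

    // Sample pixel (x, row) from an assumed pitch-linear, semi-planar
    // 4:2:0 layout: full-resolution Y plane plus a half-resolution
    // plane with Cb and Cr bytes interleaved.
    Ycbcr samplePixel(const uint8_t *yPlane, const uint8_t *cbcrPlane,
                      uint32_t yPitch, uint32_t cbcrPitch,
                      uint32_t x, uint32_t row)
    {
        Ycbcr out;
        out.y = yPlane[row * yPitch + x];
        // Chroma is subsampled 2x in both directions and interleaved.
        const uint8_t *cbcr = cbcrPlane + (row / 2) * cbcrPitch + (x / 2) * 2;
        out.cb = cbcr[0];
        out.cr = cbcr[1];
        return out;
    }

If the values you read jump around in tiles rather than rows, the buffer is probably block-linear and simple pointer arithmetic like this will not match.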
I tried using NvBuffer, but it makes everything slow and it crashes randomly.
In the yuvJpeg example, I see the following code printing some sample data from the buffer:
for (uint32_t i = 0; i < iImage->getBufferCount(); i++)
{
    // mapBuffer() returns a const void*; cast it to a byte pointer.
    const uint8_t *d = static_cast<const uint8_t*>(iImage->mapBuffer(i));
    if (!d)
        ORIGINATE_ERROR("\tFailed to map buffer\n");
}
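For reference, here is a sketch of how that loop can print something per plane. It assumes the image also exposes the EGLStream::IImage2D interface (obtained via interface_cast on the same image) and that your Argus release provides IImage2D::getStride(); check Argus/Image.h for the exact signatures on your version:

    // Assumption: iImage2D = interface_cast<EGLStream::IImage2D>(image).
    // For PIXEL_FMT_YCbCr_420_888 there is typically one buffer per plane.
    for (uint32_t i = 0; i < iImage->getBufferCount(); i++)
    {
        const uint8_t *d = static_cast<const uint8_t*>(iImage->mapBuffer(i));
        if (!d)
            ORIGINATE_ERROR("\tFailed to map buffer\n");

        uint32_t stride = iImage2D->getStride(i); // bytes per row of plane i
        printf("Plane %u: stride=%u, d[0]=%u, d[stride]=%u\n",
               i, stride, d[0], d[stride]);
    }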
Could you provide a patch so that we can reproduce the issue with any sample in ~/tegra_multimedia_api/?
We can take a look at whether anything is wrong in how the APIs are being used.
NvBufferMemMap() and then NvBufferMemSyncForCpu(). The video stream I get this way is not as good as the one from the argus_camera example, which seems to use OpenGL as the consumer, if I understand correctly.
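Concretely, the CPU-read path looks roughly like this. This is a minimal sketch assuming a dmabuf fd obtained from the consumer (the fd name and the plane index are placeholders); note that NvBufferMemSyncForCpu() takes the mapped pointer, which is why the map has to come before the sync:

    #include "nvbuf_utils.h"
    #include <cstdint>
    #include <cstdio>

    bool readYPlane(int dmabuf_fd)
    {
        NvBufferParams params;
        if (NvBufferGetParams(dmabuf_fd, &params) != 0)
            return false;

        // Map plane 0 (Y) for reading; the mapping must exist before
        // the sync, because the sync operates on the mapped pointer.
        void *ptr = NULL;
        if (NvBufferMemMap(dmabuf_fd, 0, NvBufferMem_Read, &ptr) != 0)
            return false;

        // Invalidate CPU caches so we read the frame just written by hardware.
        NvBufferMemSyncForCpu(dmabuf_fd, 0, &ptr);

        const uint8_t *y = static_cast<const uint8_t*>(ptr);
        printf("Y[0][0] = %u (pitch %u)\n", y[0], params.pitch[0]);

        NvBufferMemUnMap(dmabuf_fd, 0, &ptr);
        return true;
    }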
I am able to get around 30 FPS, but I cannot get 60 FPS from the camera because the functions mentioned above take too much time.
If I take an image from the frame consumer, convert it to RGB, and display it in a GL window, it is not as crisp as what I see in your examples.
Also, it sometimes looks like acquireFrame() randomly returns a previous frame. (I checked this with a stopwatch in front of the camera.)
The examples that use OpenGL to view the camera work well. The best option would be to read the GL buffer back to get a frame, but unfortunately the buffer seems to be empty all the time.
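For completeness, the readback I am attempting is essentially a plain glReadPixels() on the bound framebuffer, along these lines (a sketch; width and height are placeholders). As far as I know it has to run on the thread that owns the GL context, after the frame is rendered and before eglSwapBuffers() leaves the back buffer undefined, which may be why I see empty data:

    #include <GLES2/gl2.h>
    #include <cstdint>
    #include <vector>

    // Read the currently bound framebuffer back into CPU memory.
    // Must run on the thread owning the EGL/GL context, after
    // rendering and before the buffer swap; by default the back
    // buffer's contents are undefined after eglSwapBuffers().
    std::vector<uint8_t> readbackRgba(int width, int height)
    {
        std::vector<uint8_t> pixels(static_cast<size_t>(width) * height * 4);
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE,
                     pixels.data());
        return pixels;
    }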
It would be better to have an implementation that uses less CPU.
Currently I am using an IMX185, which is 2 MP at 60 FPS. I am planning to use a 4K camera at 30 FPS in the future, and I am not sure whether grabbing the frames will eat up all the remaining CPU.
I have the same problem as you have mentioned:
"I get these black lines moving around the image randomly in the Y channel when I use the APIs in nvbuf_utils.h".
I am using a TX1 with JetPack 3.0, an LI-TX1-CB and two IMX290 sensors. Any ideas?
I am aware of this, but the person who asked this question was using JetPack 3.1 and had the same issue with black lines. Do you have any explanation for it?
Which is the correct order: NvBufferMemMap() and then NvBufferMemSyncForCpu(), or NvBufferMemSyncForCpu() and then NvBufferMemMap()?
And why, with JetPack 3.1 on the TX2, can I not run an Argus program over SSH after exporting DISPLAY=:0? It fails to initialize the EGL display. There was no such issue in the previous version.
Invalid MIT-MAGIC-COOKIE-1 key
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 75)
(Argus) Error NotSupported: Failed to get default display (in src/api/OutputStreamImpl.cpp, function initialize(), line 80)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 565)