Hello
I want OpenCL to read an OpenGL texture via CL/GL interop.
However, when I call read_imageui, the returned value is a very large number.
For example, if the R value in the OpenGL texture is 1, read_imageui returns 998277249.
Details:
I create a cube map in OpenGL, which works fine:
glGenTextures(1, &this->gl_evnTextureHandle);
glBindTexture(GL_TEXTURE_CUBE_MAP, this->gl_evnTextureHandle);
// upload the +X face: 8-bit RGBA internal format, unsigned-byte source data
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGBA8,
             cubeLenth, cubeLenth, 0, GL_RGBA, GL_UNSIGNED_BYTE, this->m_cubeTex[0]);
// other 5 faces …
// then I share the texture with OpenCL
m_cubeMapCL[0] = clCreateFromGLTexture2D(this->m_GPUComputer->getContext(), CL_MEM_READ_ONLY,
                                         GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, this->gl_evnTextureHandle, &erro);
oclCheckError(erro, CL_SUCCESS);
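To see which image format OpenCL actually assigns to the shared texture, the format can be queried right after the create call (just a sketch; the values are simply dumped as hex):

cl_image_format fmt;
clGetImageInfo(m_cubeMapCL[0], CL_IMAGE_FORMAT, sizeof(fmt), &fmt, NULL);
printf("channel order = 0x%X, channel data type = 0x%X\n",
       (unsigned)fmt.image_channel_order, (unsigned)fmt.image_channel_data_type);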
// with a sampler (non-normalized coordinates, clamp-to-edge, nearest filtering)
this->m_pixelSampler = clCreateSampler(m_cxGPUContex, CL_FALSE,
                                       CL_ADDRESS_CLAMP_TO_EDGE, CL_FILTER_NEAREST, &m_ciErrNum);
shrCheckError(m_ciErrNum,CL_SUCCESS);
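For completeness, the kernel arguments are bound in the usual way before the launch (a sketch; m_kernel and m_outBufferCL are placeholder names for my actual kernel and output buffer):

clSetKernelArg(m_kernel, 0, sizeof(cl_mem), &this->m_cubeMapCL[0]);     // the shared face image
clSetKernelArg(m_kernel, 1, sizeof(cl_sampler), &this->m_pixelSampler); // the sampler created above
clSetKernelArg(m_kernel, 2, sizeof(cl_mem), &m_outBufferCL);            // the "__global uint *out" buffer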
// then run the kernel
this->m_ciErrNum = clEnqueueAcquireGLObjects(this->m_cqCommandQueue,6,this->m_cubeMapCL,0, NULL, NULL);
oclCheckError(m_ciErrNum, CL_SUCCESS);
// invoke clEnqueueNDRangeKernel to run the kernel on this->m_cubeMapCL[0], for example (the launch is sketched below, after the release)
this->m_ciErrNum = clEnqueueReleaseGLObjects(this->m_cqCommandQueue,6,this->m_cubeMapCL,0, NULL, NULL);
oclCheckError(m_ciErrNum, CL_SUCCESS);
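The launch between the acquire and the release is roughly the following (a sketch; globalSize and hostOut are placeholder names, most error checks omitted):

size_t globalSize = cubeLenth * cubeLenth;  // one work-item per texel of one face
this->m_ciErrNum = clEnqueueNDRangeKernel(this->m_cqCommandQueue, m_kernel, 1, NULL, &globalSize, NULL, 0, NULL, NULL);
oclCheckError(m_ciErrNum, CL_SUCCESS);
// after the release, a blocking read copies the results back so color.x can be inspected on the host
clEnqueueReadBuffer(this->m_cqCommandQueue, m_outBufferCL, CL_TRUE, 0, globalSize * sizeof(cl_uint), hostOut, 0, NULL, NULL);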
// the kernel code is:
// read from the 2D face texture; out is declared as "__global uint *"
uint4 color = read_imageui(mapFace, simpleSampler, coordinate);
out[tid] = color.x;
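In full, the kernel looks roughly like this (a sketch; the kernel name and the coordinate computation for a single face are my own filling-in):

__kernel void readFace(__read_only image2d_t mapFace,
                       sampler_t simpleSampler,
                       __global uint *out)
{
    int tid = get_global_id(0);
    int width = get_image_width(mapFace);
    int2 coordinate = (int2)(tid % width, tid / width);
    // read one texel and keep only the R channel
    uint4 color = read_imageui(mapFace, simpleSampler, coordinate);
    out[tid] = color.x;
}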
I find:
if R in the OpenGL texture is 0, color.x returns 0
if R in the OpenGL texture is 256, color.x also returns 0
if R in the OpenGL texture is 127, color.x returns 1056898815
if R in the OpenGL texture is 1, color.x returns 998277249
if R in the OpenGL texture is 255, color.x returns 1065353216
It seems like something is wrong with the format, but I don't know where.
Thanks for your attention and help.