Hi
I am trying to set up CL/GL interoperability using clCreateFromGLTexture2D(), but I am wondering what the correct GL internal format is for each of the CL types:
char
uchar
short
ushort
int
uint
float
so that they actually end up as what we expect (e.g. uint = 32-bit unsigned int, float = 32-bit float), while the textures also remain usable
in GL, ideally even with the fixed pipeline where possible (at least for char, uchar and float).
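For concreteness, this is the kind of per-type mapping I have in mind; GetGLTexFormat() below is just an illustrative helper of my own, and the sized single-channel formats are only my guess at what CL might accept, which is exactly what I am unsure about:

// purely illustrative sketch: map a host element type to the
// (internalformat, format, type) triple I would pass to glTexImage2D()
struct GLTexFormat { GLenum internalFormat; GLenum format; GLenum type; };

template <typename T> GLTexFormat GetGLTexFormat();
template <> GLTexFormat GetGLTexFormat<char>()           { GLTexFormat f = { GL_R8I,   GL_RED_INTEGER, GL_BYTE           }; return f; }
template <> GLTexFormat GetGLTexFormat<unsigned char>()  { GLTexFormat f = { GL_R8UI,  GL_RED_INTEGER, GL_UNSIGNED_BYTE  }; return f; }
template <> GLTexFormat GetGLTexFormat<short>()          { GLTexFormat f = { GL_R16I,  GL_RED_INTEGER, GL_SHORT          }; return f; }
template <> GLTexFormat GetGLTexFormat<unsigned short>() { GLTexFormat f = { GL_R16UI, GL_RED_INTEGER, GL_UNSIGNED_SHORT }; return f; }
template <> GLTexFormat GetGLTexFormat<int>()            { GLTexFormat f = { GL_R32I,  GL_RED_INTEGER, GL_INT            }; return f; }
template <> GLTexFormat GetGLTexFormat<unsigned int>()   { GLTexFormat f = { GL_R32UI, GL_RED_INTEGER, GL_UNSIGNED_INT   }; return f; }
template <> GLTexFormat GetGLTexFormat<float>()          { GLTexFormat f = { GL_R32F,  GL_RED,         GL_FLOAT          }; return f; }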
I am asking this because I am getting error -30 (CL_INVALID_VALUE) when I try to invoke something like
clCreateFromGLTexture2D(*m_clContext, CL_MEM_READ_WRITE, GL_TEXTURE_2D, 0, m_glItem, &clErr);
while it works properly with clCreateFromGLBuffer(), so I guess there must be either some error in the internal format I am using, or a bug in the NVIDIA drivers.
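For comparison, the buffer path that does work for me looks roughly like this (a sketch from memory; the variable names are approximate):

glGenBuffers(1, &m_glBufferItem);
glBindBuffer(GL_ARRAY_BUFFER, m_glBufferItem);
glBufferData(GL_ARRAY_BUFFER, m_uWidth * m_uHeight * sizeof(Type), NULL, GL_DYNAMIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);
// no CL_INVALID_VALUE here
cl_mem clBuffer = clCreateFromGLBuffer(*m_clContext, CL_MEM_READ_WRITE, m_glBufferItem, &clErr);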
thanks!
Edit: here is the code I use to create the GL storage:
glGenTextures(1, &m_glItem);
glBindTexture(GL_TEXTURE_2D, m_glItem);

// save and adjust the client-side pixel-store state for the upload
glPushClientAttrib(GL_CLIENT_PIXEL_STORE_BIT);
glPixelStorei(GL_UNPACK_ROW_LENGTH, m_uWidth);
//glPixelStorei(GL_UNPACK_ALIGNMENT, sizeof(Type));
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// allocate the texture storage without uploading any data
glTexImage2D(GL_TEXTURE_2D,
             0,                 // mipmap level
             GL_ALPHA,          // internal format (GetGLInternalFormat())
             m_uWidth,
             m_uHeight,
             0,                 // border
             GL_ALPHA,          // format (GetGLFormat())
             GL_UNSIGNED_BYTE,  // type (GetGLType())
             NULL);             // no initial data
checkforGLerrors();

glPopClientAttrib();
glBindTexture(GL_TEXTURE_2D, 0);
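Once the image object exists, the usage pattern I understand from the cl_khr_gl_sharing spec is the usual acquire/release sequence around the kernel launches; this is just a sketch, with m_clQueue and m_clImage standing in for my actual command queue and the cl_mem returned by clCreateFromGLTexture2D():

glFinish();  // make sure GL is done with the texture before CL touches it
clErr = clEnqueueAcquireGLObjects(m_clQueue, 1, &m_clImage, 0, NULL, NULL);
// ... enqueue kernels that read/write m_clImage here ...
clErr = clEnqueueReleaseGLObjects(m_clQueue, 1, &m_clImage, 0, NULL, NULL);
clFinish(m_clQueue);  // make sure CL is done before GL uses the texture again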