```c
unsigned int iformat = format == 1 ? GL_ALPHA : format == 3 ? GL_RGB : GL_RGBA;
unsigned int xformat = format == 1 ? GL_RED   : format == 3 ? GL_RGB : GL_RGBA;
// GL_ALPHA fails as iformat in glTexImage2D(). Yet, it's listed as a
// cromulent format in the spec.
// iformat = GL_RED;
glTexImage2D(GL_TEXTURE_2D, 0, iformat, w, h, 0, xformat, GL_UNSIGNED_BYTE, NULL);
```
This code fails for textures with 1 byte per pixel. GL_ALPHA as the internal format and GL_RED as the external format should be supported according to the OpenGL 4.2 reference: http://docs.gl/gl4/glTexImage2D
If I set the internal format to GL_RED, the call is accepted, but the texture then samples differently from the specified GL_ALPHA behavior: GL_ALPHA yields (0, 0, 0, A), while GL_RED yields (R, 0, 0, 1).
(I can compensate for that in my shader, but … what's wrong with GL_ALPHA?)
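For reference, the shader-side compensation mentioned above can be sketched roughly as follows — a minimal fragment shader, assuming a hypothetical sampler `tex` bound to the 1-byte-per-pixel texture and interpolated coordinates `uv`:

```glsl
#version 420 core
uniform sampler2D tex;   // hypothetical name: the GL_RED 1-byte texture
in  vec2 uv;             // hypothetical interpolated texture coordinates
out vec4 fragColor;

void main() {
    // With GL_RED as the internal format, the byte lands in .r,
    // so read it from there instead of .a:
    float a = texture(tex, uv).r;
    fragColor = vec4(0.0, 0.0, 0.0, a);
}
```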
The renderer/JetPack version is the default as delivered from Amazon:
Renderer: NVIDIA Tegra X1 (nvgpu)/integrated
GL version: 4.2.0 NVIDIA 32.1.0
GLSL version: 4.20 NVIDIA via Cg compiler