Possible shader miscompilation (findLSB builtin and uint64_t)

I have the following simple GLSL fragment shader:

#version 460
#extension GL_EXT_shader_explicit_arithmetic_types_int64 : enable

layout(location = 0) out vec4 out_color;
layout(location = 0) flat in uint64_t in_a;

void main() {
    if (findLSB(in_a) == 0u)
        out_color = vec4(0.5);
    else
        out_color = vec4(1);
}

The value of in_a is set to 1 in the vertex shader, so findLSB(in_a) should be 0 and out_color should be grey, but it's white instead.

I’ve replaced the if() with if(in_a == 1u), and in_a does appear to be 1 as expected.

I’ve also replaced the if() with if(findLSB(uint(in_a)) == 0u), and that produces grey, again as expected.

So there must be something wrong with findLSB applied to 64-bit integers.
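In the meantime, a possible workaround is to split the 64-bit value into two 32-bit halves and use the 32-bit findLSB on each. This is only a sketch; findLSB64 is my own helper name, and it relies on findLSB returning -1 for a zero input:

```glsl
// Hypothetical helper: findLSB for uint64_t built from 32-bit findLSB.
int findLSB64(uint64_t v) {
    uint lo = uint(v);        // low 32 bits (truncating conversion)
    uint hi = uint(v >> 32);  // high 32 bits

    int lsbLo = findLSB(lo);
    if (lsbLo != -1)          // bit found in the low half
        return lsbLo;

    int lsbHi = findLSB(hi);  // otherwise search the high half
    return (lsbHi != -1) ? lsbHi + 32 : -1;  // -1 if v == 0
}
```

With in_a == 1, findLSB64(in_a) returns 0, matching the expected grey output.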

These are the relevant parts of the generated SPIR-V:

%1 = OpExtInstImport "GLSL.std.450"
%ulong = OpTypeInt 64 0                  ; uint64_t
%long = OpTypeInt 64 1                   ; int64_t
%9 = OpLoad %ulong %in_a
%11 = OpExtInst %long %1 FindILsb %9     ; 64-bit operand and result

I’m not a SPIR-V expert, but the generated code appears to match what the GLSL asks for.

Hmm, maybe it’s not an issue with NVIDIA’s compiler after all. According to the SPIR-V Extended Instructions for GLSL specification, FindILsb accepts only 32-bit values. So either GLSL.std.450 hasn’t been updated for 64-bit types, or glslang is emitting invalid SPIR-V.
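For comparison, in the working findLSB(uint(in_a)) variant glslang presumably truncates the value to 32 bits before the extended instruction, so both the operand and result of FindILsb are 32-bit types. A sketch of what that would look like (not actual compiler output):

```spirv
%9  = OpLoad %ulong %in_a
%10 = OpUConvert %uint %9            ; explicit 64-bit -> 32-bit truncation
%11 = OpExtInst %int %1 FindILsb %10 ; 32-bit operand and result, as the spec requires
```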