Rendering a full-screen quad at 640x480 shades roughly 300k pixels (640 × 480 = 307,200).
Using a shader as simple as: out_color = texture(sampler, coord);
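For context, the full fragment shader is essentially the following (a minimal sketch; the variable and sampler names here are illustrative, only the texture() call is from my actual code):

```glsl
#version 400 core

in vec2 coord;                 // interpolated texture coordinate from the vertex stage
out vec4 out_color;            // final fragment color

uniform sampler2D tex_sampler; // the 640x480 RGBA8 texture, nearest min/mag filtering

void main()
{
    // One texture fetch per shaded fragment, no other work.
    out_color = texture(tex_sampler, coord);
}
```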
The frame profiler's counters box for that draw call reports around 1.2 million for “shd_tex_requests” and around 4.9 million bytes for “shd_tex_read_bytes”.
The input texture is 640x480 RGBA with 8 bits per channel (4 bytes per texel).
The sampler uses nearest filtering for both min and mag, and multisampling is disabled.
For this setup I would expect shd_tex_requests to be about 300k and shd_tex_read_bytes about 1.2M (307,200 texels × 4 bytes each). Instead, every shader I try reports four times the value I expect. I might have misunderstood the meaning of these counters.
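To make the mismatch concrete, this is the arithmetic I am using (a quick sketch; the 4x factor below is just the observed discrepancy written out, not an explanation of it):

```python
# Expected counter values: one nearest-filtered fetch per shaded fragment.
width, height = 640, 480
pixels = width * height                        # fragments shaded by the full-screen quad

bytes_per_texel = 4                            # RGBA, 8 bits per channel

expected_requests = pixels                     # one texture request per fragment
expected_read_bytes = pixels * bytes_per_texel

print(expected_requests)                       # 307200  (~300k)
print(expected_read_bytes)                     # 1228800 (~1.2M)

# What the frame profiler actually reports is four times that:
observed_requests = 4 * expected_requests      # ~1.2 million shd_tex_requests
observed_read_bytes = 4 * expected_read_bytes  # ~4.9 million shd_tex_read_bytes
```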
I’m running Windows 10, VS2015, Nsight 126.96.36.19994 and NVIDIA driver 364.51. The project uses OpenGL 4.0.
Can anyone shed some light on this?