Different results with Nsight


When I run my application with Visual Studio Nsight enabled vs. disabled, the rendered results differ:

I have verified this on several different computers, and now I don't know where to look for the error.

The odd part is that the result produced with Nsight turned on is the correct one. My only wild guess is that the framebuffers are being initialized differently, but I have tried everything that came to mind without finding the solution.

What could Nsight be doing differently that makes my app produce those results?

Some details about the app:
OpenGL context: 4.4
All shaders use "#version 330 core"
Tested on Windows 8.1 with a GTX 770 and a GTX 660
Rendering to an RGBA8 render texture
Nsight 4.0

Thanks in advance,

Have you gotten any info log messages from the GLSL compiler when compiling or linking your shaders?
If you suspect uninitialized data, which is likely in this release-vs.-debug kind of case, have you also checked your GLSL shader code for any local variable that is used before it is initialized?
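For reference, querying the info log looks like this; a minimal sketch using the standard GL entry points (a valid GL context and a function loader such as GLAD are assumed, and error handling is omitted):

```c
#include <stdio.h>
#include <glad/glad.h>  /* or any loader that provides GL 3+ entry points */

/* Print the info log for a shader object. The same pattern works for
 * program objects with glGetProgramiv/glGetProgramInfoLog and
 * GL_LINK_STATUS instead of GL_COMPILE_STATUS. */
void print_shader_log(GLuint shader)
{
    GLint status = 0, len = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    if (len > 1) {  /* drivers often report 1 for an empty log */
        char log[4096];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "shader %u (%s):\n%s\n",
                shader, status ? "compiled" : "FAILED", log);
    }
}
```

Note that some drivers emit warnings in the info log even when compilation succeeds, so it is worth printing the log unconditionally rather than only on failure.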

Hello and thanks!

Your reply got me thinking, and I double-checked some programs and finally found the error, although now I am even more curious how running with Nsight could produce the correct results (=

All programs compile and link cleanly, and all variables are correctly initialized. The error was in a shader function that calculated the (uv)w coordinate for a 3D lookup texture. I guess it went out of range, and the Nsight debug run clamped or remapped it to a valid value.