I am having a very strange problem with an OpenGL application I am developing. The scene is simple: five 3D objects, a terrain of 16,000 vertices, water (which takes two extra draw passes for reflection and refraction), one instanced 3D object with 1,000 instances, and a few post-processing effects.

When I run the application, either under the debugger or as a standalone executable, I get an average of 30 fps (ranging from 20 to 50 fps). To find out where my bottlenecks are, I installed NSight and ran the application under it, and there I get an average frame rate of 200 fps (180 to 240 fps). I have absolutely no idea why this happens. Can anyone help me?