Two running OpenGL applications cause a strange performance regression in 3xx drivers

In my kiosk Linux distribution, the startup script launches two fairly simple OpenGL applications (game demos) with both windows minimized, plus a small GUI program that switches between the demos (it minimizes the current game and maximizes the requested one to fullscreen) when the user presses the corresponding button; a simplified sketch of that switch is below. My previous release is based on the 270.41.06 NVIDIA driver and works fine, but current driver versions (up to 313.18) show a strange performance regression: the application started first works without any problems, but the second one becomes extremely slow when activated.
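Conceptually, the switcher does little more than the following (a simplified sketch, not the real code; current_win and next_win are placeholder window IDs obtained elsewhere):

/* Iconify the active demo's window and restore/raise the other one. */
#include <X11/Xlib.h>
#include <X11/Xutil.h>

static void switch_demo(Display *dpy, Window current_win, Window next_win)
{
    int screen = DefaultScreen(dpy);

    XIconifyWindow(dpy, current_win, screen);  /* minimize the demo that was visible */
    XMapRaised(dpy, next_win);                 /* bring the other demo back up */
    XFlush(dpy);
}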
And of course there are no CPU-hungry processes running at that moment. Normal performance of both programs can be restored by switching to a text console and back to X. Either application also works fine if the other one is closed. Each demo loads approximately 200 MB of 2D textures (raw and S3TC compressed), and I have 512 MB of RAM on the video card. I can't say for sure, but it looks like the problem is in how texture data is cached/evicted when not everything fits in video memory. The previous drivers handled this job much better…
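If it helps, here is a rough sketch of how I could instrument each demo to watch free video memory and eviction counts while switching between them (this assumes the GL_NVX_gpu_memory_info extension is exposed by the driver and that a GL context is current; log_vram_state is just an illustrative helper name):

/* Query current free VRAM and eviction statistics via GL_NVX_gpu_memory_info. */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

#ifndef GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#define GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX           0x904A
#define GL_GPU_MEMORY_INFO_EVICTED_MEMORY_NVX           0x904B
#endif

static void log_vram_state(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (!ext || !strstr(ext, "GL_NVX_gpu_memory_info"))
        return;  /* extension not available, nothing to report */

    GLint avail_kb = 0, evictions = 0, evicted_kb = 0;
    glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &avail_kb);
    glGetIntegerv(GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX, &evictions);
    glGetIntegerv(GL_GPU_MEMORY_INFO_EVICTED_MEMORY_NVX, &evicted_kb);

    printf("VRAM free: %d KB, evictions: %d, evicted: %d KB\n",
           avail_kb, evictions, evicted_kb);
}

Calling this once per second from each demo's frame loop should show whether the slow application's textures are being evicted after the switch.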

Can you please attach an nvidia-bug-report.log.gz (rename it to .jpg to get the forum to let you attach it)? Also, which applications, exactly, are you using? Finally, which desktop environment are you using for this?

My system is based on Debian Wheezy (i386); the DE is GNOME 3.4.2 (GNOME Classic mode). The applications are fullscreen SDL 2D OpenGL games (not publicly available) which don't use shaders or extensions. Bug report attached.

I don’t see anything obvious from your bug report. To debug this, we’re going to need to be able to reproduce the problem here. Can you send us a copy of the applications?