Two-sided lighting on GeForce cards

On my GeForce GT 450 I get a massive framerate drop
when I activate two-sided lighting:

4 million indexed triangles
From 100 FPS to 4 FPS

When I do the same on an ATI card (HD 5750),
the framerate stays constant at 95 FPS.

It seems to me that NVIDIA does this on purpose,
because they want to sell their Quadro cards.

Or is this a driver bug?

I have a similar problem on a GTX 690.

If I enable two-sided lighting with glEnable(GL_VERTEX_PROGRAM_TWO_SIDE),
the framerate drops from 600 FPS to 90 FPS.

The fragment shader can then receive the proper per-side color value in gl_Color.
In the vertex shader, set the color on gl_FrontColor and gl_BackColor.
(This method is explained in the GLSL spec.)
It works correctly, but it is very slow.
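For reference, that spec method can be sketched like this in compatibility-profile GLSL (the lightDir uniform and the grayscale diffuse coloring are assumptions for illustration, not the poster's actual shaders):

```glsl
// Vertex shader (compatibility profile) — write both per-side colors.
uniform vec3 lightDir; // hypothetical light direction

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    float front = max( dot(n, normalize(lightDir)), 0.0);
    float back  = max(-dot(n, normalize(lightDir)), 0.0);
    gl_FrontColor = vec4(vec3(front), 1.0);
    gl_BackColor  = vec4(vec3(back),  1.0);
}
```

```glsl
// Fragment shader — with glEnable(GL_VERTEX_PROGRAM_TWO_SIDE),
// gl_Color already holds the color of whichever side is facing.
void main()
{
    gl_FragColor = gl_Color;
}
```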

So I have changed it a little.
In the vertex shader, set the front color on gl_FrontColor and the back color on gl_FrontSecondaryColor.
In the fragment shader, set gl_FragColor by testing gl_FrontFacing:
if (gl_FrontFacing) gl_FragColor = gl_Color; else gl_FragColor = gl_SecondaryColor;
It works well and is fast.
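A minimal sketch of that workaround (GL_VERTEX_PROGRAM_TWO_SIDE stays disabled; the side selection is done manually; lightDir is a hypothetical uniform):

```glsl
// Vertex shader — pack the front color into gl_FrontColor and
// the back color into gl_FrontSecondaryColor.
uniform vec3 lightDir; // hypothetical light direction

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    gl_FrontColor          = vec4(vec3(max( dot(n, normalize(lightDir)), 0.0)), 1.0);
    gl_FrontSecondaryColor = vec4(vec3(max(-dot(n, normalize(lightDir)), 0.0)), 1.0);
}
```

```glsl
// Fragment shader — pick the side with gl_FrontFacing.
void main()
{
    if (gl_FrontFacing)
        gl_FragColor = gl_Color;
    else
        gl_FragColor = gl_SecondaryColor;
}
```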

Another method is to use out (vertex shader) / in (fragment shader) variables instead of the secondary color.
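That variant could look like the following sketch (GLSL 3.30 style, so it also works in the core profile; the mvp, normalMat, and lightDir uniforms are assumptions):

```glsl
// Vertex shader — pass both per-side colors as plain varyings.
#version 330
uniform mat4 mvp;        // hypothetical model-view-projection matrix
uniform mat3 normalMat;  // hypothetical normal matrix
uniform vec3 lightDir;   // hypothetical light direction

in vec3 position;
in vec3 normal;
out vec3 frontColor;
out vec3 backColor;

void main()
{
    gl_Position = mvp * vec4(position, 1.0);
    vec3 n = normalize(normalMat * normal);
    frontColor = vec3(max( dot(n, normalize(lightDir)), 0.0));
    backColor  = vec3(max(-dot(n, normalize(lightDir)), 0.0));
}
```

```glsl
// Fragment shader — gl_FrontFacing is still available in the core profile.
#version 330
in vec3 frontColor;
in vec3 backColor;
out vec4 fragColor;

void main()
{
    fragColor = vec4(gl_FrontFacing ? frontColor : backColor, 1.0);
}
```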

Nevertheless, I can't find the problem with glEnable(GL_VERTEX_PROGRAM_TWO_SIDE).
I've tested with an ATI graphics card on my laptop and found no problem.

Can somebody from NVIDIA tell us what the reason for that bug is?

Laziness about implementing old deprecated stuff? Both are removed in the core profile. Just use gl_FrontFacing instead.

@degasus do you have a sample implementation of two-sided lighting, as a vertex/fragment shader?

heisenberg: just use any sample implementation and change the normal calculation. The usual way is:
float intensity = dot(normalize(normal), normalize(color_direction));
For two-sided lighting, you have to change it to:
float intensity = abs(dot(normalize(normal), normalize(color_direction)));
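In context, the change is just the abs() around the dot product; the rest of the lighting shader stays the same. A sketch (keeping the color_direction name from the snippet above; the grayscale output is an assumption for illustration):

```glsl
// Fragment shader sketch — two-sided diffuse via abs().
#version 330
uniform vec3 color_direction; // light direction, as named above
in vec3 normal;
out vec4 fragColor;

void main()
{
    // abs() lights back faces as if their normal were flipped,
    // so no gl_FrontFacing test is needed.
    float intensity = abs(dot(normalize(normal), normalize(color_direction)));
    fragColor = vec4(vec3(intensity), 1.0);
}
```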