With the NVIDIA-Linux-x86_64-285.05.09 driver, the maximum point size could be very large, as large in pixels as the screen. I have shaders that rely on this.
Now, with the 304.51 driver,
glGetFloatv(GL_POINT_SIZE_MAX, &max_point_size );
sets max_point_size to:
Is this intended, or is it a regression?
I can confirm that the same behavior is present in 304.60 (the latest driver).
Have you tried using higher values to see whether that is the actual limit, or whether something just went wrong when reading the limit?
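One way to test this is to query the related point-size ranges and then read back what `glPointSize` actually stored after requesting a value above the reported maximum. The sketch below assumes a live compatibility-profile context (`GL_POINT_SIZE_MAX` is not available in core profiles), and the enums beyond the basic ones may require `glext.h` depending on your headers:

```c
/* Sketch: cross-check the reported limits against what glPointSize accepts.
 * Assumes a current OpenGL compatibility-profile context is already bound. */
#include <stdio.h>
#include <GL/gl.h>

void check_point_size_limits(void)
{
    GLfloat max_point_size = 0.0f;
    GLfloat size_range[2]    = {0.0f, 0.0f};
    GLfloat aliased_range[2] = {0.0f, 0.0f};

    glGetFloatv(GL_POINT_SIZE_MAX, &max_point_size);
    /* GL_POINT_SIZE_RANGE applies to antialiased points;
     * GL_ALIASED_POINT_SIZE_RANGE applies to non-antialiased points
     * and to gl_PointSize written from a shader. */
    glGetFloatv(GL_POINT_SIZE_RANGE, size_range);
    glGetFloatv(GL_ALIASED_POINT_SIZE_RANGE, aliased_range);

    printf("GL_POINT_SIZE_MAX:           %f\n", max_point_size);
    printf("GL_POINT_SIZE_RANGE:         %f .. %f\n", size_range[0], size_range[1]);
    printf("GL_ALIASED_POINT_SIZE_RANGE: %f .. %f\n", aliased_range[0], aliased_range[1]);

    /* Request a size above the reported maximum and read it back.
     * The stored value is not clamped; clamping to the supported range
     * happens at rasterization time, so a large readback here means
     * only the rendered size is limited. */
    glPointSize(2048.0f);
    GLfloat current = 0.0f;
    glGetFloatv(GL_POINT_SIZE, &current);
    printf("glPointSize(2048) -> GL_POINT_SIZE = %f\n", current);
}
```

If the aliased range's upper bound matches the small `GL_POINT_SIZE_MAX` you are seeing, the driver really does impose that limit; if only the query result changed, it may just be a reporting issue.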
I do not use Linux, so I cannot really help you any further; I hope someone else comes along.
Maybe you should try posting in this forum section instead?