GLSL shader not behaving the same across graphics cards

I have the following GLSL shader that works on a Mac with an NVIDIA GeForce GT 330M, on a different Mac with an ATI Radeon HD 5750, and in an Ubuntu VM running inside that second Mac, but not in an Ubuntu VM on a Windows machine with a GeForce GTX 780 (all drivers are up to date). The shader is pretty basic, so I'm hoping someone can spot what might be wrong. Here is the vertex shader (I'm using the cocos2d-x game engine, which is where all of the CC_ variables come from; see the note after the shaders for roughly what the engine injects):

attribute vec4 a_position; // standard cocos2d-x position attribute
uniform float u_size;      // point size, set from the C++ side

varying vec4 v_fragmentColor;

void main() {
    gl_Position = CC_PMatrix * CC_MVMatrix * a_position;
    gl_PointSize = CC_MVMatrix[0][0] * u_size * 1.5f;
    v_fragmentColor = vec4(1, 1, 1, 1);
}

And the fragment shader:

varying vec4 v_fragmentColor;

void main() {
    gl_FragColor = texture2D(CC_Texture0, gl_PointCoord) * v_fragmentColor; // Can't see anything
    // gl_FragColor = texture2D(CC_Texture0, gl_PointCoord); // Produces the texture as expected, no problems!
    // gl_FragColor = v_fragmentColor; // Produces a white box as expected, no problems!
}
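Note that the CC_ variables aren't declared anywhere in my shader source; cocos2d-x prepends its own uniform declarations to every shader string before compiling it. From memory it injects roughly the following (the variable name below is just illustrative, and the exact list differs between engine versions):

// Approximation of the uniform block cocos2d-x prepends to each shader source.
static const char* kEngineUniformHeader =
    "uniform mat4 CC_PMatrix;\n"
    "uniform mat4 CC_MVMatrix;\n"
    "uniform mat4 CC_MVPMatrix;\n"
    "uniform vec4 CC_Time;\n"
    "uniform vec4 CC_SinTime;\n"
    "uniform vec4 CC_CosTime;\n"
    "uniform vec4 CC_Random01;\n"
    "uniform sampler2D CC_Texture0;\n";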

As the commented-out lines show, I'm getting very strange behavior: both the sampler CC_Texture0 and the varying v_fragmentColor appear to work correctly on their own, but multiplying them together produces nothing. I'm reasonably confident everything else is set up correctly because the same code works on the other systems, so it seems to be related to the graphics card, or to some undefined behavior I'm not aware of. Also, I'm using #version 120, which was needed for gl_PointCoord. Thanks for any help!
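In case it's relevant, the C++ side is hooked up roughly like this (a simplified sketch against the cocos2d-x 3.x API; the function and variable names and the u_size value are placeholders rather than my exact code):

#include "cocos2d.h"

// Compile the two shaders above, bind them to the node that draws the points,
// and pass the point size through the u_size uniform.
void setupPointShader(cocos2d::Node* node,
                      const char* vertexShaderSource,
                      const char* fragmentShaderSource)
{
    auto program = cocos2d::GLProgram::createWithByteArrays(vertexShaderSource,
                                                            fragmentShaderSource);
    auto state = cocos2d::GLProgramState::getOrCreateWithGLProgram(program);
    state->setUniformFloat("u_size", 32.0f); // example size in pixels
    node->setGLProgramState(state);

    // On desktop GL (compatibility contexts) point sprites and shader-written
    // point sizes have to be enabled explicitly; GLES 2.0 has them on by default.
#if !defined(GL_ES_VERSION_2_0)
    glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);
    glEnable(GL_POINT_SPRITE);
#endif
}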