GPU Shader (GLSL) Time Measurement Functions: Seeking a function that works on MS Vista

Hi, pardon me if my post is in the wrong place, but I guess it is related to CUDA discussion too. I am migrating from GLSL programming to CUDA programming. I have a GeForce 8800GS and an AMD dual-core CPU.

I wrote a GLSL shader function that works well, and I wish to measure the time it takes so I can judge how fast my 8800GS is.

For this I initially used GetTickCount(), but was advised to look for something else because of its limited resolution. A random search led me to timeGetTime(); unfortunately, to my dismay, I discovered it to be unavailable on the MS Vista operating system.

Then I came across this article on timing C++ code, [url=“http://cplus.about.com/od/howtodothingsi2/a/timing.htm”]http://cplus.about.com/od/howtodothingsi2/a/timing.htm[/url], but again I have doubts about whether it works on Vista.

Q1. Has anybody used QueryPerformanceFrequency(), as mentioned in the link above, on Vista?

Q2. Do you recommend any other function for GPU time measurement? I need a resolution of at least a millisecond.

My motivation for asking comes from reading about time measurement in the CUDA programming guide, where __syncthreads() is used for threads sharing memory and special timing constructs are provided for the same purpose.
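For reference, my understanding from the guide is that host-side timing in CUDA would look roughly like this, using events (myKernel here is just a hypothetical stand-in for whatever kernel is being measured):

```cuda
#include <cstdio>

// Hypothetical kernel; stands in for the code being timed.
__global__ void myKernel(float *data) { /* ... */ }

int main()
{
    float *d_data;
    cudaMalloc(&d_data, 1024 * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    myKernel<<<64, 256>>>(d_data);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);  // wait until the kernel has finished

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);  // elapsed time in milliseconds
    printf("kernel time: %f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_data);
    return 0;
}
```

Is this event-based approach the recommended way once I move to CUDA, rather than a CPU-side timer?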

Q3. GLSL is multi-threaded too, but each thread is independent of the others, which led me not to worry about synchronization. Is that OK for the GLSL case?

Please guide me. Thank you all.