Hi there, curious developer here.
I was reading an article on TechRadar about NVIDIA Grid. It's from 2013, but I imagine it's still relevant today.
In particular, I’m curious about this graph:
http://cdn1.mos.techradar.futurecdn.net//art/internet/Nvidia/Grid/nvidiagrid%20(6)-650-80.jpg
How exactly was lag being measured? I see the Grid solution had a latency of 161 ms, while the GeForce PC came in at just 65 ms. Are you measuring how long each frame takes end to end, from the input device, through the update/render pipeline, to the display?
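For reference, here's roughly what I have in mind, as a minimal C++ sketch. The stage functions and their timings are placeholders I made up; the point is just that the input timestamp is carried through the pipeline and the latency is read off at presentation:

```cpp
#include <chrono>
#include <iostream>
#include <thread>

using Clock = std::chrono::steady_clock;

// Hypothetical per-frame data: the timestamp of the input event
// that drove this frame.
struct Frame {
    Clock::time_point input_time;
};

// Stand-ins for the real pipeline stages; they just sleep to
// simulate work, so the number printed below is illustrative only.
void update(Frame&)  { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }
void render(Frame&)  { std::this_thread::sleep_for(std::chrono::milliseconds(10)); }
void present(Frame&) { std::this_thread::sleep_for(std::chrono::milliseconds(16)); }

int main() {
    // 1. Stamp the moment the input device event arrives.
    Frame frame{Clock::now()};

    // 2. Carry that stamp through update, render, and present.
    update(frame);
    render(frame);
    present(frame);

    // 3. Input-to-display latency is the elapsed time once the
    //    frame is on screen.
    auto latency = std::chrono::duration_cast<std::chrono::milliseconds>(
        Clock::now() - frame.input_time);
    std::cout << "input-to-display latency: " << latency.count() << " ms\n";
}
```

Is that the kind of measurement behind the chart, or was it done externally, e.g. with a high-speed camera pointed at the controller and the screen?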
Cheers