Has anyone developed, or thought about, an efficient way to compute the mean of a pixel through time (i.e., averaging each pixel across a sequence of frames) on the GPU? On a CPU it would generally look like…
for each frame
    for each column
        for each row
            sum[row][column] += pixel[frame][row][column]

…and then each accumulated sum gets divided by the frame count to produce the mean.
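One straightforward GPU mapping is to flip the loop nest: launch one thread per output pixel and have each thread walk the frame stack, accumulating its own sum. Here is a minimal CUDA sketch of that idea; the kernel name temporalMean and the memory layout (frameCount contiguous width x height float planes, frame-major) are my assumptions, not anything standard:

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // One thread per output pixel: each thread sums its pixel across all
    // frames, then writes the mean. "frames" is assumed to hold frameCount
    // contiguous planes of width*height floats.
    __global__ void temporalMean(const float* frames, float* mean,
                                 int width, int height, int frameCount)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;

        int pixel = y * width + x;
        int frameStride = width * height;

        float sum = 0.0f;
        for (int f = 0; f < frameCount; ++f)
            sum += frames[f * frameStride + pixel];

        mean[pixel] = sum / frameCount;
    }

    int main()
    {
        const int width = 640, height = 480, frameCount = 100;
        const size_t planeSize = size_t(width) * height;

        // Fill host frames with a constant so the expected mean is obvious.
        std::vector<float> hostFrames(planeSize * frameCount, 1.0f);

        float *dFrames, *dMean;
        cudaMalloc(&dFrames, hostFrames.size() * sizeof(float));
        cudaMalloc(&dMean, planeSize * sizeof(float));
        cudaMemcpy(dFrames, hostFrames.data(),
                   hostFrames.size() * sizeof(float), cudaMemcpyHostToDevice);

        dim3 block(16, 16);
        dim3 grid((width + block.x - 1) / block.x,
                  (height + block.y - 1) / block.y);
        temporalMean<<<grid, block>>>(dFrames, dMean, width, height, frameCount);

        std::vector<float> hostMean(planeSize);
        cudaMemcpy(hostMean.data(), dMean, planeSize * sizeof(float),
                   cudaMemcpyDeviceToHost);
        printf("mean at (0,0) = %f\n", hostMean[0]);  // expect 1.0

        cudaFree(dFrames);
        cudaFree(dMean);
        return 0;
    }

With this layout the loads are coalesced, since adjacent threads in a warp read adjacent pixels within each frame. And if the frames arrive one at a time (e.g., streaming video), you don't need the whole stack in GPU memory at all: keep a running-sum buffer, add each incoming frame to it with a trivial kernel, and divide by the count only when you actually need the mean.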