Most Efficient Temporal Mean

Looking for a method of getting the mean of a pixel through time.

Has anyone developed, or thought about, an efficient way to compute the mean of a pixel through time on the GPU? On a CPU it would generally look like…

for each frame
    for each column
        for each row
            add the pixel value to that pixel's running sum
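…and then each running sum is divided by the number of frames at the end.

A minimal sketch of one way this could look in CUDA, assuming 8-bit grayscale frames streamed to the device one at a time; the kernel and buffer names (accumulate_frame, finalize_mean, d_accum, etc.) are illustrative, not an existing API. Each thread owns one pixel, adds the incoming frame's value to a persistent float accumulator, and a second pass divides by the frame count:

    // Sketch: per-pixel temporal mean on the GPU (CUDA).
    // Assumes 8-bit grayscale frames and a float accumulator buffer
    // (d_accum) that persists across frames. Names are illustrative.
    #include <cuda_runtime.h>

    __global__ void accumulate_frame(const unsigned char* frame,
                                     float* accum,
                                     int numPixels)
    {
        int idx = blockIdx.x * blockDim.x + threadIdx.x;
        if (idx < numPixels)
            accum[idx] += frame[idx];            // running sum per pixel
    }

    __global__ void finalize_mean(const float* accum,
                                  float* mean,
                                  int numPixels,
                                  int numFrames)
    {
        int idx = blockIdx.x * blockDim.x + threadIdx.x;
        if (idx < numPixels)
            mean[idx] = accum[idx] / numFrames;  // temporal mean per pixel
    }

    // Host side, per frame:
    //   accumulate_frame<<<(numPixels + 255) / 256, 256>>>(d_frame, d_accum, numPixels);
    // After the last frame:
    //   finalize_mean<<<(numPixels + 255) / 256, 256>>>(d_accum, d_mean, numPixels, numFrames);

Because each pixel is independent, no reduction or shared memory is needed; the accumulation is embarrassingly parallel and typically memory-bandwidth bound, so the main cost is simply moving each frame into device memory once.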