Most Efficient Temporal Mean
Looking for a method of getting the mean of a pixel through time

Has anyone developed, or thought about, an efficient way to compute the mean of each pixel through time on the GPU? On a CPU it would generally look like…

// avgFrame is zeroed beforehand
for each frame
    for each column
        for each row
            avgFrame[row,col] += frames[row,col,frame] / numFrames

That ordering steps through memory more efficiently than having the frame be the inner loop. How would this translate to the GPU?
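
For reference, here is a rough sketch of how I imagine this could map to CUDA: one thread per output pixel, with each thread walking the time axis. The kernel name, launch configuration, and the frame-major float layout are just assumptions on my part, not from any existing code.

// Hypothetical kernel: one thread per output pixel, looping over frames.
// Assumes frames is packed frame-major: frames[f * width * height + row * width + col].
__global__ void temporalMean(const float *frames, float *avgFrame,
                             int width, int height, int numFrames)
{
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    if (col >= width || row >= height) return;

    int pixel = row * width + col;
    float sum = 0.0f;
    for (int f = 0; f < numFrames; ++f)
        sum += frames[f * width * height + pixel];

    avgFrame[pixel] = sum / numFrames;
}

// Example launch (block size chosen arbitrarily):
dim3 block(16, 16);
dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
temporalMean<<<grid, block>>>(d_frames, d_avgFrame, width, height, numFrames);

With a frame-major layout, adjacent threads in a warp read adjacent pixels of the same frame, so the global loads should coalesce, but I'm not sure whether looping over frames per thread like this or some reduction-style approach would be faster in practice.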