Hi, I found something strange and interesting while using cuDNN for a deep learning project. I was trying to do a deconvolution operation, so I used the cudnnConvolutionBackwardData function.
With the same input data, the output is slightly different on every run, just a little different: for example, a previous run might give 0.66732 and the current run gives 0.66733.
Besides, I tried the same experiment with plain CUDA and on the CPU; in both cases the results are identical on every run.
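I suspect the cause is the backward-data algorithm that cuDNN selects: as far as I understand, CUDNN_CONVOLUTION_BWD_DATA_ALGO_0 uses atomic adds, so the floating-point summation order (and therefore the rounding) can change between runs, while CUDNN_CONVOLUTION_BWD_DATA_ALGO_1 is documented as deterministic. Below is a minimal sketch of pinning the call to ALGO_1; it assumes cuDNN 6+, and the shapes and values are made up just for illustration:

```cpp
// Minimal sketch: run cudnnConvolutionBackwardData with the algorithm pinned
// to CUDNN_CONVOLUTION_BWD_DATA_ALGO_1 (deterministic) instead of letting
// cuDNN pick one that may use atomic adds. Shapes are arbitrary examples.
#include <cudnn.h>
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    cudnnHandle_t handle;
    cudnnCreate(&handle);

    // Deconvolution of dy (1x8x16x16) into dx (1x4x16x16) with 8x4x3x3
    // filters, pad 1, stride 1.
    cudnnTensorDescriptor_t dyDesc, dxDesc;
    cudnnFilterDescriptor_t wDesc;
    cudnnConvolutionDescriptor_t convDesc;
    cudnnCreateTensorDescriptor(&dyDesc);
    cudnnCreateTensorDescriptor(&dxDesc);
    cudnnCreateFilterDescriptor(&wDesc);
    cudnnCreateConvolutionDescriptor(&convDesc);
    cudnnSetTensor4dDescriptor(dyDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               1, 8, 16, 16);
    cudnnSetTensor4dDescriptor(dxDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               1, 4, 16, 16);
    cudnnSetFilter4dDescriptor(wDesc, CUDNN_DATA_FLOAT, CUDNN_TENSOR_NCHW,
                               8, 4, 3, 3);
    cudnnSetConvolution2dDescriptor(convDesc, 1, 1, 1, 1, 1, 1,
                                    CUDNN_CROSS_CORRELATION, CUDNN_DATA_FLOAT);

    float *d_w, *d_dy, *d_dx;
    cudaMalloc(&d_w,  8 * 4 * 3 * 3 * sizeof(float));
    cudaMalloc(&d_dy, 1 * 8 * 16 * 16 * sizeof(float));
    cudaMalloc(&d_dx, 1 * 4 * 16 * 16 * sizeof(float));
    cudaMemset(d_w,  0, 8 * 4 * 3 * 3 * sizeof(float));
    cudaMemset(d_dy, 0, 1 * 8 * 16 * 16 * sizeof(float));

    // Pin the algorithm and allocate the workspace it needs.
    cudnnConvolutionBwdDataAlgo_t algo = CUDNN_CONVOLUTION_BWD_DATA_ALGO_1;
    size_t wsSize = 0;
    cudnnGetConvolutionBackwardDataWorkspaceSize(handle, wDesc, dyDesc,
                                                 convDesc, dxDesc, algo,
                                                 &wsSize);
    void *d_ws = nullptr;
    if (wsSize > 0) cudaMalloc(&d_ws, wsSize);

    const float alpha = 1.0f, beta = 0.0f;
    cudnnStatus_t st = cudnnConvolutionBackwardData(
        handle, &alpha, wDesc, d_w, dyDesc, d_dy, convDesc, algo,
        d_ws, wsSize, &beta, dxDesc, d_dx);
    printf("status: %s\n", cudnnGetErrorString(st));

    // Cleanup omitted for brevity.
    return 0;
}
```

If I read the cuDNN 6 docs right, you can also call cudnnFindConvolutionBackwardDataAlgorithm and check the determinism field of the returned cudnnConvolutionBwdDataAlgoPerf_t entries to pick the fastest algorithm that is still deterministic.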
Maybe the little difference is not so important by itself, but it passes through many network layers and grows along the way, so after enough layers the accumulated difference becomes large.
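Just as a toy illustration (not my real network): if each layer amplifies a relative perturbation by some factor g, an initial difference of about 1e-5 grows roughly like g^n after n layers:

```cpp
// Toy illustration: how a tiny per-run difference can grow through layers.
// The amplification factor g = 1.5 is a made-up number, not measured.
#include <cstdio>

int main() {
    double eps = 1e-5;  // e.g. 0.66733 vs 0.66732
    double g = 1.5;     // hypothetical per-layer amplification factor
    for (int layer = 1; layer <= 30; ++layer) {
        eps *= g;
        if (layer % 10 == 0)
            printf("after %2d layers: ~%g\n", layer, eps);
    }
    return 0;
}
```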
Has anyone else run into this? Have you evaluated how much it can affect your results?