Possible cuDNN bug?

Hi, I found something strange and interesting while using cuDNN in a deep learning project. I was implementing a deconvolution operation, so I used the cudnnConvolutionBackwardData function.
With the same input data, the output is slightly different on every run. The difference is tiny: for example, a previous result might be 0.66732 and the current result 0.66733.
I also ran the same experiment using plain CUDA and on the CPU; in both cases the results are identical on every run.

Maybe such a small difference is not very important on its own, but it propagates through many network layers, and the accumulated difference can become significant.
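For what it's worth, this is the expected behavior of some cuDNN convolution algorithms rather than a bug: certain backward algorithms accumulate partial sums with atomic additions, so the order of the additions varies between runs, and floating-point addition is not associative. A minimal sketch of that mechanism (using Python doubles; cuDNN typically runs in float32, where the mismatch shows up at fewer digits, much like 0.66732 vs. 0.66733):

```python
import random

# Sum the same values in two different orders. Floating-point addition
# is not associative, so the two totals can differ in the last digits --
# the same mechanism that makes atomic-add-based GPU reductions
# nondeterministic from run to run.
random.seed(0)
values = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

in_order = sum(values)

shuffled = values[:]
random.shuffle(shuffled)
reordered = sum(shuffled)

print(in_order, reordered)
print("bit-identical:", in_order == reordered)
```

If I remember the cuDNN documentation correctly, explicitly selecting a deterministic backward-data algorithm (e.g. CUDNN_CONVOLUTION_BWD_DATA_ALGO_1 instead of the nondeterministic ALGO_0) should make the results reproducible, usually at some cost in speed.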

Has anyone else noticed this? Have you evaluated how much it can affect your results?

We created a new “Deep Learning Training and Inference” section on Devtalk to improve the experience for deep learning, accelerated computing, and HPC users:

We are moving active deep learning threads to the new section.

URLs for topics will not change with the re-categorization, so your bookmarks and links will continue to work as before.


So, has anyone else run into the same issue?