Ubuntu 10.04, GTX 275.
I’m trying to solve the standard Ax=b linear algebra problem using the Jacobi method.
When I run a.out, the first and last values in the solution array come out exactly correct, but the middle values are off by about 0.001%.
Is this just unavoidable single-precision floating-point error in CUDA?
Any insight as to what I could be doing wrong?
I would post my code, but this is a project for my professor, and I’m not sure he would be happy with me posting it even though it’s trivial.
edit: I’ve fixed the issue where the answer was fluctuating between runs.