I have been using Fortran compiler version 10.6 to run my linear solver on Tesla GPUs. Recently I installed the trial version of the latest compiler, 13.2, and now the very same code gives "not enough memory" errors on problems that ran fine with 10.6.
The total memory used by the code when run on the CPU alone is about 300 MB, so I fail to see why the Tesla C2050 GPU would run out of memory.
0: ALLOCATE: 2299968 bytes requested; not enough memory: 30(unknown error)
0: ALLOCATE: 1823360 bytes requested; not enough memory: 4(unspecified launch failure)
I perform some operations on the GPU before these errors occur, and the code where I get the errors is:
If I make the arrays allocatable, only the "not enough memory" part of the error goes away, and the code still exits at the allocation statement.
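For reference, the general shape of the pattern is something like the sketch below (made-up names, not my actual solver). My understanding is that an automatic array sized by a kernel argument forces a per-thread device-side allocation from the device heap at launch, which can fail even when overall memory use is small:

```fortran
module solver_kernels
  use cudafor
contains
  attributes(global) subroutine smooth(n, x)
    integer, value :: n
    real :: x(n)
    ! Automatic array: its size depends on the argument n, so it is
    ! allocated on the device at kernel launch rather than at compile time.
    real :: work(n)
    ! ... kernel body operating on x and work ...
  end subroutine smooth
end module solver_kernels
```

Declaring `work` allocatable and allocating it explicitly is the change I tried; that removes the "not enough memory" text but the run still fails at the allocation.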
Is there something wrong in what I am doing here, or is it an issue with the latest compiler?