Fast file I/O: need to read my data files fast enough to keep up with the GPU

I'm not familiar with C (I have been programming mostly in C#, Fortran, and SQL).

To make the best use of the awesome speed of the GPU I need faster file reading than my newbie C is giving me.
I have written code that reads binary files very nicely, but the code I have written for reading ASCII grids and time-series files seems slow.

So for text files such as this (this is only a tiny sample of one), is there a way to read the data grid that is significantly faster than the code below (but not too complicated)?

Thanks in advance for any help :)

---- sample file (the full file has many more columns and rows, and maybe more fields) ----
NCOLS 9117
NROWS 1219
Elem:WQD 19990101
127.625 127.6875 127.75 140.4375
793.75 125.0 6.25 18.125
17374.0625 4.375 2.5 12.5
17378.5625 4.375 125.0 62.5
3.6 3.7 3.9 4.1
0.1 0.1 -8.1 -9999.9

---- code ----
// NB another chunk of code reads the header //
int readGridF( float *h_grid, FILE *fp, long rows, long columns )
{
    for ( long rr = 0; rr < rows; rr++ )
        for ( long cc = 0; cc < columns; cc++ )
            fscanf( fp, "%f", &h_grid[rr * columns + cc] );
    return ferror( fp );
}

If you were working on a POSIX-compliant operating system, I would suggest using mmap() to map the file into memory and read directly from that, but you are using Windows and I am not sure how you would do that using the Win API.
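For what it's worth, here is a rough sketch of the mmap() approach on a POSIX system. It maps the file read-only and parses the floats out of the mapping with strtof(), so there is no stdio buffering or per-value fscanf overhead. The function name is made up, and it assumes a data-only file (your separate header-reading code would need to skip the header bytes first):

```c
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Sketch only: map the whole file and parse floats in one pass.
 * Assumes the file contains only whitespace-separated numbers.
 * NB: the mapping is not NUL-terminated by the C standard; this
 * relies on the usual zero-fill past EOF up to the page boundary,
 * which is fine for a sketch but worth guarding in real code. */
int readGridMmap(const char *path, float *h_grid, long rows, long columns)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0) return -1;

    struct stat st;
    if (fstat(fd, &st) < 0) { close(fd); return -1; }

    char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);                        /* mapping stays valid after close */
    if (data == MAP_FAILED) return -1;

    char *p = data;
    for (long i = 0; i < rows * columns; i++) {
        char *end;
        h_grid[i] = strtof(p, &end);  /* skips leading whitespace itself */
        if (end == p) { munmap(data, st.st_size); return -1; }
        p = end;
    }
    munmap(data, st.st_size);
    return 0;
}
```

Windows has an equivalent facility, but as noted above I won't guess at the Win API incantation here.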

I guess the key question is where line breaks appear in the file. You should get considerably better performance if you use something like fgets() to read a complete line at a time from the file into a string, and then parse the individual array entries from the string. Suggesting code to do that requires some a priori knowledge of where the line breaks are.
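As a sketch of what I mean, assuming one grid row per text line: read each line with fgets() and then walk the buffer with strtof(), which avoids fscanf's per-value format parsing. The function name and buffer size are illustrative; with 9117 columns you would need to size the line buffer generously:

```c
#include <stdio.h>
#include <stdlib.h>

#define LINE_BUF_LEN (1 << 18)  /* illustrative; size for the real column count */

/* Sketch only: one fgets() per grid row, then strtof() across the line. */
int readGridLines(FILE *fp, float *h_grid, long rows, long columns)
{
    static char line[LINE_BUF_LEN];  /* static: too big for the stack */

    for (long rr = 0; rr < rows; rr++) {
        if (fgets(line, sizeof line, fp) == NULL)
            return -1;               /* short file or read error */
        char *p = line;
        for (long cc = 0; cc < columns; cc++) {
            char *end;
            h_grid[rr * columns + cc] = strtof(p, &end);
            if (end == p)
                return -1;           /* fewer values on the line than expected */
            p = end;
        }
    }
    return 0;
}
```

If the line breaks do not fall on row boundaries (the sample above suggests they may not), the same strtof() loop still works, you just can't rely on one fgets() per row.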

In standard C, I would just read the entire file into memory at once (if it's under x GB! ;-) ), and then parse the content in place, keeping pointers to the relevant lines and avoiding copies of the data. That's totally portable, and can be really fast if the file fits in memory.
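A minimal sketch of that idea, using only standard C: slurp the file with fread(), NUL-terminate the buffer, and parse the floats in a single strtof() pass over it. The function name is made up, and like the code above it assumes a data-only file with the header already stripped:

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch only: read the whole file into one heap buffer, then parse
 * it in place.  Portable standard C; assumes the file fits in memory. */
int readGridWhole(const char *path, float *h_grid, long rows, long columns)
{
    FILE *fp = fopen(path, "rb");
    if (!fp) return -1;

    fseek(fp, 0, SEEK_END);
    long size = ftell(fp);
    rewind(fp);

    char *buf = malloc(size + 1);
    if (!buf || fread(buf, 1, size, fp) != (size_t)size) {
        free(buf);
        fclose(fp);
        return -1;
    }
    fclose(fp);
    buf[size] = '\0';                 /* so strtof never runs off the end */

    char *p = buf;
    for (long i = 0; i < rows * columns; i++) {
        char *end;
        h_grid[i] = strtof(p, &end);
        if (end == p) { free(buf); return -1; }
        p = end;
    }
    free(buf);
    return 0;
}
```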