Optimization available for large 3D constant arrays


I have these large arrays of constants. On the CPU they were originally 3D, but I made them 1D on the GPU.

Their access patterns exhibit 3D spatial locality, but they are only accessed sparsely (each individual slot of the array is read only 1-2 times). They are very large, up to 200x200x200 floating point numbers or more. They are constant, but their dimensions are read in from files and are not known at compile time.

My question is: what optimizations are available and effective for this type of data? Also, is there any optimization for data that is very similar to this, but not constant?

Thank you.

ArrayFire makes indexing into 3D GPU arrays really simple. Have you looked at it? It is free to use on a single GPU, well-suited to the data sizes you mention, and handles the case of not knowing the exact data sizes beforehand (ArrayFire makes runtime decisions that optimize for the incoming data size). Good luck!