Memory Size

I have to write a CUDA function with a param of an array of bytes…
When I copy the array to the GPU, how do I know if there is
enough space on the GPU for the array?

There is a cuMemGetInfo function that lets you query the total and free memory on the device before you call cudaMalloc. (cuMemGetInfo is technically part of the driver API, but it can be used safely alongside the runtime API; the runtime API also has its own equivalent, cudaMemGetInfo, if you'd rather not mix the two.)
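A minimal sketch of that check using the runtime-API variant, cudaMemGetInfo. The helper name copyToDeviceChecked is just for illustration:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative helper: copy `nbytes` of host data to the GPU, but only
// after checking that enough free device memory is reported. Returns the
// device pointer, or nullptr if the data would not fit or a CUDA call fails.
void* copyToDeviceChecked(const void* host_data, size_t nbytes) {
    size_t free_bytes = 0, total_bytes = 0;
    if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess)
        return nullptr;

    // Note: `free_bytes` is an upper bound. Fragmentation can prevent a
    // single allocation of that full size, so cudaMalloc may still fail
    // even when this check passes.
    if (nbytes > free_bytes) {
        fprintf(stderr, "need %zu bytes, but only %zu of %zu are free\n",
                nbytes, free_bytes, total_bytes);
        return nullptr;
    }

    void* dev_ptr = nullptr;
    if (cudaMalloc(&dev_ptr, nbytes) != cudaSuccess)
        return nullptr;

    if (cudaMemcpy(dev_ptr, host_data, nbytes,
                   cudaMemcpyHostToDevice) != cudaSuccess) {
        cudaFree(dev_ptr);
        return nullptr;
    }
    return dev_ptr;
}
```

Because of the fragmentation caveat noted in the comments, it is still good practice to check the return value of cudaMalloc itself rather than relying on the free-memory query alone.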