How to use binary variables on the GPU?

I generate a binary array on the CPU using std::bitset<>, but when copying it to the GPU I get an error.
Do you have any idea how to use bits on the GPU?

The STL isn't supported in device code, so you need to work with the bits using a plain C approach: pack them into an array of plain integer words and test or set individual bits with shifts and masks.
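
A minimal sketch of that idea, assuming CUDA: the bits are packed into 32-bit words on the host (replacing std::bitset<>), the raw words are copied with cudaMemcpy, and the kernel reads individual bits with shifts and masks. Names like countSetBits and kNumBits are just illustrative.

```cpp
#include <cstdio>
#include <cstdint>
#include <vector>
#include <cuda_runtime.h>

__global__ void countSetBits(const uint32_t* words, int numBits, int* result)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numBits) {
        // Word index = i / 32, bit position within the word = i % 32.
        uint32_t word = words[i >> 5];
        if (word & (1u << (i & 31))) {
            atomicAdd(result, 1);          // count the bits that are set
        }
    }
}

int main()
{
    const int kNumBits  = 1000;
    const int kNumWords = (kNumBits + 31) / 32;

    // Build the bit array on the host as plain 32-bit words instead of
    // std::bitset<>, whose internal storage layout is not portable.
    std::vector<uint32_t> h_words(kNumWords, 0);
    for (int i = 0; i < kNumBits; i += 3) {           // set every third bit
        h_words[i >> 5] |= 1u << (i & 31);
    }

    uint32_t* d_words = nullptr;
    int*      d_count = nullptr;
    cudaMalloc(&d_words, kNumWords * sizeof(uint32_t));
    cudaMalloc(&d_count, sizeof(int));
    cudaMemcpy(d_words, h_words.data(), kNumWords * sizeof(uint32_t),
               cudaMemcpyHostToDevice);
    cudaMemset(d_count, 0, sizeof(int));

    countSetBits<<<(kNumBits + 255) / 256, 256>>>(d_words, kNumBits, d_count);

    int h_count = 0;
    cudaMemcpy(&h_count, d_count, sizeof(int), cudaMemcpyDeviceToHost);
    printf("set bits: %d\n", h_count);

    cudaFree(d_words);
    cudaFree(d_count);
    return 0;
}
```

If you already have the data in a std::bitset<> on the host, you can copy it bit by bit into such a word array before the cudaMemcpy; the device side then only ever sees plain unsigned integers.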