# Finding the average of a vector using the GPU

Let's say I have a vector X = [1,2,3,4,5,6,7,8,9,10]. How do I find the average of X over every 5 elements? The output would be Y = [3, 8]: 3 is the mean of [1,2,3,4,5], and so on.

I want to use cuDNN batch normalization, but it seems we have to estimate the mean and variance of the input ourselves before calling cudnnBatchNormalizationForwardInference: https://docs.nvidia.com/deeplearning/sdk/cudnn-developer-guide/index.html#cudnnBatchNormalizationForwardInference

> Try using the reduce function (`cudnnCreateReduceTensorDescriptor`). Put it in an NHWC tensor shaped [1,1,2,5], then reduce it to a vector shaped [1,1,2,1] using `CUDNN_REDUCE_TENSOR_AVG`.
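A minimal sketch of that suggestion, assuming cuDNN 7+ and a working CUDA device (error checking omitted for brevity). Since N = C = 1 here, the memory layout of a [1,1,2,5] tensor is the same in NCHW and NHWC, so an NCHW descriptor is used; reducing along W averages each row of five:

```cpp
#include <cudnn.h>
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    cudnnHandle_t handle;
    cudnnCreate(&handle);

    // Input: 10 floats viewed as a [1,1,2,5] tensor (two rows of five).
    float hX[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    float *dX, *dY;
    cudaMalloc(&dX, 10 * sizeof(float));
    cudaMalloc(&dY, 2 * sizeof(float));
    cudaMemcpy(dX, hX, 10 * sizeof(float), cudaMemcpyHostToDevice);

    cudnnTensorDescriptor_t aDesc, cDesc;
    cudnnCreateTensorDescriptor(&aDesc);
    cudnnCreateTensorDescriptor(&cDesc);
    cudnnSetTensor4dDescriptor(aDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               1, 1, 2, 5);
    // Output [1,1,2,1]: W is reduced, so each row collapses to its mean.
    cudnnSetTensor4dDescriptor(cDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               1, 1, 2, 1);

    cudnnReduceTensorDescriptor_t reduceDesc;
    cudnnCreateReduceTensorDescriptor(&reduceDesc);
    cudnnSetReduceTensorDescriptor(reduceDesc, CUDNN_REDUCE_TENSOR_AVG,
                                   CUDNN_DATA_FLOAT, CUDNN_PROPAGATE_NAN,
                                   CUDNN_REDUCE_TENSOR_NO_INDICES,
                                   CUDNN_32BIT_INDICES);

    // cuDNN may need scratch space for the reduction.
    size_t wsBytes = 0;
    cudnnGetReductionWorkspaceSize(handle, reduceDesc, aDesc, cDesc, &wsBytes);
    void *ws = nullptr;
    cudaMalloc(&ws, wsBytes);

    const float alpha = 1.0f, beta = 0.0f;
    cudnnReduceTensor(handle, reduceDesc,
                      nullptr, 0,        // no indices needed for AVG
                      ws, wsBytes,
                      &alpha, aDesc, dX,
                      &beta, cDesc, dY);

    float hY[2];
    cudaMemcpy(hY, dY, 2 * sizeof(float), cudaMemcpyDeviceToHost);
    printf("%g %g\n", hY[0], hY[1]);  // expected: 3 8

    cudaFree(ws); cudaFree(dX); cudaFree(dY);
    cudnnDestroyReduceTensorDescriptor(reduceDesc);
    cudnnDestroyTensorDescriptor(aDesc);
    cudnnDestroyTensorDescriptor(cDesc);
    cudnnDestroy(handle);
    return 0;
}
```

The same pattern generalizes: to average every k elements of an n-element vector, describe the input as [1,1,n/k,k] and the output as [1,1,n/k,1].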