cuDNN for 3D convolution

I am using CUDA 8.0 and cuDNN v7.0.5 to accelerate standard convolution of volumetric images. I am taking a 3-dimensional image (2048 x 2048 x 141) and convolving it with a 3-dimensional filter (20 x 20 x 20). This is simply a GPU speedup of the standard convn-style convolution routines in Python, MATLAB, etc., so the output size should be the same as the input (2048 x 2048 x 141). From the examples and documentation, I believe I should be using 4D tensors to do 3D convolutions, so I have initialized the input and filter descriptors as follows, using Nd descriptors in NCHW format with a leading dimension of size 1:

int convDim = 3;

// Input and filter dimensions as described above
int dimA[3] = {2048, 2048, 141};
int filterdimA[3] = {20, 20, 20};

// Set the padding such that the output size (feature map) matches the size
// of the input with strides of 1 (standard convolution)
int padA[3] = {0, 0, 0};
for (int i = 0; i < convDim; i++) {
    padA[i] = (filterdimA[i] - 1) / 2;
    printf("Dim: %d, padding: %d\n", i, padA[i]);
}
int convstrideA[3] = {1, 1, 1};
int dilationA[3] = {1, 1, 1};
int dimA_four[4] = {1, dimA[0], dimA[1], dimA[2]};
int filterdimA_four[4] = {1, filterdimA[0], filterdimA[1], filterdimA[2]};
int strideA[4] = {1, 1, 1, 1};

cudnnSetTensorNdDescriptor(cudnnIdesc, CUDNN_DATA_FLOAT, convDim+1, dimA_four, strideA);

cudnnSetFilterNdDescriptor(cudnnFdesc, CUDNN_DATA_FLOAT, filterFormat, convDim+1, filterdimA_four);

cudnnSetConvolutionNdDescriptor(cudnnConvDesc, convDim, padA, convstrideA, dilationA, CUDNN_CONVOLUTION, CUDNN_DATA_FLOAT);

To test the output dimensions I used:

int tensorOutputDimA[4];
cudnnGetConvolutionNdForwardOutputDim(cudnnConvDesc, cudnnIdesc, cudnnFdesc, convDim+1, tensorOutputDimA);

However, cudnnGetConvolutionNdForwardOutputDim has been returning CUDNN_STATUS_BAD_PARAM. The cuDNN 7 documentation for this function says that the dimensionality of the convolution must be two less than the dimensionality of the input tensor descriptor. Does this mean I should be putting my input, filter, and output descriptors into 5D arrays instead of 4D, even though the convolution is only conducted over 3 dimensions?
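
In case it helps clarify what I am asking, here is a rough sketch of what I think the 5D (NCDHW) setup would look like, with both the batch and channel dimensions set to 1. The _five variable names are just for illustration, and I am not sure this is the right approach:

// Untested sketch of a possible 5D (NCDHW) setup, assuming N = 1 and C = 1
int dimA_five[5] = {1, 1, dimA[0], dimA[1], dimA[2]};                          // N, C, D, H, W
int filterdimA_five[5] = {1, 1, filterdimA[0], filterdimA[1], filterdimA[2]};  // K, C, D, H, W

// Fully packed strides for the 5D input tensor (innermost dimension contiguous)
int strideA_five[5];
strideA_five[4] = 1;
for (int i = 3; i >= 0; i--) {
    strideA_five[i] = strideA_five[i + 1] * dimA_five[i + 1];
}

cudnnSetTensorNdDescriptor(cudnnIdesc, CUDNN_DATA_FLOAT, 5, dimA_five, strideA_five);
cudnnSetFilterNdDescriptor(cudnnFdesc, CUDNN_DATA_FLOAT, filterFormat, 5, filterdimA_five);

// The convolution descriptor itself would presumably stay 3-dimensional
cudnnSetConvolutionNdDescriptor(cudnnConvDesc, convDim, padA, convstrideA, dilationA, CUDNN_CONVOLUTION, CUDNN_DATA_FLOAT);

int tensorOutputDimA_five[5];
cudnnGetConvolutionNdForwardOutputDim(cudnnConvDesc, cudnnIdesc, cudnnFdesc, 5, tensorOutputDimA_five);

Is that (tensor descriptors of dimension convDim + 2) what the documentation is describing, or am I misreading it?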

Thank you for the help.