Fully connected layer using the cuDNN library

You don’t need to use another library, although performance might be better with one. Just set up a convolution whose filter is the same size as its input. For example, if you have a 4D NCHW tensor with dims [4, 1, 28, 28], set the 2D convolution to a stride of [1, 1], a dilation of [1, 1], and a padding of [0, 0]. The filter dims will be [x, 1, 28, 28], and the output of the convolution will be [4, x, 1, 1]. The next convolution uses the same settings, but this time the filter will be [y, x, 1, 1]. After that, with the same convolution settings again, the filter will be [z, y, 1, 1]. So on and so forth.
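As a sanity check of the idea (a NumPy stand-in rather than actual cuDNN calls; the output width `x_out = 64` and the random data are arbitrary choices for illustration), a convolution whose filter spans the whole input produces exactly the same numbers as a flatten-then-matrix-multiply fully connected layer:

```python
import numpy as np

# Hypothetical sizes: batch 4, 1 channel, 28x28 input, x = 64 output features.
N, C, H, W = 4, 1, 28, 28
x_out = 64

inp = np.random.randn(N, C, H, W).astype(np.float32)
filt = np.random.randn(x_out, C, H, W).astype(np.float32)  # filter dims [x, 1, 28, 28]

# With kernel size == input size, stride [1,1], padding [0,0], dilation [1,1],
# each filter overlaps the input exactly once, so the conv output is [N, x, 1, 1].
# (Cross-correlation, i.e. no kernel flip, matching cuDNN's usual mode.)
conv_out = np.einsum('nchw,kchw->nk', inp, filt).reshape(N, x_out, 1, 1)

# The same computation written as a fully connected layer: flatten and multiply.
fc_out = inp.reshape(N, -1) @ filt.reshape(x_out, -1).T

assert np.allclose(conv_out.reshape(N, x_out), fc_out, atol=1e-4)
```

Since the two give identical results, each filter plays the role of one row of the FC weight matrix.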

Recap.

Convolution 2D settings will always be: stride [1,1], padding [0,0], dilation [1,1].

PL == previous layer

Filter NCHW dims will be: [ (# of output channels), (# of channels of PL), (H of PL), (W of PL) ].
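You can verify that these settings really collapse the spatial dims to 1x1 with the standard convolution output-size formula (the same arithmetic cuDNN's output-dim query performs; the helper name below is my own):

```python
def conv2d_out_dim(in_dim, kernel, stride=1, pad=0, dilation=1):
    # Standard 2D convolution output-size formula, applied per spatial dim.
    return (in_dim + 2 * pad - dilation * (kernel - 1) - 1) // stride + 1

# First layer: input [4, 1, 28, 28], filter [x, 1, 28, 28]
print(conv2d_out_dim(28, 28))  # 1  -> output is [4, x, 1, 1]

# Every later layer: input [N, prev, 1, 1], filter [next, prev, 1, 1]
print(conv2d_out_dim(1, 1))    # 1  -> output stays [N, next, 1, 1]
```

Because the kernel always equals the input's H and W, every layer in the chain yields a [N, channels, 1, 1] tensor, which is exactly the shape of a fully connected layer's activations.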