Here is the first pooling layer of the VGG-16 net. I changed the pooling kernel and stride to 3, but the output was full of zeros. I tried various combinations of kernel (K) and stride (S) sizes, and the output appears to be buggy:

K=2, S=2 : OK (exactly the same result as the Caffe output)

K=2, S=3,4,5 : ZERO (full of zeros)

K=3, S=2 : OK (difference from the Caffe output is around 10e-6 on average)

K=3, S=3,4,5 : ZERO

K=4, S=2,3,5 : OK

K=4, S=4 : OK

K=5, S=2,3,4,5 : ZERO
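For reference, as far as I know Caffe computes the pooled output size as `ceil((H + 2*pad - K) / S) + 1` (ceil, unlike the floor used for convolution). A quick sketch to check that every (K, S) combination above yields a valid, positive output size for the 300×300 feature map, so the zeros shouldn't be a shape problem:

```python
import math

def caffe_pool_out(h, k, s, pad=0):
    # Caffe-style pooling output size: ceil division, plus one.
    return int(math.ceil((h + 2 * pad - k) / float(s))) + 1

for k in (2, 3, 4, 5):
    for s in (2, 3, 4, 5):
        print("K=%d S=%d -> %d x %d" % (k, s,
              caffe_pool_out(300, k, s), caffe_pool_out(300, k, s)))
```

Every combination produces a sensible spatial size (e.g. K=3, S=3 gives 100×100), so the all-zero outputs don't line up with any obvious dimension mismatch.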

Here is the network prototxt used for the experiment:

```
name: "deploy"
state {
  phase: TEST
  level: 0
}
layer {
  name: "input"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 1
      dim: 3
      dim: 300
      dim: 300
    }
  }
}
layer {
  name: "vgg_conv_1"
  type: "Convolution"
  bottom: "data"
  top: "vgg_conv_1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "constant"
      value: 0
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1_1"
  type: "ReLU"
  bottom: "vgg_conv_1"
  top: "vgg_conv_1"
}
layer {
  name: "vgg_conv_2"
  type: "Convolution"
  bottom: "vgg_conv_1"
  top: "vgg_conv_2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1_2"
  type: "ReLU"
  bottom: "vgg_conv_2"
  top: "vgg_conv_2"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "vgg_conv_2"
  top: "output"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 3
  }
}
```
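To sanity-check the pooling result independently of Caffe, here is a minimal NumPy max-pooling sketch I would compare against (an assumption on my part: Caffe-style ceil output sizing, no padding, and partial windows clipped at the border):

```python
import math
import numpy as np

def max_pool2d(x, k, s):
    # x: (H, W) feature map; Caffe-style ceil output size, no padding.
    h, w = x.shape
    oh = int(math.ceil((h - k) / float(s))) + 1
    ow = int(math.ceil((w - k) / float(s))) + 1
    out = np.empty((oh, ow), dtype=x.dtype)
    for i in range(oh):
        for j in range(ow):
            hs, ws = i * s, j * s
            # Clip the window at the bottom/right border.
            window = x[hs:min(hs + k, h), ws:min(ws + k, w)]
            out[i, j] = window.max()
    return out
```

Feeding the `vgg_conv_2` blob (per channel) through this and diffing against the pooling layer's output should show whether the zeros come from the pooling implementation itself or from the data going into it.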

Does anyone have any idea what could cause this?