I am using the cuDNN convolution operation and was wondering whether the API has a configuration to wrap (circularly pad) the horizontal dimension instead of zero-padding it?
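As far as I know, the cuDNN convolution descriptor only supports implicit zero padding, so the usual workaround is to wrap-pad the input yourself (e.g. copy the rightmost columns to the left edge and vice versa) and then run the convolution with the descriptor's padding set to 0. A minimal NumPy sketch of the idea, using a 1D cross-correlation (which is what cuDNN computes by default with `CUDNN_CROSS_CORRELATION`); the function name and shapes are illustrative, not part of the cuDNN API:

```python
import numpy as np

def circular_conv1d(x, k):
    """Cross-correlate x with k using wrap (circular) padding,
    emulated by padding the input manually and then running a
    'valid' (zero-padding = 0) convolution over the padded signal."""
    pad = len(k) // 2
    # 'wrap' copies values from the opposite edge instead of inserting zeros
    xp = np.pad(x, (pad, pad), mode="wrap")
    return np.array([np.dot(xp[i:i + len(k)], k) for i in range(len(x))])

x = np.array([1.0, 2.0, 3.0, 4.0])
k = np.array([1.0, 0.0, -1.0])
print(circular_conv1d(x, k))  # -> [ 2. -2. -2.  2.]
```

On the GPU you would do the same thing: launch a small kernel (or use `cudaMemcpy2D`) to build the horizontally wrapped buffer, then call the cuDNN convolution with `pad_w = 0` on the wider input, which yields an output of the original width.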