Can cudnnConvolutionBiasActivationForward be performed in-place?

Per the documentation, cudnnConvolutionBiasActivationForward performs the following operation, where x, y, and z are data pointers:

y = act ( alpha1 * conv(x) + alpha2 * z + bias ).

My question is: is this call expected to always be successful if the operation is performed in-place? That is, can y and z be the same buffer? Further, can x, y, and z be the same buffer?

The documentation for cudnnConvolutionBiasActivationForward does not explicitly state that in-place operation is supported, though a few other API calls in the documentation are explicitly listed as supporting it. I was able to successfully call this function with x = y = z and obtain correct numerical outputs, but I am not certain whether this is guaranteed to work every time.
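For concreteness, here is a stripped-down sketch of the in-place pattern I mean (the wrapper function and parameter names are placeholders, not my exact code, and all descriptor, algorithm, and workspace setup is assumed to be done elsewhere just as for cudnnConvolutionForward):

```cpp
// Sketch only: the interesting part is that the same buffer is passed for z and y
// (and zDesc is simply yDesc), so the call runs in-place on the output tensor.
#include <cudnn.h>

cudnnStatus_t fusedConvBiasReluInPlace(
    cudnnHandle_t handle,
    const cudnnTensorDescriptor_t xDesc, const void* x,
    const cudnnFilterDescriptor_t wDesc, const void* w,
    const cudnnConvolutionDescriptor_t convDesc,
    cudnnConvolutionFwdAlgo_t algo,
    void* workspace, size_t workspaceBytes,
    const cudnnTensorDescriptor_t biasDesc, const void* bias,
    const cudnnActivationDescriptor_t actDesc,
    const cudnnTensorDescriptor_t yDesc, void* y)
{
    const float alpha1 = 1.0f;
    const float alpha2 = 1.0f;
    // In-place variant: z aliases y, so this computes
    // y = act(alpha1 * conv(x) + alpha2 * y + bias) into the same memory.
    return cudnnConvolutionBiasActivationForward(
        handle,
        &alpha1, xDesc, x, wDesc, w, convDesc, algo,
        workspace, workspaceBytes,
        &alpha2, /*zDesc=*/yDesc, /*z=*/y,
        biasDesc, bias, actDesc,
        yDesc, y);
}
```

The x = y = z case I tested is the same call with x also pointing at that buffer (and xDesc matching yDesc).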

If in-place operation is supported for this call, how, in general, can we tell whether a given call in the cuDNN library supports in-place operation?

Thanks!

Hi,
We are reviewing the documentation and will keep you updated.

Thanks

Hi nathan.
Can you share your parameters for the cudnnConvolutionBiasActivationForward function?
At the moment, I am getting cuDNN error 9 (CUDNN_STATUS_NOT_SUPPORTED) when cudnnConvolutionBiasActivationForward is called.

I also tried setting the z buffer to the x buffer, but it doesn't work.
My activation function is RELU.

If I don't need the z term, can I set alpha2 to 0?

I assume the other parameters are fine, because the cudnnConvolutionForward, cudnnAddTensor, and cudnnActivationForward functions work well on their own.
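For reference, this is roughly the pattern I am attempting (simplified; the real descriptor setup is done elsewhere, and the names here are placeholders rather than my exact code):

```cpp
// Sketch of my current attempt: z points at the x buffer, activation is RELU,
// and alpha2 is 0 because I don't actually need the z term.
// A call shaped like this is what returns status 9 (CUDNN_STATUS_NOT_SUPPORTED) for me.
#include <cudnn.h>

cudnnStatus_t tryFusedConvBiasRelu(
    cudnnHandle_t handle,
    const cudnnTensorDescriptor_t xDesc, const void* x,
    const cudnnFilterDescriptor_t wDesc, const void* w,
    const cudnnConvolutionDescriptor_t convDesc,
    cudnnConvolutionFwdAlgo_t algo,
    void* workspace, size_t workspaceBytes,
    const cudnnTensorDescriptor_t biasDesc, const void* bias,
    const cudnnActivationDescriptor_t reluDesc,
    const cudnnTensorDescriptor_t yDesc, void* y)
{
    const float alpha1 = 1.0f;
    const float alpha2 = 0.0f;
    // Question: with alpha2 == 0, is it OK for zDesc/z to alias xDesc/x like this,
    // or does z have to share the y descriptor and buffer instead?
    return cudnnConvolutionBiasActivationForward(
        handle,
        &alpha1, xDesc, x, wDesc, w, convDesc, algo,
        workspace, workspaceBytes,
        &alpha2, /*zDesc=*/xDesc, /*z=*/x,
        biasDesc, bias, reluDesc,
        yDesc, y);
}
```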

Can you give me some guidance?

Thank you