Why doesn't cudnnRNNForwardInferenceEx support zero-length sequences inside the batch?

We use cudnnRNNForwardInferenceEx in the OnnxRuntime RNN/GRU/LSTM CUDA implementation. We recently ran into an issue with a user model whose seqLengthArray is dynamic and can sometimes contain 0: the call to cudnnSetRNNDataDescriptor then fails with CUDNN_STATUS_BAD_PARAM. After checking the cuDNN docs, it seems zero-length sequences are simply not supported, which is quite strange. The API accepts shorter sequences (length < maxSeqLength), so why not length 0?
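
For reference, a minimal sketch of the failing call, assuming cuDNN 8.x headers; the sizes below are illustrative and not taken from the actual user model:

```cpp
#include <cudnn.h>
#include <cstdio>

int main() {
    cudnnRNNDataDescriptor_t dataDesc = nullptr;
    cudnnStatus_t status = cudnnCreateRNNDataDescriptor(&dataDesc);
    if (status != CUDNN_STATUS_SUCCESS) {
        std::printf("create failed: %s\n", cudnnGetErrorString(status));
        return 1;
    }

    const int maxSeqLength = 4;
    const int batchSize = 3;
    const int vectorSize = 8;
    // The second sequence in the batch is empty.
    int seqLengthArray[] = {4, 0, 2};
    float paddingFill = 0.0f;

    status = cudnnSetRNNDataDescriptor(
        dataDesc,
        CUDNN_DATA_FLOAT,
        CUDNN_RNN_DATA_LAYOUT_BATCH_MAJOR_UNPACKED,
        maxSeqLength,
        batchSize,
        vectorSize,
        seqLengthArray,
        &paddingFill);

    std::printf("cudnnSetRNNDataDescriptor: %s\n", cudnnGetErrorString(status));

    cudnnDestroyRNNDataDescriptor(dataDesc);
    return status == CUDNN_STATUS_SUCCESS ? 0 : 1;
}
```

The call fails as soon as any entry of seqLengthArray is 0, which matches the documented conditions: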

CUDNN_STATUS_BAD_PARAM
Any one of these have occurred:

RNNDataDesc is NULL.
Any one of maxSeqLength, batchSize, or vectorSize is less than or equal to zero.
An element of seqLengthArray is less than or equal to zero or greater than maxSeqLength.
Layout is not one of CUDNN_RNN_DATA_LAYOUT_SEQ_MAJOR_UNPACKED, CUDNN_RNN_DATA_LAYOUT_SEQ_MAJOR_PACKED, or CUDNN_RNN_DATA_LAYOUT_BATCH_MAJOR_UNPACKED.
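
Those conditions translate directly into a pre-check we could run on the OnnxRuntime side before touching cuDNN, so the error surfaces earlier with a clearer message. A sketch (the helper name is hypothetical, not an existing ORT or cuDNN function):

```cpp
// Mirrors the documented CUDNN_STATUS_BAD_PARAM conditions for
// cudnnSetRNNDataDescriptor. Note that length 0 is rejected along with
// negative lengths, which is exactly the surprising part.
bool IsValidRNNDataShape(int maxSeqLength, int batchSize, int vectorSize,
                         const int* seqLengthArray) {
    if (maxSeqLength <= 0 || batchSize <= 0 || vectorSize <= 0) return false;
    for (int b = 0; b < batchSize; ++b) {
        if (seqLengthArray[b] <= 0 || seqLengthArray[b] > maxSeqLength) return false;
    }
    return true;
}
```

But such a check only detects the problem earlier; it doesn't explain why an empty sequence is disallowed when shorter-than-max sequences are fine.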