NLP model token limits

What’s the status of token/sequence limits on NLP tasks as of Riva 1.10.0?

Has the Q/A sequence limit been raised?
The release notes don’t list it as a known limitation.

I’d appreciate an update on this.

Hi @ShantanuNair ,

Thanks for your interest in Riva, and apologies for the delayed response.

Most of our NLP models are BERT-based. We support a token limit of up to 512; any input longer than that is truncated to the first 384 tokens.
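To make the truncation behavior concrete, here is a minimal illustrative sketch (not Riva's actual API): a BERT-style pipeline that silently cuts an over-long token sequence down to a fixed maximum length. The function name and the token list are hypothetical; only the 384-token cutoff comes from the answer above.

```python
MAX_SEQ_LEN = 384  # truncation length mentioned above (hypothetical constant name)

def truncate_tokens(tokens, max_len=MAX_SEQ_LEN):
    """Keep only the first max_len tokens; anything beyond is dropped."""
    return tokens if len(tokens) <= max_len else tokens[:max_len]

# A 512-token input (the stated model limit) gets cut to 384 tokens.
long_input = [f"tok{i}" for i in range(512)]
truncated = truncate_tokens(long_input)
print(len(truncated))  # 384
```

The practical takeaway: if your Q/A context exceeds the limit, any answer located in the truncated tail will be unreachable, so long documents should be split into overlapping chunks before being sent to the model.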