TSIA.
It seems this is now the case: I noticed the model names are suffixed with *-streaming-offline when the --offline flag is passed. I would like to confirm, since it would significantly simplify our model deployment process.
I’m also curious about this. This release broke our offline inference due to the new duration limit on offline requests. To reproduce, run Recognize() on any file longer than 900 s (15 minutes). It looks like a true offline mode has now been added, whereas previously batch/offline jobs were served by the streaming pipeline with a large chunk size (though I’m not sure why that was called offline).
@rleary Is the way to get the previous “offline mode” simply to use the current streaming pretrained Citrinet model but with larger chunk sizes?
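As a possible workaround while the limit is in place, long files could be split into segments shorter than the cap and sent as separate Recognize() requests. Below is a minimal sketch of just the chunk-boundary math; it assumes the 900 s limit applies per request, and the actual Riva client call is omitted.

```python
# Hypothetical workaround sketch: compute per-request spans so each
# chunk of a long recording stays under the (assumed) 900 s offline
# inference limit. The Riva Recognize() call itself is not shown.

def chunk_spans(total_s: float, limit_s: float = 900.0):
    """Return (start, end) spans in seconds, each at most limit_s long."""
    spans = []
    start = 0.0
    while start < total_s:
        end = min(start + limit_s, total_s)
        spans.append((start, end))
        start = end
    return spans

# A 2000 s file needs three requests under a 900 s cap.
print(chunk_spans(2000.0))  # [(0.0, 900.0), (900.0, 1800.0), (1800.0, 2000.0)]
```

Splitting on exact boundaries can cut words in half, so in practice you would want some overlap or silence-based segmentation before feeding the chunks to the service.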
I think that model name should probably not include streaming; I will verify. You will need to generate RMIRs and deploy separate online and batch pipelines, since the models are optimized for different scenarios at deployment time.
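For readers following along, the two-pipeline setup described above might look roughly like this. This is a sketch only: the paths, RMIR names, and model filename are placeholders I made up, not the exact ones from this thread; only the riva-build/riva-deploy commands and the --offline flag come from the Riva ServiceMaker workflow.

```shell
# Hypothetical sketch: build two RMIRs from the same .riva model,
# one streaming (online) and one batch (offline), then deploy each
# into its own model repository. All paths are placeholders.

# Streaming (online) pipeline:
riva-build speech_recognition \
    /data/rmir/citrinet-streaming.rmir \
    /data/models/citrinet-1024.riva

# Batch (offline) pipeline, built from the same .riva file:
riva-build speech_recognition \
    /data/rmir/citrinet-offline.rmir \
    /data/models/citrinet-1024.riva \
    --offline

# Deploy each RMIR into a separate model repository:
riva-deploy /data/rmir/citrinet-streaming.rmir /data/models-streaming/
riva-deploy /data/rmir/citrinet-offline.rmir /data/models-offline/
```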
riva-trt-citrinet-1024-english-asr-offline-am-streaming-offline
This is what I get from building Citrinet with --offline. It looks like streaming is still included in the model filenames generated by riva-deploy.