Nvidia Riva 2.6.0 getting stuck after some time, returning timeout errors after a period of inferencing

W1006 14:43:10.010982 15596 grpc_riva_asr.cc:795] Response timeout. requests sent: 3 received: 0
E1006 14:43:10.011229 15440 grpc_riva_asr.cc:1089] ASRService.StreamingRecognize returning failure

Hi @sumeet.tiwari

Thanks for your interest in Riva

For initial triage, could you kindly share:

  1. the script/notebook/command used
  2. the complete output of the command docker logs riva-speech
  3. the config.sh file used

Thanks

I also saw these logs in the output of docker logs riva-speech:

I1012 23:05:55.424243 1359044 grpc_riva_asr.cc:919] ASRService.StreamingRecognize performing streaming recognition with sequence id: 1833129294
I1012 23:05:55.424638 1359044 grpc_riva_asr.cc:976] Using model citrinet-512-fil-PH-asr-online for inference
I1012 23:05:55.424770 1359044 grpc_riva_asr.cc:992] Model sample rate= 16000 for inference
I1012 23:05:56.145015 1359044 riva_asr_stream.cc:214] Detected format: encoding = 1 RAW numchannels = 1 samplerate = 8000 bitspersample = 16
I1012 23:05:56.145568 1362272 grpc_riva_asr.cc:709] Creating resampler, audio file sample rate=8000 model sample_rate=16000
error: unable to run model inferencing: Stream has been closed.
error: unable to run model inferencing: Stream has been closed.
error: unable to run model inferencing: Stream has been closed.
I1012 23:05:57.585235 1362272 grpc_riva_asr.cc:701] Send silence buffer for EOS
error: unable to run model inferencing: Stream has been closed.
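Note the mismatch in the logs above: the detected audio format is 8000 Hz, while the model expects 16000 Hz, so Riva creates a server-side resampler. As a workaround while debugging, the audio could be resampled client-side before streaming. This is only a minimal pure-Python sketch using linear interpolation (not the Riva client API, and not how Riva's internal resampler works):

```python
import math

def resample_linear(samples, src_rate, dst_rate):
    """Resample a sequence of 16-bit PCM samples via linear interpolation."""
    if src_rate == dst_rate:
        return list(samples)
    ratio = src_rate / dst_rate          # input positions advanced per output sample
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * ratio
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        # Interpolate between the two nearest input samples
        out.append(int(samples[lo] * (1 - frac) + samples[hi] * frac))
    return out

# Example: a 100 ms, 440 Hz tone at 8 kHz, upsampled to the model's 16 kHz
src = [int(10000 * math.sin(2 * math.pi * 440 * t / 8000)) for t in range(800)]
dst = resample_linear(src, 8000, 16000)
print(len(dst))  # sample count doubles when going 8 kHz -> 16 kHz
```

For production use, a proper windowed-sinc resampler (e.g. from an audio library) would give better quality than linear interpolation; the point here is just to rule out the server-side resampling path while reproducing the timeout.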

I suspect there is a memory leak in the Riva 2.6.0 inference engine.
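While waiting for a fix, the timeout lines quoted earlier can be watched for programmatically, to detect when the service gets stuck. A minimal sketch that scans docker-logs output for the "Response timeout" line format shown above (the regex is my assumption based on the single quoted line, not an official log schema):

```python
import re

# Matches the timeout warning format seen in grpc_riva_asr.cc logs, e.g.:
# "W1006 14:43:10.010982 15596 grpc_riva_asr.cc:795] Response timeout. requests sent: 3 received: 0"
TIMEOUT_RE = re.compile(r"Response timeout\. requests sent: (\d+) received: (\d+)")

def find_timeouts(log_text):
    """Return (sent, received) pairs for every timeout line found in the log text."""
    return [(int(sent), int(received)) for sent, received in TIMEOUT_RE.findall(log_text)]

log = ("W1006 14:43:10.010982 15596 grpc_riva_asr.cc:795] "
       "Response timeout. requests sent: 3 received: 0")
print(find_timeouts(log))  # [(3, 0)]
```

Feeding this the output of docker logs riva-speech would let a watchdog restart the container (or at least alert) as soon as streams start timing out with zero responses received.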