Hardware: NVIDIA T4 GPU
Operating System: Docker (Riva Server image)
Riva Version: 1.6.0
I have a Riva ASR model deployed and am successfully performing streaming ASR over gRPC from Node.js.
However, I intermittently get the following error:
E1105 15:57:29.546823 7732 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 1714898170 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
Before streaming any audio, I make sure the server is ready, and the streaming config request is always the first message I send on the stream.
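For reference, this is the message ordering my client follows. The field names (`streaming_config`, `audio_content`, etc.) are from `riva_asr.proto`; the `buildRequests` helper and the specific config values are just my own illustration of the pattern, not Riva client code:

```javascript
// Sketch of the request ordering on the StreamingRecognize duplex stream:
// the first message carries only streaming_config, and every subsequent
// message carries only audio_content. buildRequests is an illustrative
// helper, not part of the Riva API.
function buildRequests(recognitionConfig, audioChunks) {
  const first = {
    streaming_config: {
      config: recognitionConfig, // encoding, sample_rate_hertz, etc.
      interim_results: true,
    },
  };
  const rest = audioChunks.map((chunk) => ({ audio_content: chunk }));
  return [first, ...rest];
}

// Example config matching what the server logs report for my stream
// (RAW/LINEAR_PCM, 16 kHz, mono); language_code here is a placeholder.
const requests = buildRequests(
  {
    encoding: 'LINEAR_PCM',
    sample_rate_hertz: 16000,
    language_code: 'en-US',
    audio_channel_count: 1,
  },
  [Buffer.alloc(3200), Buffer.alloc(3200)]
);

// The requests are then written to the gRPC call in this order:
// requests.forEach((r) => call.write(r)); call.end();
```

So from the client side, the config message should always be first on each new stream.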
Is it possible this is a server-side error caused by some sequencing issue?
Below is a longer excerpt of the server logs before and after the error:
I1105 15:56:23.809695 10552 grpc_riva_asr.cc:870] Using model Cn-SpeUni256-EaTl380-mn2 for inference
I1105 15:56:23.809767 10552 grpc_riva_asr.cc:886] Model sample rate= 16000 for inference
I1105 15:56:24.383030 10552 riva_asr_stream.cc:219] Detected format: encoding = 1 RAW numchannels = 1 samplerate = 16000 bitspersample = 16
I1105 15:57:29.545948 7733 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.545979 10628 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.546018 7613 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.545966 10217 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.545991 10556 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.546125 10450 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.545979 10582 grpc_riva_asr.cc:590] Send silence buffer for EOS
I1105 15:57:29.546255 8730 grpc_riva_asr.cc:590] Send silence buffer for EOS
E1105 15:57:29.546823 7732 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 1714898170 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
E1105 15:57:29.546990 7611 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 229575147 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
E1105 15:57:29.547165 10216 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 2009964687 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
E1105 15:57:29.547258 10449 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 2086559401 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
E1105 15:57:29.547461 10581 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 554894007 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
E1105 15:57:29.547472 8728 libriva_asr.cc:141] The inference failed: in ensemble 'Cn-SpeUni256-EaTl380-mn2', inference request for sequence 149722295 to model 'Cn-SpeUni256-EaTl380-mn2-feature-extractor-streaming' must specify the START flag on the first request of the sequence
I1105 15:57:29.564607 10600 grpc_riva_asr.cc:975] ASRService.StreamingRecognize returning OK
I1105 15:57:29.564905 10323 grpc_riva_asr.cc:975] ASRService.StreamingRecognize returning OK
I1105 15:58:09.361279 10921 grpc_riva_asr.cc:799] ASRService.StreamingRecognize called.
I1105 15:58:09.361320 10921 grpc_riva_asr.cc:833] ASRService.StreamingRecognize performing streaming recognition with sequence id: 720730861