How to change chunk_size in quick start scripts?

Please provide the following information when requesting support.

Hardware - CPU
Operating System
Riva Version: 2.6

I’m doing audio-to-text transcription through the StreamingRecognize API, but I’m seeing a lot of latency. I want to use 160 ms chunks to reduce latency. Is it possible to configure this from the Riva quick start scripts, or do I have to build new pipelines with riva-build?

Hi @nharo

Thanks for your interest in Riva

Apologies, unfortunately the chunk size cannot be configured from the quick start scripts; it can only be set manually with the --chunk_size flag during the riva-build and riva-deploy process.

https://docs.nvidia.com/deeplearning/riva/user-guide/docs/asr/asr-pipeline-configuration.html
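For reference, a rough sketch of what that could look like (the model names, encryption key, and the other flags here are placeholders; the exact arguments for your acoustic model are listed in the documentation above):

```
# Inside the riva-servicemaker container: rebuild the ASR pipeline with a
# 160 ms chunk; other flags shown are illustrative and depend on your model.
riva-build speech_recognition \
    /servicemaker-dev/asr-streaming.rmir:<key> \
    /servicemaker-dev/<model>.riva:<key> \
    --name=asr-streaming-low-latency \
    --streaming=True \
    --decoder_type=greedy \
    --chunk_size=0.16        # chunk size in seconds (0.16 s = 160 ms)

# Then deploy the resulting .rmir with riva-deploy and restart the server.
```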

Thanks

I am trying audio-to-text transcription through the StreamingRecognize API, but it is never working. I have the container running as well as riva-servicemaker, and I can run the example notebooks perfectly fine.

Could you share how you did the transcription using chunks? I am trying to do it with a Flask-SocketIO audio buffer.

Thank you.

Hello, you can use the nvidia-riva/samples repository, which has examples of using the API from Python, or the Riva Websocket Bridge repository for JavaScript.
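To make that concrete, here is a rough, untested sketch of streaming chunked audio with the Python client, adapted from the patterns in nvidia-riva/samples. The server address, the queue bridge to Flask-SocketIO, and the exact class and method names are assumptions on my side; check them against the client version you have installed:

```python
# Sketch: feed audio chunks (e.g. from a Flask-SocketIO handler) to
# Riva StreamingRecognize via the nvidia-riva-client Python package.
import queue
import riva.client

auth = riva.client.Auth(uri="localhost:50051")   # Riva server address (assumption)
asr = riva.client.ASRService(auth)

streaming_config = riva.client.StreamingRecognitionConfig(
    config=riva.client.RecognitionConfig(
        encoding=riva.client.AudioEncoding.LINEAR_PCM,
        sample_rate_hertz=16000,   # must match the audio you send
        language_code="en-US",
        max_alternatives=1,
    ),
    interim_results=True,
)

# In a Flask-SocketIO app, the socket event handler would push raw PCM
# chunks into this queue; the generator below drains it for the gRPC stream.
audio_queue: "queue.Queue[bytes]" = queue.Queue()

def audio_chunks():
    while True:
        chunk = audio_queue.get()
        if chunk is None:          # sentinel to end the stream
            break
        yield chunk

# Sends the chunks over StreamingRecognize and yields responses
# containing interim and final transcripts.
for response in asr.streaming_response_generator(
    audio_chunks=audio_chunks(), streaming_config=streaming_config
):
    for result in response.results:
        if result.alternatives:
            print(result.alternatives[0].transcript, result.is_final)
```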
