Hardware - NVIDIA GeForce RTX 3060 Laptop GPU
Hardware - 11th Gen Intel(R) Core™ i7-11800H @ 2.30GHz (16 CPUs), ~2.3GHz
Operating System - Ubuntu 20.04.3 LTS
Riva Version 1.10
How to reproduce the issue?
I had problems with the installation, so I followed the recommendations from another forum thread:
- Comment out the models that will not be used in the config.sh file (details on where config.sh is located are available in the Quick Start guide)
- In my case I leave only the ASR service enabled, because I work with “es-US”
I run bash riva_clean.sh to delete the previous images and volumes that I had downloaded and created.
I attach my configuration file
config.sh (8,6 KB)
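For reference, the change described above amounts to something like the following config.sh fragment. This is a hedged sketch: the variable names are recalled from the Riva 1.x Quick Start scripts and should be verified against your own copy of config.sh.

```shell
# Hedged sketch of config.sh: enable only the ASR service.
# Variable names recalled from the Riva 1.x Quick Start; verify against
# your own config.sh before relying on them.
service_enabled_asr=true
service_enabled_nlp=false
service_enabled_tts=false
```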
I run bash riva_init.sh and got the following output:
riva_init_log.txt (22,1 MB)
I run bash riva_start.sh and got the following output:
riva_start_log.txt (2,2 KB)
The last line of the riva_start output tells me “Check Riva logs with: docker logs riva-speech”, but I don’t have any container with that name.
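One way to narrow this down: the container may have been created and then exited immediately, or it may never have been created at all because riva_init.sh failed. A hedged sketch of a check follows; `container_status` is a hypothetical helper of my own, not part of the Riva scripts, and it reads the container name list on stdin so the logic can be tried without a live Docker daemon.

```shell
# Check whether a riva-speech container exists, including stopped ones.
# container_status is a hypothetical helper (not part of Riva); it reads
# container names, one per line, on stdin and reports an exact-name match.
container_status() {
  if grep -qx 'riva-speech'; then echo "present"; else echo "absent"; fi
}

# On a live host (note -a, which includes exited containers):
#   docker ps -a --format '{{.Names}}' | container_status
#   docker logs riva-speech   # only meaningful when "present"
```

If the container is absent even from `docker ps -a`, the `docker logs riva-speech` hint in the riva_start.sh message has nothing to show, which matches what you observed and points back to a failure in riva_init.sh.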
Any suggestions to get Riva 1.10 working properly?
I think the laptop GPU has insufficient VRAM.
IIRC, in a previous post someone stated that VRAM usage can go up to 12 GB even after commenting out certain models.
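A quick way to compare your card against that (remembered, unofficial) 12 GB figure is to read the total VRAM from nvidia-smi. The `vram_sufficient` helper below is my own sketch; it takes the MiB value on stdin so the comparison can be exercised without a GPU.

```shell
# Hedged sketch: compare total VRAM (in MiB) against a 12 GB (12288 MiB)
# figure recalled from a forum post -- not an official Riva requirement.
# vram_sufficient is a hypothetical helper, not part of the Riva scripts.
vram_sufficient() {
  read -r mib
  if [ "$mib" -ge 12288 ]; then
    echo "likely sufficient"
  else
    echo "likely insufficient"
  fi
}

# On a live host:
#   nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits | vram_sufficient
```

An RTX 3060 Laptop GPU typically ships with 6 GB of VRAM, which would come out "likely insufficient" under this rough check.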
Same problem with Riva 1.10 on Titan RTX …
SOLVED: run riva_clean.sh and install again. Worked for me.
Hi @leortiz,
Thanks for your interest in Riva, and thanks also to @200857g and @vanessa.crosby for your valuable inputs.
Thanks for sharing the logs and config.sh.
It looks like the execution of riva_init.sh failed with an error; that’s why riva_start.sh didn’t work.
I will check the riva_init.sh error with the team. My guess is that it might be due to GPU memory, but I will confirm the cause of the error shortly.
In the meantime, a quick thing to try:
- In config.sh we have two ASR models. Can we try with only one of them and check if it works, by commenting out one of the models with # as highlighted in the screenshot below?
- Then run bash riva_init.sh and continue the flow (also share the logs again if unsuccessful).
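For illustration, commenting out one entry of the models_asr array might look like the fragment below. The model identifiers are placeholders, not the exact entries from the attached config.sh; keep whichever real entry matches the deployment you need (e.g. streaming vs. offline).

```shell
# Hedged sketch of config.sh: keep one ASR model, comment the other out
# with '#'. The <org>/<team>/<model>:<version> strings are placeholders
# for the real NGC entries in your config.sh, not actual model names.
models_asr=(
    "<org>/<team>/<streaming_es_us_model>:<version>"
#    "<org>/<team>/<offline_es_us_model>:<version>"
)
```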
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.