Hello,
Could you give me some advice on how to start the Riva speech server locally or in Docker? Thanks!
Hardware: Jetson Thor
Software: JetPack SDK 7.0
Software part of jetson-stats 4.3.2 - (c) 2024, Raffaello Bonghi
Jetpack missing!
- Model: NVIDIA Jetson AGX Thor Developer Kit
- L4T: 38.2.1
NV Power Mode[0]: MAXN
Serial Number: [XXX Show with: jetson_release -s XXX]
Hardware: - P-Number: p3834-0008
- Module: Not available
Platform: - Distribution: Ubuntu 24.04 Noble Numbat
- Release: 6.8.12-tegra
jtop: - Version: 4.3.2
- Service: Active
Libraries: - CUDA: 13.0.48
- cuDNN: 9.12.0.46
- TensorRT: 10.13.3.9
- VPI: 4.0.0~er5
- Vulkan: 1.4.304
- OpenCV: 4.8.0 - with CUDA: NO
The issue when running the Riva speech server:
I followed the riva_quickstart_arm64_2.19.0 guide, but the Riva server won't start.
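For reference, this is roughly what I ran (a sketch; I kept config.sh at its defaults, and riva-speech is the quickstart's default container name, so details may differ from your setup):

```bash
# Download the arm64 quickstart with the NGC CLI (matching my Riva version).
ngc registry resource download-version "nvidia/riva/riva_quickstart_arm64:2.19.0"
cd riva_quickstart_arm64_v2.19.0   # downloaded directory name, may differ

# config.sh left at the defaults (ASR + TTS, en-US models).

# Pull the riva-speech container and deploy the models to the model repository.
bash riva_init.sh

# Start the server (launches Triton inside the riva-speech container).
bash riva_start.sh

# Follow the startup log.
docker logs -f riva-speech
```

The startup log shows: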
> Riva waiting for Triton server to load all models...retrying in 1 second
I1023 01:24:34.833186 53 pinned_memory_manager.cc:277] "Pinned memory pool is created at '0x207e00000' with size 268435456"
I1023 01:24:34.840746 53 cuda_memory_manager.cc:107] "CUDA memory pool is created on device 0 with size 1000000000"
I1023 01:24:34.868128 53 model_lifecycle.cc:472] "loading: conformer-en-US-asr-streaming-asr-bls-ensemble:1"
I1023 01:24:34.868169 53 model_lifecycle.cc:472] "loading: riva-onnx-fastpitch_encoder-English-US:1"
I1023 01:24:34.868206 53 model_lifecycle.cc:472] "loading: riva-punctuation-en-US:1"
I1023 01:24:34.868226 53 model_lifecycle.cc:472] "loading: riva-trt-conformer-en-US-asr-streaming-am-streaming:1"
I1023 01:24:34.868246 53 model_lifecycle.cc:472] "loading: riva-trt-hifigan-English-US:1"
I1023 01:24:34.868262 53 model_lifecycle.cc:472] "loading: riva-trt-riva-punctuation-en-US-nn-bert-base-uncased:1"
I1023 01:24:34.868277 53 model_lifecycle.cc:472] "loading: spectrogram_chunker-English-US:1"
I1023 01:24:34.868292 53 model_lifecycle.cc:472] "loading: tts_postprocessor-English-US:1"
I1023 01:24:34.868315 53 model_lifecycle.cc:472] "loading: tts_preprocessor-English-US:1"
I1023 01:24:34.869711 53 onnxruntime.cc:2718] "TRITONBACKEND_Initialize: onnxruntime"
I1023 01:24:34.869739 53 onnxruntime.cc:2728] "Triton TRITONBACKEND API version: 1.19"
I1023 01:24:34.869745 53 onnxruntime.cc:2734] "'onnxruntime' TRITONBACKEND API version: 1.16"
I1023 01:24:34.869751 53 onnxruntime.cc:2764] "backend configuration:\n{\"cmdline\":{\"auto-complete-config\":\"false\",\"backend-directory\":\"/opt/tritonserver/backends\",\"min-compute-capability\":\"5.300000\",\"default-max-batch-size\":\"4\"}}"
E1023 01:24:34.886243 53 model_lifecycle.cc:642] "failed to load 'riva-trt-conformer-en-US-asr-streaming-am-streaming' version 1: Not found: unable to load shared library: /usr/lib/aarch64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.32' not found (required by /usr/lib/aarch64-linux-gnu/nvidia/libnvdla_runtime.so)"
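To narrow this down, here is a quick check of which GLIBCXX versions libstdc++.so.6 exposes on the host versus inside the container (a sketch; riva-speech is the quickstart's default container name and may differ in your setup):

```bash
# GLIBCXX versions provided by the host's libstdc++ (run on the Jetson itself).
strings /usr/lib/aarch64-linux-gnu/libstdc++.so.6 | grep '^GLIBCXX' | sort -V | tail -n 3

# Same check inside the running Riva container (container name assumed: riva-speech).
docker exec riva-speech bash -c \
  "strings /usr/lib/aarch64-linux-gnu/libstdc++.so.6 | grep '^GLIBCXX' | sort -V | tail -n 3"

# The error above says /usr/lib/aarch64-linux-gnu/nvidia/libnvdla_runtime.so requires
# GLIBCXX_3.4.32, so the container's libstdc++ is presumably older than that.
```

My reading is that libnvdla_runtime.so, which appears to be mounted in from the host (L4T 38.2), needs a newer libstdc++ than the one shipped in the 2.19.0 container, but please correct me if that's wrong.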
Thanks!
riva-start.log (51.0 KB)