Riva 2.0 ASR not working

Please provide the following information when requesting support.

Hardware - GPU: Quadro RTX 8000
Hardware - CPU: Intel(R) Xeon(R) Gold 5218R CPU @ 2.10GHz
Operating System - Ubuntu 18.04.5 LTS
Riva Version - Riva 2.0
How to reproduce the issue? (This is for errors. Please share the command and the detailed log here)
Hi, I managed to run 'bash riva_start.sh' and start the server, but the ASR service does not seem to be working properly.
Checking the docker logs, my riva-server is missing the "ASR Server connected to Triton Inference Server at 0.0.0.0:8001" line that the documentation says should appear.
Is anyone else facing this issue?
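For reference, this is how I am checking for that line (a minimal check on my side, assuming the container is named riva-speech, which I believe is the default used by riva_start.sh):

# Search the server logs for the Triton / ASR connection messages
docker logs riva-speech 2>&1 | grep -i "Triton"
docker logs riva-speech 2>&1 | grep -i "ASR Server connected"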
Here are my docker logs and config.sh file for reference.
Docker logs:

==========================
=== Riva Speech Skills ===

NVIDIA Release 22.03 (build 35451687)
Riva Speech Server Version 2.0.0

Copyright (c) 2018-2022, NVIDIA CORPORATION & AFFILIATES. All rights reserved.

Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.

This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:

NOTE: CUDA Forward Compatibility mode ENABLED.
Using CUDA 11.6 driver version 510.39.01 with kernel driver version 470.57.02.
See CUDA Compatibility :: NVIDIA Data Center GPU Driver Documentation for details.

NOTE: The SHMEM allocation limit is set to the default of 64MB. This may be
insufficient for Riva Speech Server. NVIDIA recommends the use of the following flags:
docker run --gpus all --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 …

Riva waiting for Triton server to load all models…retrying in 1 second
I0412 02:25:00.611239 117 onnxruntime.cc:2319] TRITONBACKEND_Initialize: onnxruntime
I0412 02:25:00.611332 117 onnxruntime.cc:2329] Triton TRITONBACKEND API version: 1.8
I0412 02:25:00.611341 117 onnxruntime.cc:2335] ‘onnxruntime’ TRITONBACKEND API version: 1.8
I0412 02:25:00.611349 117 onnxruntime.cc:2365] backend configuration:
{}
I0412 02:25:00.869161 117 pinned_memory_manager.cc:240] Pinned memory pool is created at ‘0x7f85ea000000’ with size 268435456
I0412 02:25:00.869614 117 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 1000000000
I0412 02:25:00.873958 117 model_repository_manager.cc:994] loading: new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming:1
I0412 02:25:00.974597 117 model_repository_manager.cc:994] loading: new_citrinet-1024-english-asr-streaming-feature-extractor-streaming:1
I0412 02:25:01.034181 117 ctc-decoder-library.cc:20] TRITONBACKEND_ModelInitialize: new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming (version 1)
W:parameter_parser.cc:118: Parameter forerunner_start_offset_ms could not be set from parameters
W:parameter_parser.cc:119: Default value will be used
W:parameter_parser.cc:118: Parameter forerunner_start_offset_ms could not be set from parameters
W:parameter_parser.cc:119: Default value will be used
W:parameter_parser.cc:118: Parameter max_num_slots could not be set from parameters
W:parameter_parser.cc:119: Default value will be used
I0412 02:25:01.035982 117 backend_model.cc:255] model configuration:
{
“name”: “new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming”,
“platform”: “”,
“backend”: “riva_asr_decoder”,
“version_policy”: {
“latest”: {
“num_versions”: 1
}
},
“max_batch_size”: 2048,
“input”: [
{
“name”: “CLASS_LOGITS”,
“data_type”: “TYPE_FP32”,
“format”: “FORMAT_NONE”,
“dims”: [
-1,
1025
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
},
{
“name”: “END_FLAG”,
“data_type”: “TYPE_UINT32”,
“format”: “FORMAT_NONE”,
“dims”: [
1
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
},
{
“name”: “SEGMENTS_START_END”,
“data_type”: “TYPE_INT32”,
“format”: “FORMAT_NONE”,
“dims”: [
-1,
2
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
},
{
“name”: “CUSTOM_CONFIGURATION”,
“data_type”: “TYPE_STRING”,
“format”: “FORMAT_NONE”,
“dims”: [
-1,
2
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
}
],
“output”: [
{
“name”: “FINAL_TRANSCRIPTS”,
“data_type”: “TYPE_STRING”,
“dims”: [
-1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “FINAL_TRANSCRIPTS_SCORE”,
“data_type”: “TYPE_FP32”,
“dims”: [
-1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “FINAL_WORDS_START_END”,
“data_type”: “TYPE_INT32”,
“dims”: [
-1,
2
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “PARTIAL_TRANSCRIPTS”,
“data_type”: “TYPE_STRING”,
“dims”: [
-1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “PARTIAL_TRANSCRIPTS_STABILITY”,
“data_type”: “TYPE_FP32”,
“dims”: [
-1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “PARTIAL_WORDS_START_END”,
“data_type”: “TYPE_INT32”,
“dims”: [
-1,
2
],
“label_filename”: “”,
“is_shape_tensor”: false
}
],
“batch_input”: ,
“batch_output”: ,
“optimization”: {
“priority”: “PRIORITY_DEFAULT”,
“cuda”: {
“graphs”: false,
“busy_wait_events”: false,
“graph_spec”: ,
“output_copy_stream”: true
},
“input_pinned_memory”: {
“enable”: true
},
“output_pinned_memory”: {
“enable”: true
},
“gather_kernel_buffer_threshold”: 0,
“eager_batching”: false
},
“sequence_batching”: {
“oldest”: {
“max_candidate_sequences”: 2048,
“preferred_batch_size”: [
32,
64
],
“max_queue_delay_microseconds”: 1000
},
“max_sequence_idle_microseconds”: 60000000,
“control_input”: [
{
“name”: “START”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_START”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “READY”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_READY”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “END”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_END”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “CORRID”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_CORRID”,
“int32_false_true”: ,
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_UINT64”
}
]
}
],
“state”:
},
“instance_group”: [
{
“name”: “new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming_0”,
“kind”: “KIND_CPU”,
“count”: 1,
“gpus”: ,
“secondary_devices”: ,
“profile”: ,
“passive”: false,
“host_policy”: “”
}
],
“default_model_filename”: “”,
“cc_model_filenames”: {},
“metric_tags”: {},
“parameters”: {
“use_vad”: {
“string_value”: “True”
},
“lm_weight”: {
“string_value”: “0.2”
},
“blank_token”: {
“string_value”: “#”
},
“vocab_file”: {
“string_value”: “/data/models/new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming/1/riva_decoder_vocabulary.txt”
},
“ms_per_timestep”: {
“string_value”: “80”
},
“use_subword”: {
“string_value”: “True”
},
“streaming”: {
“string_value”: “True”
},
“beam_size”: {
“string_value”: “16”
},
“right_padding_size”: {
“string_value”: “1.6”
},
“beam_size_token”: {
“string_value”: “16”
},
“sil_token”: {
“string_value”: “▁”
},
“num_tokenization”: {
“string_value”: “1”
},
“beam_threshold”: {
“string_value”: “20.0”
},
“language_model_file”: {
“string_value”: “/data/models/new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming/1/kenlm_bpe.model”
},
“tokenizer_model”: {
“string_value”: “/data/models/new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming/1/eddec70471334da59f9bd3e3bba4d0cb_tokenizer.model”
},
“max_execution_batch_size”: {
“string_value”: “1024”
},
“forerunner_use_lm”: {
“string_value”: “true”
},
“forerunner_beam_size_token”: {
“string_value”: “8”
},
“forerunner_beam_threshold”: {
“string_value”: “10.0”
},
“asr_model_delay”: {
“string_value”: “-1”
},
“decoder_num_worker_threads”: {
“string_value”: “-1”
},
“word_insertion_score”: {
“string_value”: “0.2”
},
“left_padding_size”: {
“string_value”: “1.6”
},
“decoder_type”: {
“string_value”: “flashlight”
},
“forerunner_beam_size”: {
“string_value”: “8”
},
“max_supported_transcripts”: {
“string_value”: “1”
},
“chunk_size”: {
“string_value”: “0.8”
},
“lexicon_file”: {
“string_value”: “/data/models/new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming/1/lexicon.txt”
},
“smearing_mode”: {
“string_value”: “max”
}
},
“model_warmup”: ,
“model_transaction_policy”: {
“decoupled”: false
}
}
I0412 02:25:01.036334 117 ctc-decoder-library.cc:23] TRITONBACKEND_ModelInstanceInitialize: new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming_0 (device 0)
I0412 02:25:01.075280 117 model_repository_manager.cc:994] loading: new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming:1
I0412 02:25:01.081108 117 model_repository_manager.cc:1149] successfully loaded ‘new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming’ version 1
I0412 02:25:01.081550 117 feature-extractor.cc:402] TRITONBACKEND_ModelInitialize: new_citrinet-1024-english-asr-streaming-feature-extractor-streaming (version 1)
I0412 02:25:01.102795 117 backend_model.cc:255] model configuration:
{
“name”: “new_citrinet-1024-english-asr-streaming-feature-extractor-streaming”,
“platform”: “”,
“backend”: “riva_asr_features”,
“version_policy”: {
“latest”: {
“num_versions”: 1
}
},
“max_batch_size”: 2048,
“input”: [
{
“name”: “AUDIO_SIGNAL”,
“data_type”: “TYPE_FP32”,
“format”: “FORMAT_NONE”,
“dims”: [
-1
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
},
{
“name”: “SAMPLE_RATE”,
“data_type”: “TYPE_UINT32”,
“format”: “FORMAT_NONE”,
“dims”: [
1
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
}
],
“output”: [
{
“name”: “AUDIO_FEATURES”,
“data_type”: “TYPE_FP32”,
“dims”: [
80,
-1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “AUDIO_PROCESSED”,
“data_type”: “TYPE_FP32”,
“dims”: [
1
],
“label_filename”: “”,
“is_shape_tensor”: false
},
{
“name”: “AUDIO_FEATURES_LENGTH”,
“data_type”: “TYPE_INT64”,
“dims”: [
1
],
“label_filename”: “”,
“is_shape_tensor”: false
}
],
“batch_input”: ,
“batch_output”: ,
“optimization”: {
“priority”: “PRIORITY_DEFAULT”,
“cuda”: {
“graphs”: false,
“busy_wait_events”: false,
“graph_spec”: ,
“output_copy_stream”: true
},
“input_pinned_memory”: {
“enable”: true
},
“output_pinned_memory”: {
“enable”: true
},
“gather_kernel_buffer_threshold”: 0,
“eager_batching”: false
},
“sequence_batching”: {
“oldest”: {
“max_candidate_sequences”: 2048,
“preferred_batch_size”: [
256,
512
],
“max_queue_delay_microseconds”: 1000
},
“max_sequence_idle_microseconds”: 60000000,
“control_input”: [
{
“name”: “START”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_START”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “READY”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_READY”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “END”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_END”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “CORRID”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_CORRID”,
“int32_false_true”: ,
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_UINT64”
}
]
}
],
“state”:
},
“instance_group”: [
{
“name”: “new_citrinet-1024-english-asr-streaming-feature-extractor-streaming_0”,
“kind”: “KIND_GPU”,
“count”: 1,
“gpus”: [
0
],
“secondary_devices”: ,
“profile”: ,
“passive”: false,
“host_policy”: “”
}
],
“default_model_filename”: “”,
“cc_model_filenames”: {},
“metric_tags”: {},
“parameters”: {
“right_padding_size”: {
“string_value”: “1.6”
},
“gain”: {
“string_value”: “1.0”
},
“use_utterance_norm_params”: {
“string_value”: “False”
},
“precalc_norm_time_steps”: {
“string_value”: “0”
},
“precalc_norm_params”: {
“string_value”: “False”
},
“dither”: {
“string_value”: “1e-05”
},
“norm_per_feature”: {
“string_value”: “True”
},
“mean”: {
“string_value”: “-11.4412, -9.9334, -9.1292, -9.0365, -9.2804, -9.5643, -9.7342, -9.6925, -9.6333, -9.2808, -9.1887, -9.1422, -9.1397, -9.2028, -9.2749, -9.4776, -9.9185, -10.1557, -10.3800, -10.5067, -10.3190, -10.4728, -10.5529, -10.6402, -10.6440, -10.5113, -10.7395, -10.7870, -10.6074, -10.5033, -10.8278, -10.6384, -10.8481, -10.6875, -10.5454, -10.4747, -10.5165, -10.4930, -10.3413, -10.3472, -10.3735, -10.6830, -10.8813, -10.6338, -10.3856, -10.7727, -10.8957, -10.8068, -10.7373, -10.6108, -10.3405, -10.2889, -10.3922, -10.4946, -10.3367, -10.4164, -10.9949, -10.7196, -10.3971, -10.1734, -9.9257, -9.6557, -9.1761, -9.6653, -9.7876, -9.7230, -9.7792, -9.7056, -9.2702, -9.4650, -9.2755, -9.1369, -9.1174, -8.9197, -8.5394, -8.2614, -8.1353, -8.1422, -8.3430, -8.6655”
},
“stddev”: {
“string_value”: “2.2668, 3.1642, 3.7079, 3.7642, 3.5349, 3.5901, 3.7640, 3.8424, 4.0145, 4.1475, 4.0457, 3.9048, 3.7709, 3.6117, 3.3188, 3.1489, 3.0615, 3.0362, 2.9929, 3.0500, 3.0341, 3.0484, 3.0103, 2.9474, 2.9128, 2.8669, 2.8332, 2.9411, 3.0378, 3.0712, 3.0190, 2.9992, 3.0124, 3.0024, 3.0275, 3.0870, 3.0656, 3.0142, 3.0493, 3.1373, 3.1135, 3.0675, 2.8828, 2.7018, 2.6296, 2.8826, 2.9325, 2.9288, 2.9271, 2.9890, 3.0137, 2.9855, 3.0839, 2.9319, 2.3512, 2.3795, 2.6191, 2.7555, 2.9326, 2.9931, 3.1543, 3.0855, 2.6820, 3.0566, 3.1272, 3.1663, 3.1836, 3.0018, 2.9089, 3.1727, 3.1626, 3.1086, 2.9804, 3.1107, 3.2998, 3.3697, 3.3716, 3.2487, 3.1597, 3.1181”
},
“chunk_size”: {
“string_value”: “0.8”
},
“max_execution_batch_size”: {
“string_value”: “1024”
},
“sample_rate”: {
“string_value”: “16000”
},
“window_stride”: {
“string_value”: “0.01”
},
“window_size”: {
“string_value”: “0.025”
},
“num_features”: {
“string_value”: “80”
},
“streaming”: {
“string_value”: “True”
},
“left_padding_size”: {
“string_value”: “1.6”
},
“transpose”: {
“string_value”: “False”
},
“stddev_floor”: {
“string_value”: “1e-05”
}
},
“model_warmup”: ,
“model_transaction_policy”: {
“decoupled”: false
}
}
I0412 02:25:01.103110 117 feature-extractor.cc:404] TRITONBACKEND_ModelInstanceInitialize: new_citrinet-1024-english-asr-streaming-feature-extractor-streaming_0 (device 0)
I0412 02:25:01.175706 117 model_repository_manager.cc:994] loading: riva-onnx-new_citrinet-1024-english-asr-streaming-am-streaming:1
Riva waiting for Triton server to load all models…retrying in 1 second
I0412 02:25:01.659412 117 model_repository_manager.cc:1149] successfully loaded ‘new_citrinet-1024-english-asr-streaming-feature-extractor-streaming’ version 1
I0412 02:25:01.660359 117 vad_library.cc:18] TRITONBACKEND_ModelInitialize: new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming (version 1)
W:parameter_parser.cc:118: Parameter max_execution_batch_size could not be set from parameters
W:parameter_parser.cc:119: Default value will be used
W:parameter_parser.cc:118: Parameter max_execution_batch_size could not be set from parameters
W:parameter_parser.cc:119: Default value will be used
I0412 02:25:01.664076 117 backend_model.cc:255] model configuration:
{
“name”: “new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming”,
“platform”: “”,
“backend”: “riva_asr_vad”,
“version_policy”: {
“latest”: {
“num_versions”: 1
}
},
“max_batch_size”: 2048,
“input”: [
{
“name”: “CLASS_LOGITS”,
“data_type”: “TYPE_FP32”,
“format”: “FORMAT_NONE”,
“dims”: [
-1,
1025
],
“is_shape_tensor”: false,
“allow_ragged_batch”: false,
“optional”: false
}
],
“output”: [
{
“name”: “SEGMENTS_START_END”,
“data_type”: “TYPE_INT32”,
“dims”: [
-1,
2
],
“label_filename”: “”,
“is_shape_tensor”: false
}
],
“batch_input”: ,
“batch_output”: ,
“optimization”: {
“priority”: “PRIORITY_DEFAULT”,
“cuda”: {
“graphs”: false,
“busy_wait_events”: false,
“graph_spec”: ,
“output_copy_stream”: true
},
“input_pinned_memory”: {
“enable”: true
},
“output_pinned_memory”: {
“enable”: true
},
“gather_kernel_buffer_threshold”: 0,
“eager_batching”: false
},
“sequence_batching”: {
“max_sequence_idle_microseconds”: 60000000,
“control_input”: [
{
“name”: “START”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_START”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
},
{
“name”: “READY”,
“control”: [
{
“kind”: “CONTROL_SEQUENCE_READY”,
“int32_false_true”: [
0,
1
],
“fp32_false_true”: ,
“bool_false_true”: ,
“data_type”: “TYPE_INVALID”
}
]
}
],
“state”:
},
“instance_group”: [
{
“name”: “new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming_0”,
“kind”: “KIND_CPU”,
“count”: 1,
“gpus”: ,
“secondary_devices”: ,
“profile”: ,
“passive”: false,
“host_policy”: “”
}
],
“default_model_filename”: “”,
“cc_model_filenames”: {},
“metric_tags”: {},
“parameters”: {
“streaming”: {
“string_value”: “True”
},
“use_subword”: {
“string_value”: “True”
},
“residue_blanks_at_end”: {
“string_value”: “0”
},
“vad_stop_history”: {
“string_value”: “800”
},
“vad_start_history”: {
“string_value”: “300”
},
“chunk_size”: {
“string_value”: “0.8”
},
“vad_start_th”: {
“string_value”: “0.2”
},
“vad_stop_th”: {
“string_value”: “0.98”
},
“vad_type”: {
“string_value”: “ctc-vad”
},
“vocab_file”: {
“string_value”: “/data/models/new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming/1/riva_decoder_vocabulary.txt”
},
“residue_blanks_at_start”: {
“string_value”: “-2”
},
“ms_per_timestep”: {
“string_value”: “80”
}
},
“model_warmup”: ,
“model_transaction_policy”: {
“decoupled”: false
}
}
I0412 02:25:01.664226 117 onnxruntime.cc:2400] TRITONBACKEND_ModelInitialize: riva-onnx-new_citrinet-1024-english-asr-streaming-am-streaming (version 1)
I0412 02:25:01.665811 117 vad_library.cc:21] TRITONBACKEND_ModelInstanceInitialize: new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming_0 (device 0)
I0412 02:25:01.797288 117 onnxruntime.cc:2443] TRITONBACKEND_ModelInstanceInitialize: riva-onnx-new_citrinet-1024-english-asr-streaming-am-streaming_0 (GPU device 0)
I0412 02:25:01.798408 117 model_repository_manager.cc:1149] successfully loaded ‘new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming’ version 1
I0412 02:25:02.592415 117 model_repository_manager.cc:1149] successfully loaded ‘riva-onnx-new_citrinet-1024-english-asr-streaming-am-streaming’ version 1
I0412 02:25:02.593883 117 model_repository_manager.cc:994] loading: new_citrinet-1024-english-asr-streaming:1
Riva waiting for Triton server to load all models…retrying in 1 second
I0412 02:25:02.694849 117 model_repository_manager.cc:1149] successfully loaded ‘new_citrinet-1024-english-asr-streaming’ version 1
I0412 02:25:02.695078 117 server.cc:522]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I0412 02:25:02.695233 117 server.cc:549]
+-------------------+-----------------------------------------------------------------------------+--------+
| Backend | Path | Config |
+-------------------+-----------------------------------------------------------------------------+--------+
| onnxruntime | /opt/tritonserver/backends/onnxruntime/libtriton_onnxruntime.so | {} |
| riva_asr_features | /opt/tritonserver/backends/riva_asr_features/libtriton_riva_asr_features.so | {} |
| riva_asr_decoder | /opt/tritonserver/backends/riva_asr_decoder/libtriton_riva_asr_decoder.so | {} |
| riva_asr_vad | /opt/tritonserver/backends/riva_asr_vad/libtriton_riva_asr_vad.so | {} |
+-------------------+-----------------------------------------------------------------------------+--------+

I0412 02:25:02.695399 117 server.cc:592]
+-------------------------------------------------------------------------------+---------+--------+
| Model | Version | Status |
+-------------------------------------------------------------------------------+---------+--------+
| new_citrinet-1024-english-asr-streaming | 1 | READY |
| new_citrinet-1024-english-asr-streaming-ctc-decoder-cpu-streaming | 1 | READY |
| new_citrinet-1024-english-asr-streaming-feature-extractor-streaming | 1 | READY |
| new_citrinet-1024-english-asr-streaming-voice-activity-detector-ctc-streaming | 1 | READY |
| riva-onnx-new_citrinet-1024-english-asr-streaming-am-streaming | 1 | READY |
+-------------------------------------------------------------------------------+---------+--------+

I0412 02:25:02.767614 117 metrics.cc:623] Collecting metrics for GPU 0: Quadro RTX 8000
I0412 02:25:02.768009 117 tritonserver.cc:1932]
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Option | Value |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| server_id | triton |
| server_version | 2.19.0 |
| server_extensions | classification sequence model_repository model_repository(unload_dependents) schedule_policy model_configuration system_shared_memory cuda_shared_memory binary_tensor_data statistics trace |
| model_repository_path[0] | /data/models |
| model_control_mode | MODE_NONE |
| strict_model_config | 1 |
| rate_limit | OFF |
| pinned_memory_pool_byte_size | 268435456 |
| cuda_memory_pool_byte_size{0} | 1000000000 |
| response_cache_byte_size | 0 |
| min_supported_compute_capability | 6.0 |
| strict_readiness | 1 |
| exit_timeout | 30 |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I0412 02:25:02.770265 117 grpc_server.cc:4375] Started GRPCInferenceService at 0.0.0.0:8001
I0412 02:25:02.770500 117 http_server.cc:3075] Started HTTPService at 0.0.0.0:8000
I0412 02:25:02.813146 117 http_server.cc:178] Started Metrics Service at 0.0.0.0:8002

Triton server is ready…
I0412 02:25:03.700126 231 riva_server.cc:118] Using Insecure Server Credentials
I0412 02:25:03.711129 231 model_registry.cc:112] Successfully registered: new_citrinet-1024-english-asr-streaming for ASR
W0412 02:25:03.747020 231 grpc_riva_asr.cc:188] new_citrinet-1024-english-asr-streaming has no configured wfst normalizer model
I0412 02:25:03.787667 231 riva_server.cc:158] Riva Conversational AI Server listening on 0.0.0.0:50051
W0412 02:25:03.787698 231 stats_reporter.cc:40] No API key provided. Stats reporting disabled.

Config.sh file:
config.sh (10.4 KB)

Hi @200857g,

Thanks for your interest in Riva, and thanks for sharing the log and config.

There has been a slight change in the docker log output compared to what is mentioned in the documentation (thanks for pointing this out, we will update the doc).
The log you have shared shows the ASR models loaded successfully, so the service should work fine.
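If you want to double-check from the client side, a quick sanity test is to stream a sample file through the ASR service (a sketch based on the quickstart defaults; the sample audio path is an assumption, so point it at any 16 kHz mono .wav you have):

# Start the Riva client container (this opens an interactive shell inside it)
bash riva_start_client.sh

# Then, from inside that shell, send audio to the streaming ASR service
riva_streaming_asr_client --audio_file=/opt/riva/wav/en-US_sample.wav

If a transcript comes back, the ASR pipeline is being served correctly.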

I have an issue: I get "Error: Model is not available on server".
What can I do?
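In case it helps to narrow this down, this is how I have been checking which models the server actually loaded (assuming the default riva-speech container name from the quickstart scripts):

# Models the Riva server registered for each service
docker logs riva-speech 2>&1 | grep "Successfully registered"

# Models Triton reports as READY
docker logs riva-speech 2>&1 | grep "READY"

As far as I understand, the model the client requests has to be one of the registered models.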