Error after inference

Please provide the following information when requesting support.

Hardware - GPU (A100/A30/T4/V100): RTX 3090
Hardware - CPU: 11th Gen Intel Core i9
Operating System: Windows 11 + WSL2
Riva Version: 2.8.0
TLT Version (if relevant)
How to reproduce the issue? (This is for errors. Please share the command and the detailed log here)

Hi, I ran the inference code below, but the Docker container shuts down soon after inference completes.

import io
import time

import riva.client

start_time = time.time()

# Connect to the local Riva server
auth = riva.client.Auth(uri='localhost:50051')
riva_asr = riva.client.ASRService(auth)

# Read the audio file into memory
path = "interview-base-audio.wav"
with io.open(path, 'rb') as fh:
    content = fh.read()

config = riva.client.RecognitionConfig()
# config.encoding = riva.client.AudioEncoding.LINEAR_PCM  # Audio encoding can be detected from the WAV header
# config.sample_rate_hertz = 0                            # Sample rate can be detected from the WAV header and resampled if needed
config.language_code = "ko-KR"              # Language code of the audio clip
config.max_alternatives = 1                 # How many top-N hypotheses to return
config.enable_automatic_punctuation = True  # Add punctuation when end of VAD is detected
config.audio_channel_count = 1

# Run offline (batch) recognition and print the best transcript
response = riva_asr.offline_recognize(content, config)
asr_best_transcript = response.results[0].alternatives[0].transcript
print("ASR Transcript:", asr_best_transcript)

end_time = time.time()

print("\n\nFull Response Message:")
print(response)

elapsed_time = end_time - start_time
print('Execution time:', elapsed_time, 'seconds')

After running the code, it returns the transcript correctly, but the Docker container is then always stopped with an error.
The error log from Docker is attached below.
BTW, I am running Riva on WSL2 and tested the code on Windows 11.
log.txt (63.7 KB)

Hi @sigmoidx

Thanks for your interest in Riva

Apologies, we do not support running Riva in Docker under WSL2 on Windows; the Riva server requires Linux x86_64.

Reference
https://docs.nvidia.com/deeplearning/riva/user-guide/docs/support-matrix.html

Thanks

OK, the inference itself ran fine and produced the correct output response; only afterwards did the Docker container stop. A minor code change might be enough to fix this. I will test it again on Ubuntu 22.04. Thanks.
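
In case it helps anyone else, here is a minimal sketch of the kind of minor client-side change I had in mind: tearing down the client's gRPC channel explicitly once the request is done, instead of relying on interpreter shutdown. This is only a sketch; it assumes riva.client.Auth exposes its underlying channel as auth.channel (I have not verified the attribute name against the 2.8.0 client), and the container shutdown may well be server-side and unrelated.

import riva.client

# Assumption: riva.client.Auth keeps the gRPC channel in `auth.channel`.
auth = riva.client.Auth(uri='localhost:50051')
riva_asr = riva.client.ASRService(auth)

try:
    with open("interview-base-audio.wav", 'rb') as fh:
        content = fh.read()

    config = riva.client.RecognitionConfig(
        language_code="ko-KR",
        max_alternatives=1,
        enable_automatic_punctuation=True,
        audio_channel_count=1,
    )
    response = riva_asr.offline_recognize(content, config)
    print("ASR Transcript:", response.results[0].alternatives[0].transcript)
finally:
    # Close the client channel explicitly so the connection to the server
    # is shut down cleanly when the script exits.
    auth.channel.close()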