Triton server cannot both store logs in a file and stream them to the console

I see that Triton server recently (23.01) added an option to store logs in a file. Currently, Triton server can either store logs in a log file or stream them to the console, but I have a requirement to do both: store the logs in a file and stream them to the console at the same time. Please let me know if there is an option to do this. If it is not available currently, please let me know whether there are plans to address it in a future release.

Triton server does not support both storing the log in a file and streaming the log to the console.
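Until this is supported natively, a common shell-level workaround is to pipe the server's combined output through `tee`, which copies the stream to the console while also appending it to a file. This is a sketch, not an official Triton feature: the model-repository path and log-file path below are placeholders, and `2>&1` assumes the logs you want are going to stdout/stderr (the default when `--log-file` is not set).

```shell
# Workaround: duplicate Triton's console logs into a file with tee.
# /models and /var/log/triton.log are placeholder paths.
tritonserver --model-repository=/models 2>&1 | tee -a /var/log/triton.log
```

Note that `tee` adds no timestamps or rotation of its own; for long-running deployments, wrapping this in a process supervisor or using `logrotate` on the output file may be worthwhile.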

Environment

Triton Server Version: 23.01
GPU Type:
Nvidia Driver Version:
CUDA Version:
CUDNN Version:
Operating System + Version:
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (GitHub repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi,
We recommend raising this query in the issues section of the Triton Inference Server GitHub repository.

Thanks!