Triton server: dynamic config.pbtxt config file generation for a model

I'm using the Docker image triton-server-20.02. Is there a way to auto-generate the config.pbtxt file for any new model registered with the Triton server?

• DeepStream Version 6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.01
ubuntu@ip-172-31-11-102:~$ nvidia-smi
Fri Apr 1 04:03:32 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 510.47.03    Driver Version: 510.47.03    CUDA Version: 11.6     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            On   | 00000000:00:1E.0 Off |                    0 |
| N/A   28C    P0    25W /  70W |  13531MiB / 15360MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      6150      C   tritonserver                    13445MiB |
+-----------------------------------------------------------------------------+

What do you mean by auto-generating config.pbtxt?
You can see from the folder samples/triton_model_repo/ that there is a config.pbtxt under each model folder, written for that model.
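
For reference, a minimal config.pbtxt names the model, the backend/platform, the maximum batch size, and the input/output tensors. The sketch below is illustrative only; the model name, tensor names, data types, and shapes are assumptions and must match your actual model:

name: "yolov4_onnx"           # hypothetical model folder name
platform: "onnxruntime_onnx"  # backend used to run the model
max_batch_size: 8
input [
  {
    name: "input"             # assumed input tensor name
    data_type: TYPE_FP32
    dims: [ 3, 416, 416 ]
  }
]
output [
  {
    name: "output"            # assumed output tensor name
    data_type: TYPE_FP32
    dims: [ -1, 85 ]
  }
]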

Hi @h9945394143,
The DeepStream Triton docker does not support generating the config file automatically.
But with triton-server you can use the option "--strict-model-config=false" to have the configuration generated automatically for supported model types, e.g.

docker run --gpus 1 --rm --shm-size=1g --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -p 8000:8000 -p 8001:8001 -p 8002:8002 -v $(pwd)/triton-yolov4-onnx-deploy/models:/models nvcr.io/nvidia/tritonserver:22.02-py3 tritonserver --model-repository=/models --strict-model-config=false --grpc-infer-allocation-pool-size=16 --log-verbose=1
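
Once the server is up with --strict-model-config=false, the configuration Triton derived can be inspected through the HTTP model-configuration endpoint and used as a starting point for a hand-written config.pbtxt. A sketch, assuming a model named yolov4_onnx and the default HTTP port 8000 mapped as above:

# query the auto-generated model configuration (model name is an assumption)
curl localhost:8000/v2/models/yolov4_onnx/config

# optionally pretty-print and save it as a reference for writing config.pbtxt
curl -s localhost:8000/v2/models/yolov4_onnx/config | python3 -m json.tool > yolov4_onnx_config.json

Note that the endpoint returns JSON rather than protobuf text format, so it serves as a reference for the field values rather than a drop-in config.pbtxt.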

BTW, we should have a triton-server forum. :)
