I’m working with DeepStream and the nvinferserver plugin, integrating with Triton Inference Server. I’m using a custom config.pbtxt file for my model configuration, similar to the example below:
name: "pose_classification_tao"
platform: "tensorrt_plan"
max_batch_size: 16
input [
{
name: "input"
data_type: TYPE_FP32
dims: [ 3, 300, 34, 1 ]
}
]
output [
{
name: "fc_pred"
data_type: TYPE_FP32
dims: [ 6 ]
label_filename: "labels.txt"
}
]
dynamic_batching { }
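For context, here is roughly how I point the plugin at this model on the nvinferserver side. This is a trimmed sketch based on the sample configs shipped with DeepStream as I understand them; the model-repository path, threshold, and GIE IDs are placeholders from my own setup, not values I’m certain are correct:

```
infer_config {
  unique_id: 2
  gpu_ids: [0]
  max_batch_size: 16
  backend {
    triton {
      model_name: "pose_classification_tao"
      version: -1
      model_repo {
        root: "/path/to/triton_model_repo"   # placeholder path
        strict_model_config: true
      }
    }
  }
  postprocess {
    labelfile_path: "labels.txt"
    classification {
      threshold: 0.5                         # illustrative value
    }
  }
}
input_control {
  process_mode: PROCESS_MODE_CLIP_OBJECTS    # running as a secondary classifier
  operate_on_gie_id: 1
}
```

Even for this small setup I pieced the fields together from several different sample files, which is part of why I’m asking for a single reference.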
My question is:
Is there an official and complete template or documentation available for the config.pbtxt format used with the nvinferserver plugin? Specifically, I’m looking for a reference similar to the DeepStream documentation for GStreamer plugin properties, such as this page:
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinferserver.html#gst-properties
I’m hoping to find:
- All supported parameters in the config.pbtxt file
- Accepted values and data types
- Clear descriptions of how each parameter influences model execution or plugin behavior
- Any DeepStream-specific extensions, constraints, or best practices
Current examples and guides often cover only partial use cases. A full reference or schema would be very helpful for customizing configurations and troubleshooting integration with complex models.
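As a concrete example of the gap, fields like the following appear in various Triton sample configs, but I haven’t found one place that states which of them Gst-nvinferserver honors, what values they accept, and what their defaults are. The values below are only illustrative guesses, not settings I’m recommending:

```
instance_group [
  {
    count: 1
    kind: KIND_GPU
  }
]
dynamic_batching {
  preferred_batch_size: [ 4, 8 ]
  max_queue_delay_microseconds: 100
}
```

A full schema, or a pointer to the authoritative one, would cover cases like these along with any DeepStream-specific behavior around them.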