I’m just trying to run a basic POC of a custom backend based on the examples supplied with TRTIS.
Specifically, I’m trying to run the param backend: I’ve placed several prints into it to verify it’s actually called, and I’ve placed a .param model into the /qa/L0_custom_backend/models/param folder.
The output I get when running trtserver with the path to the model folder is as follows:
I0531 10:02:09.038232 16713 server.cc:112] Initializing TensorRT Inference Server
I0531 10:02:09.260641 16713 server_status.cc:55] New status tracking for model 'param'
I0531 10:02:09.260754 16713 model_repository_manager.cc:675] loading: param:1
I0531 10:02:09.260897 16713 model_repository_manager.cc:829] successfully loaded 'param' version 1
Starting endpoints, 'inference:0' listening on
I0531 10:02:09.261790 16713 grpc_server.cc:1971] Started GRPCService at 0.0.0.0:8001
None of my prints appear.
What am I doing wrong?