DeepStream 5.0 nvinferserver: how to use upstream tensor meta as a model input

Thank you for your answer, but this is not the solution I'm looking for.

I had high hopes when the DeepStream 5.0 Developer Preview announced integration of the Triton Inference Server into DeepStream, because it makes DeepStream so much more flexible.

But it seems the nvinferserver element falls short of that expectation, since it only supports detection, classification, and segmentation. There are many types of machine learning models that don't fit into these three categories.
I still hope to get feedback from NVIDIA.
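For reference, this is the part of the nvinferserver config I mean. A minimal sketch, based on my reading of the DS 5.0 sample configs (model name and paths are placeholders); as far as I can tell, the postprocess block only offers those three network types:

```
infer_config {
  unique_id: 1
  backend {
    trt_is {
      model_name: "my_model"            # placeholder name
      version: -1
      model_repo { root: "/path/to/model_repo" }   # placeholder path
    }
  }
  postprocess {
    # Only one of: detection { ... }, classification { ... }, segmentation { ... }
    classification { threshold: 0.5 }
  }
}
output_control {
  output_tensor_meta: true   # attaches raw output tensors as user meta downstream
}
```

On the output side, `output_tensor_meta` at least exposes raw tensors; it is the input side, consuming an upstream tensor meta instead of video frames, that I cannot find an equivalent for.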