• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 5.0 GA
• JetPack Version (valid for Jetson only): 4.4
• TensorRT Version: 7.1
I trained a face recognition model with TensorFlow and converted it to TensorFlow-TensorRT (TF-TRT). I want to add this model to a DeepStream pipeline:
1- Is it possible to load the converted model with the Triton Inference Server plugin (nvinferserver)?
2- If so, the model's final output is a 128-d tensor. Is it possible to pass this tensor output to the next element of the GStreamer pipeline?
3- Is there another way to use a custom TensorFlow model in a DeepStream pipeline?
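For reference, here is a rough sketch of the nvinferserver config I have in mind for questions 1 and 2. This is only a guess at the right shape, based on the DeepStream 5.0 Triton samples; the model name, repository path, and preprocessing values are placeholders for my setup, not known-good values:

```
# Hypothetical nvinferserver config (pbtxt) for a TF-TRT face recognition model.
# "facenet_tftrt" and "/opt/models" are placeholders for my model and repo path.
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    trt_is {
      model_name: "facenet_tftrt"
      version: -1
      model_repo {
        root: "/opt/models"
        log_level: 2
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    # normalization values are placeholders; depends on how the model was trained
    normalize {
      scale_factor: 0.0078125
    }
  }
  # no built-in parser for a 128-d embedding, so leave postprocessing generic
  postprocess {
    other {}
  }
}
output_control {
  # attach the raw output tensor as metadata so a downstream
  # element or pad probe can read the 128-d embedding (question 2)
  output_tensor_meta: true
}
```

My understanding is that with `output_tensor_meta: true` the raw output tensor should be attached to the frame metadata, where a downstream pad probe could read it. Please correct me if this is not the intended mechanism.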