• Hardware Platform (Jetson / GPU) T4
• DeepStream Version 6.2
1. My Triton server output is a string, and when DeepStream's nvinferserver calls my Triton server it reports that kString is not supported. How can I resolve this?
2. The C++ output read via (float*) NvDsInferLayerInfo.buffer[i] does not match the Python output np.array(result, dtype=np.float32). How can I resolve it?
# Run detection and pack the result as a single UTF-8 string tensor
result = self.detect_frame(img, parameters)
result_np = np.array([str(result).encode("utf-8")], dtype=np.object_)
out_tensor_0 = pb_utils.Tensor(self.output_names, result_np)
inference_response = pb_utils.InferenceResponse(output_tensors=[out_tensor_0])
Above is the Triton server code. Can I add the relevant kString handling to NvDsInferDataType to achieve this function?
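Both problems above share one cause: the backend serializes the result as a string (a Triton BYTES/TYPE_STRING tensor), while the DeepStream side reads the raw buffer as float*. A minimal sketch of the difference, using plain NumPy (the result values and the json.dumps serialization here are illustrative assumptions, standing in for whatever self.detect_frame returns):

```python
import json
import numpy as np

# Hypothetical detection result, standing in for self.detect_frame's output.
result = [0.1, 0.25, 0.5, 0.75]

# Path 1 (the original code): encode the result as a UTF-8 string tensor.
# On the wire this becomes a Triton BYTES/TYPE_STRING tensor, which is
# the tensor type nvinferserver rejects with "kString is not supported".
as_string = np.array([json.dumps(result).encode("utf-8")], dtype=np.object_)

# Path 2: emit the raw values as a float32 tensor instead.
as_float32 = np.array(result, dtype=np.float32)

# Reinterpreting the float32 tensor's raw bytes recovers the values,
# which is what reading the buffer as (float*) effectively does on the
# C++ side. Doing the same to the string tensor would yield garbage.
recovered = np.frombuffer(as_float32.tobytes(), dtype=np.float32)
print(recovered)
```

So one workaround that avoids patching nvinferserver is to return a float32 tensor (path 2) rather than a serialized string.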
Yes, the nvinferserver plugin is open source; you can give it a try.
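If you instead switch the Python backend to a float32 output (avoiding the string tensor entirely), the model's config.pbtxt output entry would change accordingly. A sketch, assuming an output tensor named OUTPUT0 with a variable-length 1-D shape (both names are placeholders, not taken from the thread):

```
output [
  {
    name: "OUTPUT0"
    data_type: TYPE_FP32
    dims: [ -1 ]
  }
]
```

With TYPE_FP32 declared here, the buffer that nvinferserver hands to NvDsInferLayerInfo can be read as float* directly.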
There has been no update from you for a while, so we are assuming this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Could you share the model via the forum's private email? Thanks!
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.