I’m new to DeepStream and would like to know the best way to accommodate that. In particular, I don’t know how to recompile an app or what I would need to change where. Could you help me?
Ok, I found a satisfactory solution for me using deepstream_python_apps (1.1.6): I modified the deepstream-test1-usb example to read from a URI and adjusted some GStreamer settings (e.g. setting decoder.set_property('mjpeg', 1), among others).
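For anyone landing here later, a rough sketch of what the modified pipeline looks like, written as a gst-launch-style description string. The nvv4l2decoder `mjpeg` property matches what I set above; the other element names and properties (souphttpsrc, jpegparse, nvstreammux dimensions, the config file path) are illustrative assumptions from the DeepStream samples, so verify them against your DeepStream version:

```python
# Sketch only: compose a gst-launch-style pipeline description for reading an
# MJPEG stream from a URI instead of a USB camera. Element/property names
# besides nvv4l2decoder's mjpeg flag are assumptions from the sample apps.

def build_pipeline_description(uri: str, width: int = 1280, height: int = 720) -> str:
    """Source -> JPEG parse -> HW decoder (MJPEG mode) -> mux -> infer -> OSD."""
    return (
        f"souphttpsrc location={uri} ! jpegparse ! "
        "nvv4l2decoder mjpeg=1 ! "  # mjpeg=1 enables MJPEG decode on the HW decoder
        f"m.sink_0 nvstreammux name=m batch-size=1 width={width} height={height} ! "
        "nvinfer config-file-path=config_infer.txt ! "
        "nvvideoconvert ! nvdsosd ! nveglglessink"
    )

print(build_pipeline_description("http://camera.local/stream.mjpg"))
```

In the Python app you would build the same chain with Gst.ElementFactory.make and set the properties individually, as the test1-usb example does.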
By distance I mean a distance value coming from a custom neural network (monocular depth estimation, but per bounding box). I already have the network, and I think I know how to adapt the pt/onnx model to get an engine that outputs distances. It’s just that I don’t know how to implement a “custom” tracker. I would be happy with a basic tracker that just passes the distance values along so I can map them appropriately.
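To make concrete what I mean by a "basic tracker that passes the distance along", here is a minimal pure-Python sketch: greedy IoU matching that assigns stable track IDs across frames while carrying an arbitrary per-detection attribute (the distance) untouched. This is illustrative only, not DeepStream's tracker API; DeepStream's built-in trackers (e.g. NvDCF) are far more robust:

```python
# Minimal pass-through tracker sketch: greedy IoU matching with stable IDs,
# carrying a per-detection distance value alongside each box. Not the
# DeepStream tracker API -- just an illustration of the idea.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

class PassThroughTracker:
    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}   # track_id -> last seen box
        self.next_id = 0

    def update(self, detections):
        """detections: list of (box, distance). Returns (track_id, box, distance)."""
        results, unmatched = [], dict(self.tracks)
        for box, distance in detections:
            # Greedily match each detection to the best overlapping live track.
            best_id, best_iou = None, self.iou_threshold
            for tid, prev in unmatched.items():
                score = iou(box, prev)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:
                best_id = self.next_id      # no match: start a new track
                self.next_id += 1
            else:
                del unmatched[best_id]
            self.tracks[best_id] = box
            results.append((best_id, box, distance))  # distance passes through
        return results
```

In a DeepStream pipeline the equivalent would be attaching the distance as user metadata on each NvDsObjectMeta in a pad probe, so the tracker downstream keeps it associated with the object.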
The problem with an RTSP sink is that (1) the web does not play well with RTSP and (2) I would like perfectly synced bbox data for each frame (ideally an MJPEG stream where each frame comes with a bbox package). I want to do custom rendering of the bounding boxes in the frontend. Could you point me to what I should do in that case? Should I rather
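One way to get the per-frame sync I'm after is to put the bbox JSON directly into each multipart chunk of the MJPEG stream. A sketch, assuming a made-up `X-BBoxes` header (a browser's native `<img>` MJPEG handling will not expose part headers, so the frontend would need to fetch and parse the multipart stream itself):

```python
import json

BOUNDARY = b"--frameboundary"

def frame_chunk(jpeg_bytes: bytes, frame_num: int, boxes: list) -> bytes:
    """One multipart/x-mixed-replace part: a JPEG payload plus a custom
    X-BBoxes header (hypothetical name) carrying that frame's boxes as JSON,
    so image and metadata stay in lockstep per frame."""
    meta = json.dumps({"frame": frame_num, "boxes": boxes}).encode()
    return (
        BOUNDARY + b"\r\n"
        + b"Content-Type: image/jpeg\r\n"
        + b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n"
        + b"X-BBoxes: " + meta + b"\r\n\r\n"
        + jpeg_bytes + b"\r\n"
    )
```

An alternative with the same sync guarantee is a WebSocket sending one binary message per frame (JPEG plus a JSON prefix), which is easier to consume from JavaScript.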
Ok thanks, I’ll give that a try. And also the appsink through a socket. Could you maybe point me to a resource on how I can fetch the bbox data to ingest into a sink?
You need to receive the message on your client and then match the source id with the stream. Please refer to our Guide to learn the format of the message.
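A minimal sketch of that matching step on the client side. The field names here ("sensorId", "objects") follow the general shape of the DeepStream message schema but are assumptions; check the schema section of the Guide for the exact payload layout your msgconv configuration produces:

```python
import json

def route_message(payload: str, streams: dict):
    """Parse one broker message and hand its objects to the handler
    registered for that source. "sensorId" and "objects" are assumed
    field names -- verify against the DeepStream message schema docs."""
    msg = json.loads(payload)
    handler = streams.get(msg.get("sensorId"))
    if handler is None:
        return None  # message for a stream we are not displaying
    return handler(msg.get("objects", []))
```

On the frontend this routing would live in the WebSocket/broker message callback, with one handler (e.g. one canvas overlay) per displayed stream.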
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks