Hello,
I use the following pipeline (GStreamer & DeepStream) to convert my MJPEG camera input into RGB: "pipeline": "gst-launch-1.0 -v v4l2src device=/dev/video0 ! avdec_mjpeg ! videoconvert ! video/x-raw,format=RGB,height=480,framerate=30/1 ! appsink name=acquired_image" (note: gst-launch syntax does not allow spaces around the = in property assignments, so it is name=acquired_image, not name = acquired_image).
Unfortunately the processing lags, causing about 0.5 s of latency between the real world and the Isaac Sight view.
Any idea how I could make it real-time?
I am not too familiar with GStreamer, so I am not sure whether the lag comes from there or from Sight:
If you send too many images to Sight, it can create some lag. One way to improve performance is to reduce either the rendering resolution or the framerate. The isaac::viewers::ImageViewer component lets you change both: by default it renders the image at full resolution (reduce_scale == 1, which you can change to 2) and targets a framerate of 30 Hz (target_fps == 30).
I would suggest you adjust those parameters to see whether the issue comes from Sight or from GStreamer.
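As a minimal sketch, the app-config entry for those two parameters might look like the following. The node name "viewer" is hypothetical here; substitute the actual name of the node holding your ImageViewer component. reduce_scale and target_fps are the parameters mentioned above, set to half resolution and 15 Hz as an example:

```json
{
  "config": {
    "viewer": {
      "ImageViewer": {
        "reduce_scale": 2,
        "target_fps": 15.0
      }
    }
  }
}
```

If the latency drops noticeably after lowering these, the bottleneck is most likely on the Sight rendering side rather than in the camera pipeline.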
If the issue is from GStreamer, I am not sure what can be done to improve performance, but I will ask someone more knowledgeable.
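That said, one commonly suggested GStreamer-side tweak (an assumption on my part, not something I have verified with Isaac) is to stop the appsink from queueing buffers it cannot consume in time: appsink supports sync, max-buffers, and drop properties, so late frames are discarded instead of accumulating latency. Your pipeline string would then become something like:

```
"pipeline": "gst-launch-1.0 -v v4l2src device=/dev/video0 ! avdec_mjpeg ! videoconvert ! video/x-raw,format=RGB,height=480,framerate=30/1 ! appsink name=acquired_image sync=false max-buffers=1 drop=true"
```

With sync=false the sink does not wait on buffer timestamps, and max-buffers=1 with drop=true keeps only the newest frame, which trades occasional dropped frames for lower end-to-end latency.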