I use the following pipeline (GStreamer & DeepStream) to convert my MJPEG camera input into RGB:
"pipeline": "gst-launch-1.0 -v v4l2src device=/dev/video0 ! avdec_mjpeg ! videoconvert ! video/x-raw,format=RGB,height=480,framerate=30/1 ! appsink name=acquired_image"
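For reference, here is a lower-latency variant I have been experimenting with. It is only a sketch based on the standard GStreamer element properties: `sync=false` on the appsink stops it from waiting on buffer timestamps, `max-buffers=1 drop=true` discards stale frames instead of queueing them, and a leaky `queue` decouples capture from decode. The caps and device path are carried over from my pipeline above.

```shell
gst-launch-1.0 -v v4l2src device=/dev/video0 \
  ! queue leaky=downstream max-size-buffers=1 \
  ! avdec_mjpeg \
  ! videoconvert \
  ! video/x-raw,format=RGB,height=480,framerate=30/1 \
  ! appsink name=acquired_image sync=false max-buffers=1 drop=true
```

I have not confirmed whether dropping frames this aggressively is acceptable for the downstream Isaac codelet, so treat it as a starting point rather than a fix.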
Unfortunately the pipeline lags, causing about 0.5 s of latency between the real world and the Isaac Sight view.
Any ideas on how I could make it real-time?