I was able to stream it with:

gst-launch-1.0 aravissrc ! video/x-raw, width=960, height=720, framerate=10/1 ! videoconvert ! xvimagesink
My question is:
It seems like gstCamera uses GStreamer to stream video and then runs inference on it (I could be wrong, but…). I wanted to see if there is a way to point jetson.utils.gstCamera at the command above.
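For context, a hedged sketch of why pointing gstCamera at an arbitrary pipeline fails out of the box: the stock code maps the device string to either a CSI or a V4L2 source and rejects anything else. The function and element names below are illustrative, not the actual jetson-utils code:

```python
# Hypothetical mirror of gstCamera's source selection (illustrative only;
# the real jetson-utils logic lives in C++ and may use nvcamerasrc or
# nvarguscamerasrc depending on the JetPack version).
def select_source(device: str) -> str:
    """Map a device string to the GStreamer source element gstCamera would pick."""
    if device.startswith("/dev/video"):
        return "v4l2src device=" + device   # USB camera path
    if device == "" or device.isdigit():
        return "nvarguscamerasrc"           # CSI camera path
    # Anything else (e.g. an aravissrc pipeline) is rejected by the stock code.
    raise ValueError("unsupported device: " + device)

print(select_source("/dev/video0"))  # v4l2src device=/dev/video0
```

This is why a GigE camera exposed through aravissrc needs a patch: its device string matches neither branch.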
Reading these made me think it might be possible, but please let me know if I am way off -
Thanks for sharing the post. I read that one as well, but maybe I didn’t understand it 100%.
So, I can actually do what [barzanhayati] did as well: I can stream with gst-launch-1.0 from the terminal. What I want is to replace the source within detectnet-camera.py so I can use this specific camera, which I don't think was discussed in the post you shared. Or did I miss that part?
jetson-utils expects either a CSI camera (NVCAMERA) or a USB camera (V4L2SRC); if the provided source is anything else, it fails. My dirty patch makes it fall back to trying a generic GStreamer pipeline, instead of failing, when neither a CSI nor a USB camera source is found. That pipeline would have to end with:
video/x-raw, format=BGRx, width=640, height=480
which jetson-utils then converts to RGB.
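To make the caps requirement concrete, here is a small sketch of composing a generic pipeline that terminates in the BGRx caps jetson-utils expects. The elements before the caps (aravissrc, videoconvert) are assumptions taken from the gst-launch command earlier in this thread:

```python
# Build a generic pipeline string ending in the caps jetson-utils requires.
# Element choice before the caps is an assumption based on this thread.
width, height = 640, 480
caps = f"video/x-raw, format=BGRx, width={width}, height={height}"
pipeline = f"aravissrc ! videoconvert ! {caps}"
print(pipeline)
# aravissrc ! videoconvert ! video/x-raw, format=BGRx, width=640, height=480
```

The important part is only the tail: whatever sits upstream, the final caps must be BGRx at the resolution you will pass to gstCamera, so the RGB conversion inside jetson-utils works.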
So, for your case:
1. Apply the patch to an unmodified version of jetson-utils.
2. Build (make) and install (sudo make install) jetson-utils.
3. Then, from your Python code (e.g. detectnet-camera.py), use a custom pipeline such as:
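A hedged guess at what that custom pipeline could look like, combining the aravissrc command from the first post with the required BGRx caps. Whether the patched gstCamera accepts the pipeline string as its third constructor argument is an assumption from this thread, not something I have verified against the patch:

```python
# Hypothetical usage with the patched jetson-utils build.
pipeline = ("aravissrc ! video/x-raw, width=960, height=720, framerate=10/1 "
            "! videoconvert ! video/x-raw, format=BGRx, width=640, height=480")

# Assumed constructor signature for the patched build (unverified):
# import jetson.utils
# camera = jetson.utils.gstCamera(640, 480, pipeline)

print(pipeline)
```

If the patch wires the fallback differently (e.g. reads the pipeline from a flag or config), adjust accordingly; the key constraint is that the string ends in the BGRx caps at the resolution passed to gstCamera.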