You may try one of these:
1. Install v4l2loopback (check out v0.10.0).
Then use gstreamer to read from rtspsrc, decode, and feed your virtual camera (assuming here it is created as /dev/video1):
gst-launch-1.0 rtspsrc location=rtsp://192.168.0.12:8080/video/h264 latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1
Then access it as a regular V4L2 camera.
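For example, the loopback device can be opened with the jetson.utils camera API like any physical webcam (a minimal sketch; the /dev/video1 device and the 1280x720 size are assumptions and should match what the gst-launch pipeline above is pushing into v4l2sink):
import jetson.utils
# Open the v4l2loopback device like any other V4L2 camera.
# /dev/video1 and 1280x720 are assumptions; match them to the
# resolution fed by the gst-launch pipeline above.
camera = jetson.utils.gstCamera(1280, 720, "/dev/video1")
# CaptureRGBA returns a CUDA image plus its dimensions
img, width, height = camera.CaptureRGBA()
print("captured {:d}x{:d} frame from the virtual camera".format(width, height))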
or
2. [EDIT June2020: @dusty_nv has recently added support for other video sources such as RTSP, so this dirty patch is obsolete.]
Modify jetson-utils. @Dusty_nv mentioned two PRs; you may have a look at those.
Alternatively, I did a quick and dirty patch for utils/camera (attached) that tries the passed string as a user gst pipeline instead of failing when it does not match the expected camera format.
Then it is possible to use:
camera = jetson.utils.gstCamera(640, 480, "rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx, width=640, height=480 ")
or
camera = jetson.utils.gstCamera(640, 480, "filesrc location=/opt/nvidia/deepstream/deepstream-4.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx, width=640, height=480 ")
Side note: if your IP camera has a high resolution/framerate, you might have to increase the kernel socket receive buffer max size on the receiver side (Jetson):
sudo sysctl -w net.core.rmem_max=26214400
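If you want to verify the current value from Python before starting the capture, the limit is exposed under /proc (a small optional sketch, not from the original patch):
# Read the current receive buffer limit and warn if it is below the value suggested above.
with open("/proc/sys/net/core/rmem_max") as f:
    rmem_max = int(f.read().strip())
if rmem_max < 26214400:
    print("net.core.rmem_max is only {:d}; consider raising it as shown above".format(rmem_max))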