I am building a Jetson-based system which will take input from a camera (via nvarguscamerasrc) and feed it to multiple outputs:
- stream via RTSP server
- record to local video and/or still
- go into OpenCV for processing
These tasks can start and stop independently, so it seems I can't use a tee to split them all inside a single pipeline.
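For context, the monolithic tee pipeline I'd otherwise use looks roughly like this (branches abbreviated with fakesink; the caps are placeholder assumptions):

```shell
# Single pipeline with tee: frames stay in NVMM end-to-end, but every
# branch lives and dies with the pipeline, so the RTSP/recording/OpenCV
# branches cannot start and stop independently.
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! \
  tee name=t \
  t. ! queue ! fakesink \
  t. ! queue ! fakesink \
  t. ! queue ! fakesink
```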
Right now I use intervideosink and intervideosrc to connect multiple pipelines. This works, but it is very inefficient because intervideosink/intervideosrc don't support NVMM memory, so every frame has to be copied out to system memory and back.
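A simplified sketch of my current setup (the channel name "cam0", resolution, and encoder settings are arbitrary examples):

```shell
# Producer pipeline: capture, then hand frames off via intervideosink.
# nvvidconv copies each frame out of NVMM into system memory, because
# intervideosink cannot accept memory:NVMM buffers.
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! \
  nvvidconv ! 'video/x-raw,format=I420' ! \
  intervideosink channel=cam0

# Consumer pipeline (here, the recorder), started and stopped
# independently of the producer. It has to copy frames back into NVMM
# before the hardware encoder can use them.
gst-launch-1.0 intervideosrc channel=cam0 ! \
  nvvidconv ! 'video/x-raw(memory:NVMM)' ! \
  nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4
```

So each consumer incurs two NVMM↔system-memory copies per frame, which is the overhead I'd like to eliminate.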
Is there a way to accomplish this while keeping the data in NVMM memory for as long as possible?
Some thoughts are:
- Is there an NVIDIA port of intervideosink/src that works on NVMM memory?
- Can a single camera be accessed simultaneously in different pipelines using nvarguscamerasrc?
- Is there some other approach that isn’t obvious?