Executing two inference video streams

I am running a Python program (shown below) on a Jetson Nano 2GB SOM, with Camera #1 (Raspberry Pi camera) designated as csi://0 and Camera #2 (C920 webcam) designated as /dev/video1.

When I execute the code, the webcam video stream works correctly; however, the Pi Cam video flashes at a fast rate, and although the stream is present it is extremely dark.

Can you recommend a more elegant way of combining two video inference streams, so that both streams are displayed simultaneously and effectively within one display screen?

Please note: CPU utilization is CPU1 36%, CPU2 57%, CPU3 51%, CPU4 34%, and allocated memory is 1.8 GB out of 1.9 GB (90% used).

The code is shown below:

from jetson_inference import detectNet
from jetson_utils import videoSource, videoOutput

net = detectNet("ssd-mobilenet-v2", threshold=0.5)

camera = videoSource("csi://0")       # Pi Cam
camera1 = videoSource("/dev/video1")  # webcam

display = videoOutput("display://0")  # or "my_video.mp4" for file output

while display.IsStreaming():
    img = camera.Capture()    # Pi Cam
    img1 = camera1.Capture()  # webcam

    if img is None or img1 is None:  # capture timeout
        continue

    detections = net.Detect(img)    # Pi Cam
    detections1 = net.Detect(img1)  # webcam

    display.Render(img)
    display.Render(img1)
    display.SetStatus("Pi and Web Cam Object Detection | Network {:.0f} FPS".format(net.GetNetworkFPS()))
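One way to show both streams in a single window is to composite the two frames onto one canvas before rendering, instead of calling Render() twice on the same output. This is a sketch, not tested on your setup: it assumes jetson_utils provides cudaAllocMapped() and cudaOverlay() (it does in recent jetson-inference releases), and the mosaic_size() helper is a name I made up for the layout math. The Jetson-specific imports are deferred into main() so the helper stays importable on machines without the Jetson libraries.

```python
def mosaic_size(w1, h1, w2, h2):
    """Size of a side-by-side canvas large enough for both frames."""
    return w1 + w2, max(h1, h2)


def main():
    # Deferred imports: these only resolve on a Jetson with jetson-inference installed.
    from jetson_inference import detectNet
    from jetson_utils import videoSource, videoOutput, cudaAllocMapped, cudaOverlay

    net = detectNet("ssd-mobilenet-v2", threshold=0.5)
    picam = videoSource("csi://0")        # Pi Cam
    webcam = videoSource("/dev/video1")   # C920 webcam
    display = videoOutput("display://0")
    mosaic = None  # allocated lazily once the frame sizes are known

    while display.IsStreaming():
        img0 = picam.Capture()
        img1 = webcam.Capture()
        if img0 is None or img1 is None:  # capture timeout
            continue

        net.Detect(img0)  # draws detection overlays in-place
        net.Detect(img1)

        if mosaic is None:
            w, h = mosaic_size(img0.width, img0.height, img1.width, img1.height)
            mosaic = cudaAllocMapped(width=w, height=h, format=img0.format)

        cudaOverlay(img0, mosaic, 0, 0)           # left half: Pi Cam
        cudaOverlay(img1, mosaic, img0.width, 0)  # right half: webcam
        display.Render(mosaic)


if __name__ == "__main__":
    main()
```

Since both frames land on one canvas, there is a single Render() per loop iteration and the window no longer alternates between the two sources.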

If you want an efficient and fully customizable way to do this, you could consider NVIDIA's DeepStream reference application (deepstream-app); see the DeepStream 6.0 release documentation.

By using GStreamer you can combine multiple streams and batch them with the nvstreammux element for more efficient inference; NVIDIA also provides a lot of pretrained models.
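As a rough illustration of what such a pipeline looks like, here is a gst-launch sketch of a two-source batched pipeline; treat it as a configuration fragment, not a ready-to-run command. The element names (nvstreammux, nvinfer, nvmultistreamtiler, nvdsosd) are real DeepStream elements, but the config file path is hypothetical and the exact caps strings vary by JetPack/DeepStream version.

```shell
# Sketch only: requires DeepStream installed on the Jetson.
# config_infer.txt is a placeholder for your nvinfer model configuration.
gst-launch-1.0 \
  nvstreammux name=mux batch-size=2 width=1280 height=720 ! \
  nvinfer config-file-path=config_infer.txt ! \
  nvmultistreamtiler rows=1 columns=2 width=1280 height=360 ! \
  nvdsosd ! nvegltransform ! nveglglessink \
  nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! mux.sink_0 \
  v4l2src device=/dev/video1 ! videoconvert ! nvvideoconvert ! \
    'video/x-raw(memory:NVMM)' ! mux.sink_1
```

Here nvstreammux batches the CSI camera and the V4L2 webcam into one inference call, and nvmultistreamtiler lays the two results out side by side in a single window.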

Regards,
Allan Navarro
Embedded SW Engineer at RidgeRun

Contact us: support@ridgerun.com
Developers wiki: https://developer.ridgerun.com/
Website: www.ridgerun.com

Allan, thanks for the suggestions.

Bob

