Sending captured images to a UDP stream

Hello. I want to make a small change to the detectnet.py file in jetson-inference: the output should be streamed over UDP instead of being displayed.

net = jetson.inference.detectNet(opt.network, sys.argv, opt.threshold)
input = jetson.utils.videoSource(opt.input_URI, argv=sys.argv)
while True:
    img = input.Capture()
    detections = net.Detect(img, overlay=opt.overlay)
    # task: send img to a UDP stream. how to do it?
    if not input.IsStreaming() or not output.IsStreaming():
        break

May I know which Jetson platform and JetPack version you used?

I am using a Jetson Nano. The CUDA version is 10.2, and I think the JetPack version is 4.5.1 (/etc/nv_tegra_release says that REVISION is 5.1).

You may try something like:

import jetson.inference
import jetson.utils

#net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
camera = jetson.utils.videoSource("csi://0", argv=['--input-width=1920', '--input-height=1080', '--input-frameRate=30'])      # '/dev/video0' for V4L2
display = jetson.utils.videoOutput("rtp://127.0.0.1:5000", argv=['--output-width=1920', '--output-height=1080', '--output-frameRate=30', '--output-codec=h264']) 

while display.IsStreaming():
	img = camera.Capture()
	#detections = net.Detect(img)
	display.Render(img)
	#display.SetStatus("Object Detection | Network {:.0f} FPS".format(net.GetNetworkFPS()))

I have tested your solution, but unfortunately it does not work as I expected. I don’t want to display the video stream on the local machine. Also, the object detection still needs to be performed.

What doesn’t work?
If you don’t want the local display, try adding --headless.

You can also refer to this page for more streaming options: https://github.com/dusty-nv/jetson-inference/blob/master/docs/aux-streaming.md
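
For example, the stock detectnet.py can already send its output to an RTP stream instead of a window by giving it an output URI and --headless (the IP address and port below are just placeholders for the machine that should receive the stream):

python3 detectnet.py --network=ssd-mobilenet-v2 --headless csi://0 rtp://192.168.1.100:5000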

As HP indicated, if you add --headless to the argv options, it won’t open a display window.
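
Putting the pieces together, a minimal sketch of the modified loop could look like the following, with detection still running and the frames sent over RTP/UDP instead of being displayed (the address 192.168.1.100:5000 is only an example; point it at your receiving host). Render() on an rtp:// output encodes the frame and sends it over the network, so nothing is drawn on the Nano itself:

import jetson.inference
import jetson.utils

net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
camera = jetson.utils.videoSource("csi://0")                    # or '/dev/video0' for V4L2
output = jetson.utils.videoOutput("rtp://192.168.1.100:5000",   # example receiver address
                                  argv=['--headless', '--output-codec=h264'])

while camera.IsStreaming() and output.IsStreaming():
    img = camera.Capture()
    detections = net.Detect(img, overlay="box,labels,conf")     # detection still runs on each frame
    output.Render(img)                                          # encode + send over RTP instead of displaying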
