Turning the Jetson Nano into an interactive audiovisual instrument - DeepStream <3 OpenSoundControl

DeepStream ❤️ OSC

Hello everyone! I want to share a project I have been working on for the last few months.

I am using the DeepStream SDK on the Jetson Nano as an instrument to sonify and visualize detected objects in real time. My idea for this project was to turn public spaces into interactive, playable places where I can use people or vehicles as input for performances or installations.

I modified two of the Python app examples for DeepStream 5 to add the ability to send OSC (Open Sound Control) messages with detected-object info to any other computer on the local network.

The Jetson Nano broadcasts OSC messages from port 4545. The messages sent are:

/frame_number - Current frame number

/oxywhc - Sent once for each object detected in the frame, with the arguments [ object no, x coordinate, y coordinate, width, height, class name ]

/num_rects - Total number of objects detected in the frame

/num_vehicles - Number of vehicles detected in the frame

/num_people - Number of people detected in the frame
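For reference, an OSC message on the wire is an address pattern, a type-tag string, and big-endian arguments, each padded to a 4-byte boundary. Here is a minimal stdlib-only sketch of how a message like /oxywhc could be packed and sent over UDP — the detection values are made up for illustration, and the actual modified examples use their own sender:

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """NUL-terminate a string and pad it to a 4-byte boundary (OSC 1.0)."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, args: list) -> bytes:
    """Pack an OSC message: address, type-tag string, then the arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, str):
            tags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(tags) + payload

# Hypothetical per-frame detections: [object no, x, y, width, height, class name]
detections = [[0, 412.0, 96.0, 64.0, 128.0, "person"],
              [1, 700.0, 300.0, 220.0, 90.0, "vehicle"]]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# SO_BROADCAST is needed when sending to a LAN broadcast address on the Nano;
# localhost is used here only so the sketch runs anywhere.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
for det in detections:
    sock.sendto(osc_message("/oxywhc", det), ("127.0.0.1", 4545))
sock.sendto(osc_message("/num_rects", [len(detections)]), ("127.0.0.1", 4545))
```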

Any software that accepts OSC input can use this data to control its parameters: sound and visual programming frameworks, video games, emulators, whatever you can imagine. It is also possible to translate OSC to HID or MIDI messages to extend the range of software DeepStream can communicate with.
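On the receiving side, any environment can unpack these messages. A real receiver would normally use an OSC library (such as python-osc), but a minimal stdlib-only parser for the flat int/float/string messages used here illustrates the wire format:

```python
import struct

def read_osc_string(data: bytes, pos: int):
    """Read a NUL-terminated, 4-byte-padded OSC string; return (string, new pos)."""
    end = data.index(b"\x00", pos)
    s = data[pos:end].decode("utf-8")
    pos = end + 1
    pos += (4 - pos % 4) % 4   # skip alignment padding
    return s, pos

def parse_osc_message(data: bytes):
    """Decode one OSC message into (address, args). Handles the i, f, s tags."""
    address, pos = read_osc_string(data, 0)
    tags, pos = read_osc_string(data, pos)
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":
            args.append(struct.unpack_from(">i", data, pos)[0])
            pos += 4
        elif tag == "f":
            args.append(struct.unpack_from(">f", data, pos)[0])
            pos += 4
        elif tag == "s":
            s, pos = read_osc_string(data, pos)
            args.append(s)
    return address, args
```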

So far I have used SuperCollider to make audio, and TouchDesigner for visuals.

In SuperCollider, I assigned control buses to store the incoming data, and normalized the coordinate values to the [0, 1] range. I then took a "live coding" approach, writing simple synth recipes that can be changed on the fly. The control buses carrying the DeepStream data are mapped to various synth and effect parameters, manipulating the controls in real time.
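The SuperCollider patches themselves aren't shown here, but the normalize-then-map idea can be sketched in Python. The 1920x1080 frame size and the cutoff range are assumptions for illustration, not values from the project:

```python
def normalize(value: float, in_max: float) -> float:
    """Scale a pixel coordinate into [0, 1], clamped at the edges."""
    return max(0.0, min(1.0, value / in_max))

def map_range(norm: float, out_min: float, out_max: float) -> float:
    """Map a normalized [0, 1] control value onto a synth parameter range."""
    return out_min + norm * (out_max - out_min)

FRAME_W, FRAME_H = 1920.0, 1080.0   # assumed stream resolution

# e.g. map an object's x coordinate to a hypothetical filter cutoff in Hz
x = 960.0
cutoff = map_range(normalize(x, FRAME_W), 200.0, 8000.0)
```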

DeepStream Loves OSC demo video w/ Found Video Material
DeepStream <3 OSC first tests

In TouchDesigner, I wrote some Python code to generate rectangle masks of the detected objects for display.
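The actual mask generation lives inside the TouchDesigner network, but the underlying idea can be sketched standalone: rasterize each detected (x, y, width, height) box into a binary mask. The grid size and boxes below are made-up values:

```python
def rect_mask(width: int, height: int, boxes):
    """Return a 2D list where cells inside any (x, y, w, h) box are 1, else 0."""
    mask = [[0] * width for _ in range(height)]
    for x, y, w, h in boxes:
        for row in range(max(0, y), min(height, y + h)):
            for col in range(max(0, x), min(width, x + w)):
                mask[row][col] = 1
    return mask

# two hypothetical detections on a small grid
mask = rect_mask(8, 6, [(1, 1, 3, 2), (5, 3, 2, 2)])
```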

I used it as part of the live visuals for my album premiere livestream on Twitch at RGB Dog’s BedroomNightOut event: https://www.youtube.com/watch?v=hL9QXTFQdi4

From this GitHub link you can view the modified example files, the SuperCollider setup and synth example files, and the TouchDesigner .tox file.
