Hello. I am an Aerospace Engineering student and I am working on my senior capstone project this semester. My team is developing a drone-mounted sensor system that will collect river depth data and flow velocity at a cross section for the purpose of monitoring river discharge. We are using the Stereolabs ZED 2 camera to capture the velocity of the river flow, and we are planning on using the Jetson Nano to power and control the camera. We are looking to store the video footage on a microSD card so that we can upload and post-process it later on the ground. I’ve been looking around and I’m unsure how we’re going to do this. Is there a way to have the Jetson Nano save the camera video feed to the microSD card? We will also have a GNSS receiver on board that will need to save data to the microSD card as well. Is it possible to use the Jetson Nano for the needs of the camera as well as the needs of the GNSS receiver?
Hi @jekn0229, I believe the ZED SDK allows you to save the video stream to a compressed file, which you can play back later. You would just save this file to your SD card. You would probably just want to get a fast SD card to alleviate any concerns (e.g. V30 speed class or above).
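If you go the ZED SDK route, the recording call sequence looks roughly like the sketch below, using the SDK's Python API (pyzed). This is untested here since it requires the camera and SDK to be installed; the output path `/media/sdcard/flight.svo` and the 10-second duration are assumptions for illustration.

```python
# Sketch: record the ZED 2 stream to a compressed SVO file on the SD card.
# Requires the ZED SDK's pyzed package and a connected camera.
import time
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
init.camera_resolution = sl.RESOLUTION.HD720
init.camera_fps = 30
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open ZED camera")

# H.264-compressed SVO keeps file sizes manageable on the SD card.
rec = sl.RecordingParameters("/media/sdcard/flight.svo",
                             sl.SVO_COMPRESSION_MODE.H264)
if zed.enable_recording(rec) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not start recording")

runtime = sl.RuntimeParameters()
t_end = time.time() + 10          # record for 10 s as a demo
while time.time() < t_end:
    zed.grab(runtime)             # each successful grab appends a frame

zed.disable_recording()
zed.close()
```

The resulting SVO file can be played back and re-processed later with the ZED SDK tools on a desktop machine, which fits your plan of post-processing on the ground.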
Note that if you weren’t using ZED (which has its own SDK that contains the recording functionality), the typical way to compress/save the video stream to disk would be with GStreamer. For more info, see the L4T Accelerated GStreamer Guide.
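For reference, a GStreamer pipeline for a generic V4L2 camera might look something like this (a sketch only; the device node, caps, bitrate, and output path are assumptions you would adjust for your camera):

```shell
# Capture from a V4L2 camera, encode with the Nano's hardware H.264
# encoder, and mux the result into an MP4 on the SD card.
gst-launch-1.0 -e v4l2src device=/dev/video0 \
  ! 'video/x-raw, width=1280, height=720, framerate=30/1' \
  ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' \
  ! nvv4l2h264enc bitrate=8000000 \
  ! h264parse ! qtmux ! filesink location=/media/sdcard/flight.mp4
```

The `-e` flag makes GStreamer finalize the MP4 cleanly on interrupt, and using the hardware encoder (`nvv4l2h264enc`) keeps CPU load low on the Nano.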
GNSS/GPS data is pretty low-bandwidth, so you should be able to log it simultaneously to a separate file on your SD card.
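GNSS receivers typically stream NMEA 0183 sentences over a serial/UART link, so logging amounts to validating each sentence's checksum and appending it to a file. Here is a minimal sketch (the serial reading itself is omitted; you'd typically read lines with something like pyserial, and the sentence field values below are made up for illustration):

```python
import os
from functools import reduce

def nmea_checksum_ok(sentence: str) -> bool:
    """Validate an NMEA 0183 sentence of the form '$<body>*<hex checksum>'.
    The checksum is the XOR of every character between '$' and '*'."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    body, _, given = sentence[1:].rpartition("*")
    calc = reduce(lambda acc, ch: acc ^ ord(ch), body, 0)
    return f"{calc:02X}" == given.strip().upper()

def log_sentence(sentence: str, path: str) -> None:
    """Append a checksum-valid sentence and fsync so a mid-flight
    power loss doesn't lose buffered data."""
    if not nmea_checksum_ok(sentence):
        return  # silently drop corrupted sentences
    with open(path, "a") as f:
        f.write(sentence + "\n")
        f.flush()
        os.fsync(f.fileno())

# Build a demo GGA sentence with a correct checksum (field values invented).
body = "GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"
sample = f"${body}*{reduce(lambda a, c: a ^ ord(c), body, 0):02X}"
log_sentence(sample, "gnss_log.txt")
```

Since each sentence is under 100 bytes and arrives at a few hertz, this is negligible next to the video bandwidth, so sharing the SD card between the two logs shouldn't be an issue.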
Are you sure about using the ZED 2 camera? It's currently not supported by Isaac SDK. I have an NVIDIA RTX 3080 GPU and a ZED 2 camera.
I have so far tried and failed to even get the sample ZED_Camera app running; it fails miserably and I get a "No GPU Compatible" error.
Isaac SDK 2020.1 ships with an outdated ZED Camera SDK, which is useless for running the ZED 2!
Hi @ishanbhatnagar6434, in the post above I was referring to using the ZED SDK directly. I am not sure of the latest status of ZED/ZED 2 integration in Isaac SDK - I recommend that you post your question to the Isaac forums so our experts there can take a look.
Thanks! I have a few more questions concerning interfacing the camera with the Jetson Nano. I’ve been watching some YouTube videos, but I haven’t found anything on using these exact two products together. I want to make sure that using the Jetson Nano with the Stereolabs ZED 2 camera will be feasible. Does the Jetson Nano require Wi-Fi at all times during operation, or is that just for the initial boot and for programming it? Will an additional fan be required if we are using the computer to initiate video recording and to store that video data? Our team has to be able to justify using the Nano with the ZED 2 camera before we can actually get our hands on the hardware to test it and play around with it.
It only requires an internet connection when you want to install packages, download files, etc. After your device is set up, you can disconnect the network and deploy the device.
Not in typical ambient conditions, but if you are putting the device inside an enclosure or into a hot environment then it may be prudent to add a fan. These 5V PWM-controlled fans from Noctua are popular choices for the Nano.
Great, thanks. And in terms of programming the computer, how do we run programs in flight to control the camera? Our plan is to write scripts to interact with the camera in either C++ or Python. Initially we were planning on using an Arduino or similar microcontroller; however, those don’t interface with the ZED 2 camera, which is why we are now going with the Jetson Nano. I don’t have past experience working with computers like the Nano, but I have some experience working with Arduino. I’m wondering how different programming the Nano to perform tasks is compared to programming an Arduino. Again, thanks for your help.
The Jetsons run Linux (Ubuntu), so you can program them with your programming language of choice like a typical Linux machine - as you mentioned, the most popular options are Python and C++.
Jetson has an integrated NVIDIA GPU, so you can also code in CUDA to write custom kernels that run on the GPU, but there are a lot of libraries/frameworks out there that already integrate CUDA so you don’t have to write it yourself (like TensorFlow, PyTorch, TensorRT, etc.).
Awesome, thank you.
So say I have some programs that I want to run on the Jetson Nano to capture video data using my camera mounted on a drone, but I don’t want to be recording video for the entire flight time. Is there a way I can signal the Jetson to run my program from the drone remote controller? Or do you have any suggestions for signaling the Jetson to run a program while it’s in flight? One idea we had was having a ground computer and SSHing into the Jetson from that, but we’re not certain that we’ll have a network connection available.