[SOLVED] How do I test my IMX219 video feed on the Jetson Nano?

The documentation for the Jetson Nano on the open internet is very sparse. Given that it’s less than a year old, I’m not terribly surprised, so I figure I’ll help get the ball rolling on what appears to be a hot topic that many people want to know, including me.

How the bloody ---- do you get to see the video feed on the IMX219 running on the Jetson Nano? :)

I’ve played with gst-launch-1.0 commands and I don’t understand them. Plus, there doesn’t seem to be much clear documentation on them either. What’s Argus? What’s a video sink? What all goes into a GStreamer pipeline? All I want to do is run a command through (presumably) SSH and have the Nano start streaming to something on my dev machine so I can verify that the camera works, and have a point of reference for when I start learning how to work with camera-based AI.

I tried running nvgstcapture and it starts, but it doesn’t open a feed on my local machine.

Any ideas?

P.S. FWIW, /dev/video0 is in place.

Did you connect a display? Those commands show the camera preview on the display screen.
You may find more information about GStreamer and the NVIDIA camera framework at the links below.
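For reference, this is the general shape of the on-device preview pipeline those docs describe. It’s a sketch, not verified on every JetPack release; `nvarguscamerasrc` and `nvoverlaysink` are the NVIDIA elements documented in the L4T Accelerated GStreamer guide, and the small Python helper below just assembles the command string so each stage is visible:

```python
# Sketch: assemble a typical IMX219 preview command for gst-launch-1.0.
# nvarguscamerasrc - NVIDIA's libargus camera source (drives the CSI sensor)
# caps string      - requests 1280x720@30 NV12 frames in NVMM (GPU) memory
# nvvidconv        - converts/scales frames using the hardware converter
# nvoverlaysink    - renders straight to the display overlay (needs a monitor)

def preview_command(width=1280, height=720, fps=30):
    caps = (f"video/x-raw(NVMM), width={width}, height={height}, "
            f"framerate={fps}/1, format=NV12")
    return (f"gst-launch-1.0 nvarguscamerasrc ! '{caps}' ! "
            "nvvidconv ! nvoverlaysink")

print(preview_command())
```

Run the printed command in a terminal on the Nano itself (with a monitor attached) to see the live preview.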




Thank you. I’ll look through those tomorrow if I get a chance.

When you asked if I connected my display, do you mean on the Nano? My assumption was that I could stream it over Wifi to something listening on my dev machine. If that’s not the case, then will temporarily connecting a mouse, keyboard, and monitor to the Nano have any drawbacks? (Ex. Always launching xserver at startup, thus potentially decreasing system resources even when headless)

It’s better to have HDMI connected to the Nano for the camera preview test.
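That said, streaming to a dev machine over the network is also possible with standard GStreamer RTP elements. The following is a sketch only: `omxh264enc` is the encoder on older JetPack releases (newer ones use `nvv4l2h264enc`), and `DEV_HOST` is a hypothetical placeholder for your dev machine’s IP address.

```python
# Sketch: Nano-side sender and dev-machine-side receiver commands.
DEV_HOST = "192.168.1.50"   # hypothetical address - substitute your own
PORT = 5000

def sender_command(host=DEV_HOST, port=PORT):
    # Runs on the Nano: capture -> H.264 encode -> RTP payload -> UDP out
    return ("gst-launch-1.0 nvarguscamerasrc ! "
            "'video/x-raw(NVMM), width=1280, height=720, framerate=30/1' ! "
            "nvvidconv ! omxh264enc ! rtph264pay ! "
            f"udpsink host={host} port={port}")

def receiver_command(port=PORT):
    # Runs on the dev machine: UDP in -> RTP depayload -> decode -> window
    return (f"gst-launch-1.0 udpsrc port={port} "
            "caps='application/x-rtp, media=video, encoding-name=H264' ! "
            "rtph264depay ! h264parse ! avdec_h264 ! autovideosink")

print(sender_command())
print(receiver_command())
```

Start the receiver on the dev machine first, then the sender on the Nano over SSH; no display on the Nano is needed for this route.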

I tested both commands on the Nano and they both work. The GStreamer one opens in its own window, while the standalone nvgstcapture takes over the desktop. The latter passes input through to whatever is under it, so you can still reach the terminal and hit Ctrl+C to stop it.

I modified the gst-launch command a bit, though, to drop the -e flag, which otherwise causes it to hang on shutdown. (EOS stands for “End of Stream,” just as EOF is “End of File.”) In other words, whatever is supposed to signal that the stream has ended never does, so with the -e flag the pipeline hangs on the IMX219.
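To make the difference concrete: `-e` tells gst-launch-1.0 to send an EOS event on Ctrl+C and wait for the pipeline to drain before exiting, so if the source never forwards the EOS, the process waits forever. A small sketch of the two variants (the pipeline itself is just an illustrative preview pipeline, not necessarily the exact one used above):

```python
# Sketch: the same preview pipeline with and without the -e (EOS-on-shutdown)
# flag. Dropping -e makes Ctrl+C tear the pipeline down immediately instead
# of waiting for an EOS that never arrives.

def capture_command(send_eos_on_interrupt):
    flags = "-e " if send_eos_on_interrupt else ""
    return (f"gst-launch-1.0 {flags}nvarguscamerasrc ! "
            "'video/x-raw(NVMM), width=1280, height=720, framerate=30/1' ! "
            "nvvidconv ! nvoverlaysink")

print(capture_command(False))  # the variant that shuts down cleanly here
```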

As for my concerns about using the desktop environment and then switching back to headless, I don’t see any issues in doing that now, as top shows the Nano is EXTREMELY efficient when no desktop environment is running. For comparison, the desktop environment on my dev machine takes up roughly 50% CPU; I’m on an octa-core processor, so I don’t know if that figure is relative to one core, but I digress. The Nano sits at just over 1% total CPU usage over SSH when running headless.

See the output of the top command on Nano at https://imgur.com/a/EMEwGdV

So, getting back to the camera: the IMX219 appears to use a fisheye lens. That could be useful later, but are there any drawbacks to it? IIRC, it had an option to choose the FOV, and I maxed it out. I figured it would act like how 3D games render a wider view, so I assumed the camera would see a wider area without going fisheye. I’d still prefer the fisheye visual range, as it could come in useful.

The thing that has me more concerned is the fact that the stream is all red and green. I did get the night vision version of the model, which came with two (I think) IR sensory modules that I attached to the sides. Perhaps that’s why it has red on either side. Is there an easy way to correct the color when I start building with it?

For the color issue, you can try changing the badge info.

    modules {
        module0 {
            badge = "imx185_bottom_liimx185";
            /* ... rest of the node ... */
        };
    };

Where do I use that code?

It’s in the device tree.
Check the documentation for how to download the kernel source, then build the dtb to replace it.
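For anyone following along, a rough sketch of that workflow using the device tree compiler (dtc): decompile the active .dtb, edit the badge string, recompile, and put it back. The .dtb file name below is for the Nano A02 carrier board and may differ on your release; on some L4T versions the dtb is read from a flash partition rather than /boot, so check the flashing docs before relying on this.

```shell
# Sketch only - substitute the dtb name for your board/release, and back up
# the original first.

# Decompile the device tree blob into editable source:
dtc -I dtb -O dts -o nano.dts /boot/tegra210-p3448-0000-p3449-0000-a02.dtb

# ...edit the badge string under the modules/module0 node in nano.dts...

# Recompile and copy it back:
dtc -I dts -O dtb -o nano-new.dtb nano.dts
sudo cp nano-new.dtb /boot/tegra210-p3448-0000-p3449-0000-a02.dtb
```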