Use Jetson camera with headless system?

I’m stuck.

I wish to process a continuous series of images from the Jetson camera without a display attached.

With argus, no matter what I try (including every supplied demo), if I don’t have a display attached, I get something like the following error:

(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 75)

Even the yuvJpeg demo requires a display even though it displays nothing.

With GStreamer and appsink I didn’t get as far, because I’m unable to control the exposure time, gain, and digital gain sufficiently for my application.
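For reference, here is roughly the sort of pipeline I was experimenting with, built as a string for gst-launch-1.0 or gst_parse_launch. The nvcamerasrc property names ("auto-exposure", "exposure-time", "wbmode") are assumptions recalled from gst-inspect-1.0 output on L4T 28.2 and may differ on other releases, so treat this as a sketch, not a verified recipe:

```python
# Sketch: a headless capture pipeline description for nvcamerasrc + appsink.
# The property names "auto-exposure", "exposure-time", and "wbmode" are
# assumptions; confirm them with `gst-inspect-1.0 nvcamerasrc` on your board.
def build_pipeline(exposure_s=0.01, wbmode=0):
    """Return a gst-launch-style pipeline description string."""
    return (
        "nvcamerasrc auto-exposure=1 exposure-time={:.4f} wbmode={} ! "
        "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
        "nvvidconv ! video/x-raw,format=I420 ! "
        "appsink name=sink max-buffers=4 drop=true"
    ).format(exposure_s, wbmode)

print(build_pipeline())
```

Even if the property names differ on your release, the shape of the pipeline (NVMM caps into nvvidconv, then a CPU-format appsink) is the part that matters for headless capture.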

With v4l2, I can’t seem to get the images to be correct.
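My current understanding (hedged, since I haven’t verified the exact pixel format) is that direct V4L2 capture bypasses the ISP, so the onboard OV5693 hands you raw 10-bit Bayer data with no demosaicing or white balance, which would explain why the frames look wrong. Assuming the 10-bit values sit in the low bits of 16-bit little-endian words, extracting them would look something like this:

```python
# Sketch: interpreting a raw V4L2 frame from the TX2's onboard sensor.
# Assumption: 10-bit Bayer samples stored in the low bits of 16-bit
# little-endian container words; without the ISP there is no demosaicing
# or white balance, so the result is a single-channel Bayer mosaic.
import struct

def unpack_raw10(frame_bytes):
    """Return a list of 10-bit sensor values from 16-bit containers."""
    n = len(frame_bytes) // 2
    words = struct.unpack('<%dH' % n, frame_bytes[:n * 2])
    return [w & 0x3FF for w in words]  # keep the low 10 bits

# Synthetic two-pixel frame: values 5 and 1023 stored in 16-bit words.
sample = struct.pack('<2H', 5, 1023)
print(unpack_raw10(sample))  # -> [5, 1023]
```

After unpacking you would still need to demosaic and white-balance the mosaic yourself, which is exactly the work the ISP normally does.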

Is there any sample code anywhere for the Jetson TX2 dev kit with the supplied camera and JetPack 3.2 (L4T 28.2) that can get and control an image stream from the camera without a monitor attached to the display?

Surely lots of people are doing this, aren’t they? After all, the typical deep-learning app takes images, processes them, and then does something without bothering to display anything.

You’ll want to add a virtual desktop (basically desktop sharing). For the GPU to work, the video buffer needs to be present…the software won’t care whether a monitor is looking at the buffer or not (the buffer of a virtual server is indistinguishable from one with a monitor attached). I don’t have any recommendations on which particular virtual desktop to use.

The GPU needs a video buffer? Then why do all my CUDA and NPP calls work headless?

Nonetheless, that’s a great hint and I’ll give it a try! Thanks!

Turned out that X11 was running anyway, perhaps due to
Option "AllowEmptyInitialConfiguration" "true"
in /etc/X11/xorg.conf.

The problem seemed to be that nobody was logged in on the display. I now have the user ‘nvidia’ auto-logged-in at boot, and everything works.
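For anyone who hits this later: that option lives in the device section of /etc/X11/xorg.conf on L4T. The sketch below shows roughly where it sits; the Identifier on your board may differ:

```
Section "Device"
    Identifier "Tegra0"
    Driver     "nvidia"
    Option     "AllowEmptyInitialConfiguration" "true"
EndSection
```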

Seems silly to have to do that, but it works now. Yay!

If you log in via ssh with X11 forwarding and run a program, display events are forwarded to the PC you log in from…the calls still exist, but they go to the PC, not the Jetson. It depends on the nature of the program.

The GPU needs memory to work with, and that memory is expected to be formatted a certain way. It just happens to be called video memory because most of the time it is used for video viewing. Ignore the name, though; the GPU must have that memory assigned to it regardless of whether a monitor ever sees it.

The GPU driver is tied to the X11 server ABI, and it is through this that much of the GPU communication is achieved…break the X11 ABI and you break the GPU. I guess you could say that whether or not it works depends on whether the software requires a context…and if you forward the context to another computer, it isn’t getting away without a buffer; it’s just using a different computer’s buffer (which might also be a kind of cheating, since it would then be the PC’s GPU doing the work).

Virtual servers do not forward events; they do the work themselves…copies of the buffers are communicated instead.