Camera access in headless mode

Hello,

I am upgrading my project from a Jetson Nano (JetPack 4.6) to a Jetson Orin (JetPack 6.2). I have a problem launching the camera preview in headless mode through NoMachine. In my setup the Jetson Orin runs without a display; this is a production requirement.
I can access the system remotely over NoMachine, which so far works the same as on the old system.
The problem starts when I try to capture frames using nvarguscamerasrc:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! queue ! nvvidconv ! 'video/x-raw,width=640,height=360,format=(string)BGRx' ! queue ! videoconvert ! 'video/x-raw,format=(string)BGR' ! queue ! fpsdisplaysink

As a result, instead of a window with the camera preview, I get an error:

nvbufsurftransform: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'fps-display-video_sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayGBM\)\ gldisplaygbm0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 77)
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 93)
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:318 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:239 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:201 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:849 (propagating)
Redistribute latency...
Got EOS from element "pipeline0".
Execution ended after 0:00:00.113337607
Setting pipeline to NULL ...
Freeing pipeline ...

Camera access itself does not seem to be the issue, as I get the expected output from v4l2-ctl:

bartosz@ubuntu:~/CSI-Camera$ v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Discrete 3280x2464
			Interval: Discrete 0.048s (21.000 fps)
		Size: Discrete 3280x1848
			Interval: Discrete 0.036s (28.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1640x1232
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.017s (60.000 fps)

I tried to create a dummy display, but that does not seem to work either. It looks similar to this problem, but setting DISPLAY=:0 does not help (DISPLAY needs to be :1001, as that is the remote NoMachine screen).
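
For completeness, this is roughly what I tried before re-running the pipeline above (values are specific to my setup):

echo $DISPLAY          # prints :1001 inside the NoMachine session
export DISPLAY=:0      # also tried :1001 explicitly; the EGL error stays the same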

Trying to capture a frame and save it to a file produces the same error (a sketch of the pipeline is below).
Does anyone have any idea?
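
A sketch of the file-capture attempt I mean (the exact caps are not important, the error is identical):

gst-launch-1.0 nvarguscamerasrc num-buffers=10 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! nvjpegenc ! filesink location=test1.jpg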

Hi,

I’m no longer actively helping here, and I have to say that I haven’t used NoMachine with a Jetson for years, but in case it helps, here are some old thoughts:

  • Check whether the local display is :0 after installing NoMachine or TeamViewer; it may also be :1 (see the sketch after this list).
  • IIRC, through an ssh -X or -Y connection to a Jetson with NoMachine or TeamViewer installed, it was possible to get a remote X display by setting DISPLAY=localhost:10.0.
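
A rough sketch of what I mean by checking the local display (standard X11 checks, nothing Jetson-specific):

ls /tmp/.X11-unix/        # each socket Xn corresponds to a local display :n
who                       # a local graphical session may show its display, e.g. (:0)
export DISPLAY=:0         # then retry the pipeline; try :1 if :0 is not listed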

Your case might be different.
Have fun anyway!

Hi,

What do you mean by local display? The physical display connected to the Jetson? I don’t have one; I have only the Orin Nano with NoMachine connected over USB/Ethernet. The $DISPLAY variable is set to :1001 in the NoMachine session, and that works fine in my other project on a Jetson Nano (JP4.3).

I don’t want to forward X through ssh; I want to see the camera preview inside NoMachine.
P.S. I tried ssh -X anyway and it throws the same error.

I am slowly losing my mind; I found similar topics here and none of them has a solution :/

It looks like the problem is with neither NoMachine nor the displays.
I can correctly display the test video stream:

gst-launch-1.0 videotestsrc ! nveglglessink

The problem lies in the Argus camera stack:

Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:318 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:239 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:201 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:849 (propagating)

I upgraded JetPack to the newest 6.2 version; the problem didn’t go away.

In the nvargus-daemon log I see this:

sudo journalctl -b -u nvargus-daemon:

Mar 07 20:03:23 tegra-ubuntu nvargus-daemon[797]: === gst-launch-1.0[7677]: CameraProvider destroyed (0xffff95312fa0)=== gst-launch-1.0[7677]: Connection closed (FFFF9C0EB840)=== gst-launch-1.0[76>
Mar 07 20:03:23 tegra-ubuntu nvargus-daemon[797]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: ---- imager: No override file found. ----
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: (NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function >
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: (NvCamV4l2) Error ModuleNotPresent:  (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize>
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: (NvOdmDevice) Error ModuleNotPresent:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, fun>
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclStartPlatformDrivers: Failed to start module drivers
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclDriver_V4L2_Focuser_Stub_Close: Invalid NULL input pPclDriver
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclStateControllerOpen: Failed ImagerGUID 1. (error 0xA000E)
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: NvPclOpen: PCL Open Failed. Error: 0xf
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 7>
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 455)
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 382)
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 554)
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: ---- imager: No override file found. ----
Mar 07 20:03:24 tegra-ubuntu nvargus-daemon[797]: === gst-launch-1.0[7747]: CameraProvider initialized (0xffff9534dfd0)(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessio>

I also inspected the nvargus-daemon directly; this is its output:

=== NVIDIA Libargus Camera Service (0.99.33)=== Listening for connections...=== gst-launch-1.0[6571]: Connection established (FFFF8100B840)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent:  (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 111)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclDriver_V4L2_Focuser_Stub_Close: Invalid NULL input pPclDriver
NvPclStateControllerOpen: Failed ImagerGUID 1. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 725)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 455)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 382)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 554)
---- imager: No override file found. ----
=== gst-launch-1.0[6571]: CameraProvider initialized (0xffff7cb71700)X11 connection rejected because of wrong authentication.
X11 connection rejected because of wrong authentication.
X11 connection rejected because of wrong authentication.
X11 connection rejected because of wrong authentication.
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 77)
(Argus) Error NotSupported: Failed to get default display (in src/api/EGLOutputStreamImpl.cpp, function initialize(), line 99)
(Argus) Error NotSupported:  (propagating from src/api/CaptureSessionImpl.cpp, function createEGLOutputStream(), line 989)
(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1106)
(Argus) Error NotSupported:  (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 851)
=== gst-launch-1.0[6571]: CameraProvider destroyed (0xffff7cb71700)=== gst-launch-1.0[6571]: Connection closed (FFFF8100B840)=== gst-launch-1.0[6571]: Connection cleaned up (FFFF8100B840)
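
In case someone wants to reproduce this: the trace above can also be obtained by running the daemon in the foreground (path assuming a default JetPack install):

sudo systemctl stop nvargus-daemon
sudo /usr/sbin/nvargus-daemon      # prints the trace above directly to the terminal
# Ctrl+C when done, then restart the service:
sudo systemctl start nvargus-daemon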

Does anyone know if this part:

(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available 

means that there is still a problem communicating with the camera (a device-tree/DTB problem)?
Is it even possible that the camera information is read correctly, but the full connection somehow can’t be established?
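
In case it helps narrow this down, these are the quick checks I can think of (just a sketch; the device-tree path is the one from the log above):

ls /proc/device-tree/tegra-camera-platform/modules/   # sensor modules exposed by the DTB
v4l2-ctl --list-devices                               # video nodes the kernel created for them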

I just found a topic with exactly the same problem from over a year ago, and it hasn’t been resolved.
There clearly is a problem with running the camera on a Jetson Orin with JP6.2 in headless mode. I don’t really understand the statement that Argus does not work with a virtual display, when it was working in previous JetPacks (I ported the same application to the Jetson Orin from a Jetson Nano, where it worked).
How are we supposed to process the video stream in a headless application then? Isn’t that a primary use case for any edge device?
Can someone from NVIDIA support us here?

I just found out that when I log in over ssh without X11 forwarding, the camera works fine. Does anyone know what the relation between X11 and GStreamer is in that case?

With X11 forwarding:

bartosz@bartosz:~$ ssh -X bartosz@192.168.55.1
bartosz@192.168.55.1's password: 

bartosz@localhost:~$ gst-launch-1.0 nvarguscamerasrc num-buffers=10 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! nvjpegenc ! filesink location=test1.jpg
libEGL warning: DRI3: failed to query the version
libEGL warning: DRI2: failed to authenticate
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 93)
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:318 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:239 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:201 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:849 (propagating)
Redistribute latency...
Got EOS from element "pipeline0".
Execution ended after 0:00:00.106191917
Setting pipeline to NULL ...
Freeing pipeline ...
bartosz@localhost:~$ 

Without X11 forwarding:

bartosz@bartosz:~$ ssh  bartosz@192.168.55.1
bartosz@192.168.55.1's password: 

bartosz@localhost:~$ gst-launch-1.0 nvarguscamerasrc num-buffers=10 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! nvjpegenc ! filesink location=test1.jpg
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21,000000 fps Duration = 47619048 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 3280 x 1848 FR = 28,000001 fps Duration = 35714284 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1920 x 1080 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1640 x 1232 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1280 x 720 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29,999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
NvMMLiteBlockCreate : Block : BlockType = 1 
Redistribute latency...
Got EOS from element "pipeline0".
Execution ended after 0:00:00.646874143
Setting pipeline to NULL ...
GST_ARGUS: Cleaning up
CONSUMER: Done Success
GST_ARGUS: Done Success
Freeing pipeline ...
bartosz@localhost:~$ 
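
So my guess is that the forwarded X display is what breaks the EGL setup. Presumably clearing DISPLAY inside the forwarded session should then behave like the plain ssh login; this is an untested assumption on my side:

unset DISPLAY
gst-launch-1.0 nvarguscamerasrc num-buffers=10 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! nvjpegenc ! filesink location=test1.jpg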

Hi,
This is expected, since the Argus stack depends on the native EGL libraries. A possible solution is to set up RTSP or UDP streaming on the Orin Nano device so that the remote device can receive the stream. You may refer to the reference setup in the FAQ:
Jetson AGX Orin FAQ
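
For example, a minimal UDP sketch; the host address and port are placeholders and the element names are from the stock JetPack GStreamer plugins:

# On the Orin Nano (sender); replace 192.168.55.100/5000 with your receiver's address and port
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2h264enc insert-sps-pps=true ! h264parse ! rtph264pay ! udpsink host=192.168.55.100 port=5000

# On the receiving machine
gst-launch-1.0 udpsrc port=5000 caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink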
