Dear all,
I’m currently working on the integration of the following setup:
- Jetson Orin NX
- Custom carrier board
- IMX 477 sensor camera
It may be worth highlighting that this is a headless setup: no display can be attached to the system. I managed to edit the device tree so that the /dev/video0 node is created, and I’ve been able to get “something” from the camera using the following command:
v4l2-ctl --verbose -d /dev/video0 --set-ctrl bypass_mode=0 --set-ctrl override_enable=0 --stream-count=1 --stream-mmap --stream-to=/home/nvidia/Pictures/test/test_0.raw
The output file actually contains real data from the sensor: after uploading it to one of those online pixel viewers, I get the following picture when I choose a 16-bit format.
That leads me to conclude that the device tree is fine, at least as far as the I2C mux for the camera and the CSI lanes of that specific connector are concerned.
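As a sanity check of what is actually in that .raw file, the kind of result I am ultimately after is a simple demosaic of the buffer. The sketch below is only an offline illustration and relies on assumptions I have not confirmed (RGGB Bayer order, 16-bit little-endian containers, 3840x2160 with no row padding); it bins each 2x2 Bayer cell into one RGB pixel and writes a PPM that can be opened anywhere:

// Offline Bayer sanity check (sketch only). Assumed, not verified:
//   - RGGB Bayer order,
//   - 16-bit little-endian containers, 3840x2160, no row padding.
// Bins each 2x2 Bayer cell into one RGB pixel and writes a binary PPM.
#include <cstdint>
#include <cstdio>
#include <vector>

int main(int argc, char **argv)
{
    const int W = 3840, H = 2160;
    if (argc < 3) {
        std::fprintf(stderr, "usage: %s in.raw out.ppm\n", argv[0]);
        return 1;
    }

    std::vector<uint16_t> raw(static_cast<size_t>(W) * H);
    FILE *in = std::fopen(argv[1], "rb");
    if (!in || std::fread(raw.data(), sizeof(uint16_t), raw.size(), in) != raw.size()) {
        std::fprintf(stderr, "failed to read %s\n", argv[1]);
        return 1;
    }
    std::fclose(in);

    // Normalize against the brightest sample so the preview is visible whether
    // the 10/12-bit data sits in the high or the low bits of the container.
    uint16_t maxv = 1;
    for (uint16_t v : raw)
        if (v > maxv) maxv = v;

    FILE *out = std::fopen(argv[2], "wb");
    if (!out) {
        std::fprintf(stderr, "failed to open %s\n", argv[2]);
        return 1;
    }
    std::fprintf(out, "P6\n%d %d\n255\n", W / 2, H / 2);
    for (int y = 0; y < H; y += 2) {
        for (int x = 0; x < W; x += 2) {
            unsigned r = raw[y * W + x];                                    // R
            unsigned g = (raw[y * W + x + 1] + raw[(y + 1) * W + x]) / 2;   // average of both G
            unsigned b = raw[(y + 1) * W + x + 1];                          // B
            unsigned char rgb[3] = {
                static_cast<unsigned char>(r * 255u / maxv),
                static_cast<unsigned char>(g * 255u / maxv),
                static_cast<unsigned char>(b * 255u / maxv)
            };
            std::fwrite(rgb, 1, 3, out);
        }
    }
    std::fclose(out);
    return 0;
}

What I still need, though, is the equivalent of this running on the Orin itself, with the ISP or CUDA doing the work.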
I’ve explored several ways to demosaic that raw data on the Orin and see the right colors, but none has been successful:

- I tried the following pipeline:
gst-launch-1.0 -e nvarguscamerasrc num-buffers=1 sensor-id=0 ! "video/x-raw(memory:NVMM),width=3840,height=2160,framerate=30/1" ! nvjpegenc ! multifilesink location=test.jpeg
However, it led to the following error output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3840 x 2160 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1920 x 1080 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;
GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode = 0
   Output Stream W = 3840 H = 2160
   seconds to Run = 0
   Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadExecute:694 NvBufSurfaceFromFd Failed.
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:247 (propagating)
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: CANCELLED
Additional debug info:
Argus Error Status
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
Could the

nvbuf_utils: dmabuf_fd -1 mapped entry NOT found

error indicate something related to the absence of a display?
- Of course, I installed the Jetson Multimedia API and checked the samples. When running the 12_v4l2_camera_cuda sample, the error is this one:
INFO: camera_initialize(): (line:276) Camera ouput format: (3840 x 2160) stride: 7680, imagesize: 16588800, frate: 0 / 0
[ERROR] (NvEglRenderer.cpp:98) <renderer0> Error in opening display
[ERROR] (NvEglRenderer.cpp:154) <renderer0> Got ERROR closing display
ERROR: display_initialize(): (line:294) Failed to create EGL renderer
ERROR: init_components(): (line:319) Failed to initialize display
ERROR: main(): (line:721) Failed to initialize v4l2 components
App run failed
And after setting

export DISPLAY=:0

the output becomes:

INFO: camera_initialize(): (line:276) Camera ouput format: (3840 x 2160) stride: 7680, imagesize: 16588800, frate: 0 / 0
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 3840 height 2160
INFO: init_components(): (line:321) Initialize v4l2 components successfully
WARN: request_camera_buff(): (line:359) Camera v4l2 buf length is not expected
ERROR: request_camera_buff(): (line:364) Failed to enqueue buffers: Bad address (14)
ERROR: prepare_buffers(): (line:500) Failed to set up camera buff
ERROR: main(): (line:728) Failed to prepare v4l2 buffs
App run failed
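The "Camera v4l2 buf length is not expected" warning makes me suspect a mismatch between the format the sample requests and what the driver actually negotiates. To compare the two, I'm thinking of a minimal check like the one below; it only uses the standard V4L2 ioctls and my /dev/video0 node, nothing specific to the sample:

// Print the capture format the driver actually reports for /dev/video0,
// to compare against the resolution/size that 12_v4l2_camera_cuda expects.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>
#include <cstring>

int main()
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video0");
        return 1;
    }

    struct v4l2_format fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_G_FMT, &fmt) < 0) {
        perror("VIDIOC_G_FMT");
        close(fd);
        return 1;
    }

    unsigned int pf = fmt.fmt.pix.pixelformat;
    std::printf("%ux%u  fourcc %c%c%c%c  bytesperline %u  sizeimage %u\n",
                fmt.fmt.pix.width, fmt.fmt.pix.height,
                (int)(pf & 0xff), (int)((pf >> 8) & 0xff),
                (int)((pf >> 16) & 0xff), (int)((pf >> 24) & 0xff),
                fmt.fmt.pix.bytesperline, fmt.fmt.pix.sizeimage);
    close(fd);
    return 0;
}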
- I’ve also tried to modify the Argus sample cudaBayerDemosaic. I think I have the Bayer EGLStream and everything downstream of it (for example, the CUDA demosaic stage) properly defined, but it hangs waiting for the Argus producer in this loop:
while (true)
{
    EGLint state = EGL_STREAM_STATE_CONNECTING_KHR;
    if (!eglQueryStreamKHR(m_eglDisplay, m_bayerInputStream, EGL_STREAM_STATE_KHR, &state))
    {
        ORIGINATE_ERROR("Failed to query stream state (possible producer failure).");
    }
    if (state == EGL_STREAM_STATE_NEW_FRAME_AVAILABLE_KHR)
    {
        break;
    }
}
The main code is deeply based on the sample and checks for errors at every operation. It gets as far as submitting the capture requests for the specified number of frames in the following step, with no error reported:
// Submit the batch of capture requests.
for (unsigned int frame = 0; frame < options.frameCount(); ++frame)
{
    Argus::Status status;
    uint32_t result = iCaptureSession->capture(request.get(), TIMEOUT_INFINITE, &status);
    if (result == 0)
    {
        ORIGINATE_ERROR("Failed to submit capture request (status %x)", status);
    }
}
The error comes from a timeout.
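In case the problem is simply the ordering on the consumer side, this is roughly the sequence I understand the sample follows and that I tried to reproduce. It is only a sketch using the CUDA driver EGL interop calls from cudaEGL.h; the helper name, the timeout value, and the assumption that a CUDA context is already current are mine, not the sample's:

#include <cudaEGL.h>        // CUDA driver API EGL interop (cuEGLStream*)
#include <Argus/Argus.h>

// Sketch: the CUDA consumer connects to the Bayer EGLStream *before* captures
// are submitted, then acquires frames. All error handling omitted.
static bool acquireOneBayerFrame(Argus::IEGLOutputStream *iEglStream)
{
    CUeglStreamConnection conn;
    if (cuEGLStreamConsumerConnect(&conn, iEglStream->getEGLStream()) != CUDA_SUCCESS)
        return false;                              // consumer must be connected first

    // (capture requests are submitted elsewhere, after this connect call)

    CUgraphicsResource resource = 0;
    CUstream stream = 0;
    // The timeout value/units here are my guess, not taken from the sample.
    if (cuEGLStreamConsumerAcquireFrame(&conn, &resource, &stream, 1000000) != CUDA_SUCCESS)
        return false;                              // this is where my run times out

    CUeglFrame frame;
    cuGraphicsResourceGetMappedEglFrame(&frame, resource, 0, 0);
    // frame.frame.pPitch[0] (or pArray[0]) would then feed the demosaic kernel.

    cuEGLStreamConsumerReleaseFrame(&conn, resource, &stream);
    cuEGLStreamConsumerDisconnect(&conn);
    return true;
}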
- I’ve also tested the nvargus_nvraw command as follows:
nvargus_nvraw --c 0 --mode 0 --file /home/nvidia/Pictures/test/test_7.jpg --format "raw, jpg" --exp0 "0.000034, 5" --verbosity debug
nvargus_nvraw version 1.14.0
Argus Version: 0.99.3.3 (multi-process)
Number of sensors 1
User provided sensor ID 0, Sensor mode 0
Selected Sensor ID 0 Sensor Mode 0
initialize: Default frame duration (33,333,334 to 500,000,224 ns) FPS (30.00 to 2.00)
initialize: Frame count 90
computeManualControls: Original Manual Exposure Time 0.000034 secs (34,000 ns). Range (0.000013 to 0.683709 secs) (13,000 to 683,709,000 ns)
computeManualControls: Original frame duration range (33,333,334 to 500,000,224 ns)
computeManualControls: Warning: Maximum value of Exposure time 0.683709 secs is more than maximum Frame duration of 0.500000 secs Changing Maximum Exposure time to 0.500000 secs
computeManualControls: Modified exposure time range (13,000 to 500,000,224 ns)
computeManualControls: New frame duration (33,333,334 ns to 33,333,334 ns) FPS (30.00 to 30.00)
computeManualControls: Manual Sensor Gain 5.000000. Range (1.000000 to 22.250000)
initialize: numExposures 1 HDR Ratio Range: min 1.00 max 1.00
getExposureSetCount: i 0 exposureSet[i].size() 11
getExposureSetCount: i 1 exposureSet[i].size() 0
getExposureSetCount: i 2 exposureSet[i].size() 0
getExposureSetCount: i 3 exposureSet[i].size() 0
getExposureSetCount: i 4 exposureSet[i].size() 0
getExposureSetCount: i 5 exposureSet[i].size() 0
getExposureSetCount: i 6 exposureSet[i].size() 0
getExposureSetCount: i 7 exposureSet[i].size() 0
initialize: Embedded data is supported. Size 15360
Number of sensors 1, Number of sensor modes 2
m_selectedCameraDevice 0xaaaae4f0e1e0, m_sensorMode 0xaaaae4f0e2f0, m_iSensorMode 0xaaaae4f0e360
capture: createCaptureSession succes, value 0xaaaae4f0e530
ICaptureSession::createOutputStream(0) success, value 0xaaaae4f0e680
ICaptureSession::createOutputStream(1) success, value 0xaaaae4f6c340
EGLStream::FrameConsumer::create(0) success, value 0xaaaae4f6ce80
EGLStream::FrameConsumer::create(1) success, value 0xaaaae4f6d4f0
createBuffers: m_queue value 0xaaaae4f6dc50
createBuffers: m_iQueue value 0xaaaae4f6e0b0
createCaptureRequest: Maximum burst request count 6
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e170
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e170
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e310
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e310
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e450
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e450
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e5a0
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e5a0
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e6e0
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e6e0
: Embedded data supported. Calling setMetadataEnable
createCaptureRequest: IRequest::enableOutputStream(0) success 0xaaaae4f6e820
createCaptureRequest: IRequest::enableOutputStream(1) success 0xaaaae4f6e820
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
setSensorModeAndManualControls: Setting manual Exposure Time to 34,000 ns
setSensorModeAndManualControls: Setting manual Sensor Gain to 5.00
setSensorModeAndManualControls: Setting frame duration to (33,333,334, 33,333,334 ns)
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
User has not requested a focus position change
User has not requested aperture position change
setSensorModeAndManualControls: setSensorMode(0xaaaae4f0e2f0) success
captureRequest: frame 0, value 0xaaaae4f6e170
captureRequest: frame 0 iQueue->getSize() 1
captureRequest: frame 0 EVENT_TYPE_ERROR
("nvargus_nvraw") Error BadParameter (0x04): No events in queue (in capture_nvraw/src/mobile/ArgusNvRawCapture.cpp, func captureRequest(), line 700)
("nvargus_nvraw") Error BadParameter (0x04): (propagating from capture_nvraw/src/mobile/ArgusNvRawCapture.cpp, func capture(), line 810)
deinitialize:++
capture_deinitialize:++
(Argus) Error Timeout: (propagating from src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 137)
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
capture_deinitialize:--
(Argus) Error Timeout: (propagating from src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 137)
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
(Argus) Error InvalidState: Argus client is exiting with 2 outstanding client threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 366)
deinitialize:--
("nvargus_nvraw") Error BadParameter (0x04): Unable to capture (propagating from capture_nvraw/src/mobile/main.cpp, func main(), line 97)
capture_deinitialize:++
capture_deinitialize:--
deinitialize:++
capture_deinitialize:++
capture_deinitialize:--
deinitialize:--
These are my first steps dealing with image sensors at this level, so I may be missing something basic.
Regards,
Jose.