I use my own camera connected to the Nano through MIPI. My camera outputs 1920*1080@25fps, and the pixel formats are:
V4L2_PIX_FMT_UYVY and MEDIA_BUS_FMT_UYVY8_2X8
I can get images from /dev/videox with vlc and the v4l2-ctl tools, but when I use nvgstcapture-1.0 to preview the images it does not work and reports the following error:
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:859 Failed to create Request
(Argus) Error InvalidState: (propagating from src/eglstream/FrameConsumerImpl.cpp, function streamEventThread(), line 138)
(Argus) Error InvalidState: (propagating from src/eglstream/FrameConsumerImpl.cpp, function streamEventThreadStatic(), line 180)
ERROR on bus: by /GstPipeline:capture_native_pipeline: Output window was closed
debug info:
/dvs/git/dirty/git-master_linux/tests-multimedia/nvgstapps/nvgstcapture-1.0/nvgstcapture.c(4160): nvgst_handle_xevents (): /GstPipeline:capture_native_pipeline
(Argus) Error Timeout: (propagating from src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 137)
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 92)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadExecute:334 Stream failed to connect.
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:245 (propagating)
Segmentation fault
Could you give me some advice on why it does not work? Thanks a lot!
Thanks for visiting the NVIDIA Developer Forums.
To ensure better visibility and support, I’ve moved your post to the Jetson category where it’s more appropriate.
I’m sure this problem is related to the pixel format: when I use a Bayer sensor with RG10 it works well, but YUV does not. Does the libargus plugin only support Bayer sensors?
BTW,
you may try the nvv4l2camerasrc plugin as an alternative way to test your YUV camera stream via a gst pipeline.
for instance, $ gst-launch-1.0 nvv4l2camerasrc device=/dev/video1 ! 'video/x-raw(memory:NVMM),format=UYVY,width=1920,height=1080,framerate=30/1 ! nvvidconv ! xvimagesink
When I enter this command on my Jetson device, it just echoes a “>” prompt on the command line and does not display anything on my monitor. If I use “export DISPLAY=:1; sudo nvgstcapture-1.0 --sensor-id=1”, it does display the screen. Is there any parameter I need to add?
It looks like an incorrect command line; please add a closing ' before the video converter.
for instance, $ gst-launch-1.0 nvv4l2camerasrc device=/dev/video1 ! 'video/x-raw(memory:NVMM),format=UYVY,width=1920,height=1080,framerate=30/1' ! nvvidconv ! xvimagesink
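As a side note on the earlier failure mode: an unmatched single quote makes the shell wait for more input, which is why it printed the continuation prompt “>” instead of running the pipeline. A minimal sketch (plain shell, no camera required) that keeps the quotes balanced by putting the caps string in a variable — the variable name `caps` is just an illustration, not part of any NVIDIA tooling:

```shell
#!/bin/sh
# Keep the caps string in one quoted variable so the single quotes
# around it are clearly balanced.
caps='video/x-raw(memory:NVMM),format=UYVY,width=1920,height=1080,framerate=30/1'
echo "$caps"

# The actual pipeline would then read (commented out here, since it
# needs the camera and a display):
#   gst-launch-1.0 nvv4l2camerasrc device=/dev/video1 ! "$caps" ! nvvidconv ! xvimagesink
```

This also makes it easy to tweak width/height/framerate in one place when testing other sensor modes.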
Today I ran into another question: when my camera outputs a 3840*2160 stream, the VLC player cannot display the full image on screen; it shrinks the image to 1/4 of its original size. And when I use the command: