Hello there,
I am working with a Jetson Nano in headless mode through SSH. I have a ZED camera plugged into the Nano and I am trying to get a live feed through SSH to another computer. The problem is that when I run ZED Explorer (the launcher for the ZED camera), it gives me this error:
nvbuf_utils: Could not get EGL display connection
Error in Shader source code
Information written in shaderLog.txt
Wrong location, does this var exist : "texInput"? Is it used in the program? (May be the GLCompiler swapped it)
Trying : texInput<-0
Then the explorer shuts down. I tried other methods, such as running it through Cheese, OpenCV, and ROS, and managed to get it to work in all of them, but the FPS was far too low (around 2-5 FPS).
I also tried other cameras, but hit the same issue.
I found a post about the same issue on the Jetson TX2, but none of the methods there worked for me.
PS
I'm using the ssh -C -X command to connect to the Nano.
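One thing worth noting: the EGL display error above often appears because NVMM/EGL-based applications need a local display on the Jetson itself rather than a forwarded one. A common workaround (an assumption here: a local X session is running on display :0, i.e. a monitor is attached) is to point the SSH session at the Jetson's own display:

```shell
# Run inside the SSH session; windows will render on the Jetson's
# attached monitor, not on the remote computer
# (assumes a local X session exists on display :0)
export DISPLAY=:0
```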
Hi,
Please share clear steps (system setup and GStreamer pipeline) so that we can reproduce the failure. Did you flash JetPack 4.2.2 (r32.2.1) via SDK Manager?
For GStreamer, I set it up using the command sudo apt install -y gstreamer1.0-plugins-base.
After that I just tried different cameras through SSH.
To be honest, I don't really know how to use GStreamer; I only installed it as a dependency for OpenCV.
Can you please specify what you need from the GStreamer pipeline and how I can provide it?
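For context, a GStreamer pipeline is just a chain of elements passed to gst-launch-1.0. A minimal sanity check that needs no camera or display might look like this (a sketch, assuming gstreamer1.0-tools is installed):

```shell
# Generate 100 frames of a test pattern and discard them;
# this succeeds only if GStreamer core and base plugins work
gst-launch-1.0 videotestsrc num-buffers=100 ! videoconvert ! fakesink
```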
This is the xrandr display output after installing JetPack, if that helps:
Screen 0: minimum 8 x 8, current 1366 x 768, maximum 16384 x 16384
HDMI-0 connected primary 1366x768+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
1366x768 59.97*+
1920x1080 60.00 59.95 50.00 24.00 23.98
1280x768 59.99
1280x720 60.00 59.94 50.00
1024x768 60.01
800x600 60.32 56.25
720x576 50.00
720x480 59.94
720x400 70.04
640x480 59.94 59.94
DP-0 disconnected (normal left inverted right x axis y axis)
Can you please direct me to a list of packages that I can install to run cameras over SSH, or any other packages that I should install after flashing the Jetson Nano?
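For what it's worth, a commonly suggested starting set for USB camera work on a Jetson is the following (a suggestion, not an official list):

```shell
# v4l-utils provides v4l2-ctl for inspecting camera device nodes;
# gstreamer1.0-tools provides gst-launch-1.0 for testing pipelines
sudo apt install -y v4l-utils gstreamer1.0-tools \
  gstreamer1.0-plugins-good gstreamer1.0-plugins-bad

# Enumerate connected cameras and their /dev/video* nodes
v4l2-ctl --list-devices
```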
Thanks for the reply
Yes, I did check that, but I was asking about any packages necessary for the Jetson Nano to set up a network connection, or any display packages that need to be installed in order to stream a camera through SSH.
That might be the problem
Be also aware that nvivafilter silently falls back to libnvsample_cudaprocess.so in the standard path (/usr/lib/aarch64…) if your custom lib cannot be found explicitly (for example ./libnvsample_cudaprocess.so when you are in the directory where this lib has been generated) or in a directory listed first in the LD_LIBRARY_PATH environment variable.
Okay so I checked the path (/usr/lib/aarch64…) and found libnvsample_cudaprocess.so there
I assume by custom lib you mean nvbuf_utils. I don't really have much experience with any of this, so can you please explain more or tell me what I need to look for?
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, framerate=30/1' ! nvvidconv ! videoconvert ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:521 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.296856486
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Assuming that /dev/video1 is the video node for your ZED (it should appear a few seconds after you plug the camera in). If you don't have any CSI camera connected, it may be video0.
About the custom lib: the nvivafilter plugin uses a library providing a function for processing one frame. The original lib is the one you've found, but it doesn't do much for cuda-process, so you would want to build your own (you would download the sources for this) and pass the location of this lib as the nvivafilter option customer-lib-name, as you did in post #5. If you provide a bad location and nvivafilter cannot find the lib, it silently falls back to the one in /usr/lib/aarch64…
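As a sketch of what that looks like on the command line (the library path here is an assumption; customer-lib-name must point at your rebuilt library):

```shell
# nvivafilter loads the library named by customer-lib-name; if that path
# is wrong it silently falls back to the stock libnvsample_cudaprocess.so
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12' \
  ! nvivafilter cuda-process=true \
      customer-lib-name=/home/user/mylib/libnvsample_cudaprocess.so \
  ! 'video/x-raw(memory:NVMM), format=RGBA' ! nvegltransform ! nveglglessink
```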
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw, format=YUY2 ! videoconvert ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
The camera worked, but I am still getting very low FPS, around 5 or so.
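Since X forwarding over SSH is usually the bottleneck here, one alternative worth trying is streaming the video over the network instead of forwarding the GUI. A sketch, assuming the software H.264 encoder (x264enc) is available and HOST_IP stands in for the receiving computer's address:

```shell
# On the Jetson: capture, encode, and send as RTP over UDP
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=YUY2 \
  ! videoconvert ! x264enc tune=zerolatency speed-preset=superfast \
  ! rtph264pay ! udpsink host=HOST_IP port=5000

# On the receiving computer: receive, decode, and display
gst-launch-1.0 udpsrc port=5000 \
  ! 'application/x-rtp, media=video, encoding-name=H264, payload=96' \
  ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```

On a Jetson you could also swap x264enc for the hardware encoder to reduce CPU load; the software encoder is shown only because it works on any machine with the ugly/libav plugin sets installed.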