No libGL.so

While trying to build opencv_contrib I ran into the issue of not finding libGL.so. So I went to the /usr/lib/aarch64-linux-gnu directory and sure enough the symlink was bad. There is no libGL.so in the tegra directory. JetPack 4.1.

Hi,

This is a known issue.
The root cause is that OpenCV includes the headers in an improper order.

It should look like this:

#include <GL/gl.h>
#include <cuda_gl_interop.h>

Please check this topic for the solution:
https://devtalk.nvidia.com/default/topic/1007290

Thanks.

I checked that .h file; it’s correct with CUDA 10 on Xavier. What is missing is the actual libGL.so.

Do you have the packages “libgl1” and “libglvnd-dev” installed? R31.x should have the following provided by those packages (either as a file or a symlink to the file):

/usr/lib/aarch64-linux-gnu/libGL.so
/usr/lib/aarch64-linux-gnu/libGL.so.1
/usr/lib/aarch64-linux-gnu/libGL.so.1.0.0
# dpkg -S /usr/lib/aarch64-linux-gnu/libGL.so /usr/lib/aarch64-linux-gnu/libGL.so.1 /usr/lib/aarch64-linux-gnu/libGL.so.1.0.0
libglvnd-dev:arm64: /usr/lib/aarch64-linux-gnu/libGL.so
libgl1:arm64: /usr/lib/aarch64-linux-gnu/libGL.so.1
libgl1:arm64: /usr/lib/aarch64-linux-gnu/libGL.so.1.0.0

It does seem odd that the “libGL.so” symbolic link is provided by a different package than the actual library file, but that’s part of how Ubuntu 18.04 is packaged. Not sure why it was done that way.
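If either of those packages is missing or the symlink got clobbered, reinstalling them should restore the files. A minimal sketch, assuming the standard Ubuntu 18.04 arm64 repositories are reachable:

sudo apt-get update
sudo apt-get install --reinstall libgl1 libglvnd-dev
# confirm the symlink chain was recreated
ls -l /usr/lib/aarch64-linux-gnu/libGL.so*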

What about using this command?

sudo ln -s /usr/lib/aarch64-linux-gnu/libGL.so.1 /usr/libGL.so

@ToTheFuture: That location will not work. Perhaps you’re suggesting symlinking to /usr/lib/libGL.so?
However, libraries in /usr/lib/aarch64-linux-gnu/ will be picked up by the linker if present.
If you think they’re not being found, “sudo ldconfig” should fix that right up.
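For example:

sudo ldconfig
# list every libGL entry the dynamic linker cache currently knows about
ldconfig -p | grep libGL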

Hi,

We cannot reproduce this issue in our environment.

1. Flash Xavier with JetPack4.1:
https://developer.nvidia.com/embedded/downloads#?search=4.1%20DP%20EA

2. libGL.so is in our ‘/usr/lib/aarch64-linux-gnu’ folder.
nvidia@jetson-0422418042099:/usr/lib/aarch64-linux-gnu$ ll libGL.so*
lrwxrwxrwx 1 root root 14 Jun 5 14:16 libGL.so -> libGL.so.1.0.0
lrwxrwxrwx 1 root root 14 Jun 5 14:16 libGL.so.1 -> libGL.so.1.0.0
-rw-r--r-- 1 root root 972968 Jun 5 14:16 libGL.so.1.0.0

Could you re-flash your device to see if that helps?
Thanks.

I thought it was in the tegra folder. I used the one you refer to and successfully built OpenCV 3.4. It’s always been in the tegra folder on previous machines.

I’m building a docker image for my Jetson AGX. The Jetson AGX JetPack 4.1.1 OS image is based on Ubuntu-18.04, but the docker image that I need is based on Ubuntu-16.04 for compatibility with ROS Kinetic.

I’m currently missing the libGL.so files in my root filesystem. I have manually applied the driver binaries using the apply_binaries.sh script to an arm64v8/ubuntu:xenial base image using the latest JetPack 4.1.1 driver downloaded from the following location:

https://developer.download.nvidia.com/devzone/devcenter/mobile/jetpack_l4t/4.1.1/xddsn.im/JetPackL4T_4.1.1_b57/Jetson_Linux_R31.1.0_aarch64.tbz2

The libGL.so file exists in the 1.1GB rootfs image, but that is for Ubuntu-18.04. Is it possible to get the libGL.so file for Ubuntu-16.04?
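For reference, the rough shape of what I did (directory names are illustrative, and I’m assuming apply_binaries.sh’s -r option to point it at the alternate rootfs):

# dump an arm64v8/ubuntu:xenial filesystem to use as the target rootfs
mkdir rootfs-xenial
docker export $(docker create arm64v8/ubuntu:xenial) | sudo tar -C rootfs-xenial -xf -
# unpack the L4T driver package and apply the NVIDIA binaries onto that rootfs
tar xjf Jetson_Linux_R31.1.0_aarch64.tbz2
sudo ./Linux_for_Tegra/apply_binaries.sh -r $(pwd)/rootfs-xenial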

Xavier was never released with 16.04. On the other hand, the base architecture of the TX2 might work, and you could get that from the apply_binaries.sh step of L4T R28.2.1 (if you know which tar file it is in you could just extract that file). I won’t guarantee it works, but you might try that. The R28.2.1 download page is:
https://developer.nvidia.com/embedded/linux-tegra-r2821
(you might need to go there, log in, and then hit the URL a second time before you can see the content)
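If you just want to pull that one file out, something along these lines should do it (the inner archive name and path are assumptions based on the usual L4T layout, so check the listing first):

tar xjf Tegra186_Linux_R28.2.1_aarch64.tbz2
cd Linux_for_Tegra/nv_tegra
# the tegra user-space GL libraries normally live in this inner archive
tar tjf nvidia_drivers.tbz2 | grep libGL
tar xjf nvidia_drivers.tbz2 usr/lib/aarch64-linux-gnu/tegra/libGL.so.1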

Thanks. A quick look at one of the earlier JetPack 3.2.1 releases for the tegra driver package shows that libGL.so and other files are indeed present. I’ll try to build an Ubuntu-16.04 docker image file and see if those driver libraries work on the Jetson AGX.

On a related note, I tried building an Ubuntu-18.04 docker image, but the tegra driver package supplied with JetPack 4.1.1 does not have libGL.so and the other associated files. How can I get the Tegra-specific libGL.so files for Ubuntu-18.04 on the Jetson AGX? I think the problem I’m getting with Ubuntu-18.04 might be because the current libGL.so is provided not by tegra but by mesa. Would you happen to know how OpenGL support is supposed to be set up for a new SoC like the Tegra, so that I can build a working Ubuntu-18.04 docker image?

I’ve created a separate post about it here:
https://devtalk.nvidia.com/default/topic/1043951/jetson-agx-xavier/docker-gpu-acceleration-on-jetson-agx-for-ubuntu-18-04-image/

I built a docker image using Ubuntu-16.04 and the NVIDIA drivers contained in the Tegra186_Linux_R28.2.1_aarch64.tbz2 package, and tried to run it on the Jetson AGX. It still gives me the following error which, correct me if I am wrong, is probably related to the NVIDIA drivers being present in the docker image but unable to find the GPU:

LIBGL_DEBUG=verbose glxgears
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  154 (GLX)
  Minor opcode of failed request:  3 (X_GLXCreateContext)
  Value in failed request:  0x0
  Serial number of failed request:  27
  Current serial number in output stream:  28

Are the following /dev nodes the only ones required for GPU control and used by the Tegra graphics driver, or does the Jetson AGX GPU require additional nodes?

--device /dev/nvhost-as-gpu \
  --device /dev/nvhost-ctrl \
  --device /dev/nvhost-ctrl-gpu \
  --device /dev/nvhost-ctxsw-gpu \
  --device /dev/nvhost-dbg-gpu \
  --device /dev/nvhost-gpu \
  --device /dev/nvhost-prof-gpu \
  --device /dev/nvhost-sched-gpu \
  --device /dev/nvhost-tsg-gpu \
  --device /dev/nvmap \
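For completeness, here is roughly how those flags sit in the full command; the image name and the X11 forwarding parts are placeholders rather than my exact setup:

xhost +local:root      # let local containers reach the X server
docker run -it --rm \
  --device /dev/nvhost-ctrl \
  --device /dev/nvhost-ctrl-gpu \
  --device /dev/nvhost-gpu \
  --device /dev/nvmap \
  -e DISPLAY=$DISPLAY \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  my-xenial-l4t-image glxgears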

This is the output showing the relevant environment variables and the locations of various libGL files in my Ubuntu-16.04 docker image:

$ echo $LD_LIBRARY_PATH
/usr/local/cuda/lib64:/usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra-egl:/usr/lib/aarch64-linux-gnu:/usr/local/lib:

$ cd /usr
$ find . -name libGL*
./lib/aarch64-linux-gnu/tegra/libGLX.so.0
./lib/aarch64-linux-gnu/tegra/libGL.so.1
./lib/aarch64-linux-gnu/tegra/libGLX_nvidia.so.0
./lib/aarch64-linux-gnu/tegra/libGLdispatch.so.0
./lib/aarch64-linux-gnu/tegra/libGL.so
./lib/aarch64-linux-gnu/mesa/libGL.so.1
./lib/aarch64-linux-gnu/mesa/libGL.so.1.2.0
./lib/aarch64-linux-gnu/tegra-egl/libGLESv1_CM.so.1
./lib/aarch64-linux-gnu/tegra-egl/libGLESv2.so.2
./lib/aarch64-linux-gnu/tegra-egl/libGLESv2_nvidia.so.2
./lib/aarch64-linux-gnu/tegra-egl/libGLESv2.so
./lib/aarch64-linux-gnu/tegra-egl/libGLESv1_CM_nvidia.so.1

I additionally ran the CUDA deviceQuery sample from within the container and it confirms that the GPU is inaccessible.

./deviceQuery Starting...

 CUDA Device Query (Runtime API) version (CUDART static linking)

cudaGetDeviceCount returned 35
-> CUDA driver version is insufficient for CUDA runtime version
Result = FAIL

I’ve posted the solution for getting an OpenGL docker image with Ubuntu-16.04 and JetPack-4.4.1 for the Jetson AGX platform here:

https://devtalk.nvidia.com/default/topic/1044051/jetson-agx-xavier/docker-gpu-acceleration-on-jetson-agx-for-ubuntu-16-04-image-using-jetpack-4-4-1/

Just a thought on the topic which might help in the future. “libgl” is probably distributed in other packages, but “libglx” is specific to the GPU and is a custom file provided by the L4T install (L4T copies this into the sample rootfs). You’ll note that the files specific to the particular Jetson release have checksums listed in “/etc/nv_tegra_release”, and that libGLX.so is provided by NVIDIA, but libGL.so is not.

libGLX is tied to an ABI in Xorg, so it has to match both the GPU and the Xorg ABI. For example:

egrep 'Video Driver:' /var/log/Xorg.0.log

(if one were to port libglx.so, then Xorg would also have to be ported such that the ABI matches)

Sometimes (not often) a mesa package can overwrite the libGLX.so within the Xorg file tree. On the other hand, “/usr/lib/aarch64-linux-gnu/tegra/libglx.so” is untouched by any mesa package and is a duplicate of the Xorg one. In the case of a libglx.so failure, just copy the good copy over the bad one. Running “sha1sum -c /etc/nv_tegra_release” will indicate which is the bad version and which is the good version.
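A sketch of that recovery step; the Xorg module path below is the usual Ubuntu location, so adjust it to wherever your Xorg log shows libglx being loaded from:

# see which copies fail their checksum (the header line warning can be ignored)
sha1sum -c /etc/nv_tegra_release 2>/dev/null | grep -i glx
# put the NVIDIA copy back over the one mesa overwrote
sudo cp /usr/lib/aarch64-linux-gnu/tegra/libglx.so /usr/lib/xorg/modules/extensions/libglx.so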