Why does the cross-compile gcc use ld-2.25.so for Jetson, while the Ubuntu OS on the Jetson uses ld-2.27.so?

I found that the Ubuntu OS on the Jetson AGX uses ld-2.27.so, i.e. glibc version 2.27. However, the cross-compiler gcc officially provided by NVIDIA ships ld-2.25.so, i.e. glibc version 2.25.

Toolchain Information

• GCC version: 7.3.1
• Binutils version: 2.28.2.20170706
• Glibc version: 2.25

Why are they different?

I think both can be used.

I got the following error message when cross compiling:
/lib/aarch64-linux-gnu/libthread_db-1.0.so: undefined reference to `ps_pdwrite'

I think it is caused by the different glibc versions.

Is it related to the ld version?

Do you have ideas about this problem?

About this:
/lib/aarch64-linux-gnu/libthread_db-1.0.so: undefined reference to `ps_pdwrite'

This means that the individual source files compiled successfully, and the failure occurred while the linker was resolving dynamically loadable content. At the moment of failure it was processing:
/lib/aarch64-linux-gnu/libthread_db-1.0.so

Apparently you have this file since otherwise the error would have differed. The subtle issue is that libthread_db-1.0.so is itself dynamically loading other content. Verify that this shows up:
ls -l /lib/aarch64-linux-gnu/libthread_db-1.0.so
(I’m betting it is there and ok)

Next, what do you see from:
ldd /lib/aarch64-linux-gnu/libthread_db-1.0.so

It should be something like this:

# ldd libthread_db-1.0.so 
        linux-vdso.so.1 (0x0000007f9b519000)
        libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000007f9b357000)
        /lib/ld-linux-aarch64.so.1 (0x0000007f9b4ee000)

The part to think about is that each of those three lines is another library which must be present. Perhaps some of those are part of the same package which installed libthread_db-1.0.so, but some are not. Presumably one of the files in that chain of linking is either missing or is from a release which lacks the function “ps_pdwrite”.
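A quick way to inspect a whole dependency chain is to scan the ldd output for “not found” entries. The sketch below uses /bin/sh only so it runs anywhere; on the Jetson you would point it at /lib/aarch64-linux-gnu/libthread_db-1.0.so instead:

```shell
# Count unresolved dependencies of a binary or library.
# /bin/sh is a stand-in; on the Jetson, substitute
# /lib/aarch64-linux-gnu/libthread_db-1.0.so
target=/bin/sh
missing=$(ldd "$target" | grep -c 'not found')
echo "unresolved dependencies for $target: ${missing:-0}"
```

Any count above zero names exactly which library in the chain is missing from the search path.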

The next question is whether you are compiling natively or cross compiling. If cross compiling, then you might want to use a clone of the actual Jetson for the correct linker content (you could loopback mount this on the PC and link against it rather than installing new runtime environment content).

If this is performed natively, then you are probably missing a dependency which needs to be added (and that missing dependency might be as you suspect: Due to a version mismatch, or it might be something else).

I cross compiled to build the app locally, configured as follows, and got the error “/lib/aarch64-linux-gnu/libthread_db-1.0.so: undefined reference to `ps_pdwrite'”.

QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/lib
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/lib/aarch64-linux-gnu
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/lib
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/lib/aarch64-linux-gnu
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/lib/aarch64-linux-gnu/tegra
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/lib/aarch64-linux-gnu/tegra-egl
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/local/lib
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/local/cuda/lib64
QMAKE_LFLAGS += -Wl,-rpath=$${NVSDK_PATH}/usr/lib/gcc/aarch64-linux-gnu/7

LIBS += $${NVSDK_PATH}/lib/aarch64-linux-gnu/*.so
LIBS += $${NVSDK_PATH}/usr/lib/aarch64-linux-gnu/*.so

LIBS += -L$${NVSDK_PATH}/usr/local/cuda/lib64 -lcudart -lcublas -lcublasLt
#LIBS += -L$${NVSDK_PATH}/usr/local/lib -lboost_atomic -lboost_numpy -lboost_chrono -lboost_random
LIBS += -L$${NVSDK_PATH}/usr/local/lib -lopencv_core -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio -lopencv_video -lopencv_freetype -lopencv_dnn

For background, there are essentially up to three components to cross compiling. You will always need the cross toolchain, which is the compiler. If you are compiling for “bare metal” (which a Linux kernel or U-Boot qualify as), then this is all you need. In that case there is no linking of external libraries. You already have this.

If you are working on user space, e.g., a program which runs under Linux, then you need two more components. The first is the tool which does the linking. You already have this. This specification is telling the compiler (actually, the linker which the compiler is talking to) where to find link content:

LIBS += $${NVSDK_PATH}/lib/aarch64-linux-gnu/*.so
LIBS += $${NVSDK_PATH}/usr/lib/aarch64-linux-gnu/*.so

The actual content being linked is the third element, and this is what you are finding only partially there. This is known as the “sysroot”. Tool distributors give a complete cross toolchain and cross linker, but they cannot distribute a complete sysroot. The sysroot depends on what you are building and what environment you are building for. This is the part of the sysroot which you must have in the linking location as a minimum:

-lcudart -lcublas -lcublasLt
 -lboost_atomic -lboost_numpy -lboost_chrono -lboost_random
 -lopencv_core -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio -lopencv_video -lopencv_freetype -lopencv_dnn

I emphasize “minimum” because each of those libraries can in turn require linking to yet another library. The entire chain of dependencies must be present, and you are missing some of this content (not necessarily the files you are linking against directly, but something one of them chain links to).

One names a location to search for libraries with a capital “-L” (e.g., “-L/some/where”), and names a specific library without its location using lowercase “-l” (e.g., “-lcudart”); the two are never mixed into one full path. In your case, something you linked (probably via “-lthread_db-1.0”, or a library depending on it) needed another library; the linker searched the “-L” locations for the needed symbol but failed to find it. The library providing that symbol needs to be added in a “-L” location. Your sysroot is incomplete.

Note that “-rpath” is the runtime counterpart of “-L” combined with “-l”: it records search locations in the binary, which the loader uses when the program runs, not during linking.

An option for a missing sysroot component is to copy it over from the actual environment, meaning the Jetson. If you were to go to the Jetson and want to print all libraries in the standard search path you could do this:
ldconfig -p

If you wanted to search for libraries with libthread_db in the name:
ldconfig -p | grep 'libthread_db'

In one case on a Jetson this provides:

# ldconfig -p | grep 'libthread_db'
        libthread_db.so.1 (libc6,AArch64, OS ABI: Linux 3.7.0) => /lib/aarch64-linux-gnu/libthread_db.so.1
        libthread_db.so (libc6,AArch64, OS ABI: Linux 3.7.0) => /usr/lib/aarch64-linux-gnu/libthread_db.so

Both of those are symbolic links which ultimately resolve to the actual file. Check:

# Example returned from another Jetson...
# ls -l /lib/aarch64-linux-gnu/libthread_db* /usr/lib/aarch64-linux-gnu/libthread_db*
-rw-r--r-- 1 root root 31384 Jun  4  2020 /lib/aarch64-linux-gnu/libthread_db-1.0.so
lrwxrwxrwx 1 root root    19 Jun  4  2020 /lib/aarch64-linux-gnu/libthread_db.so.1 -> libthread_db-1.0.so
lrwxrwxrwx 1 root root    40 Jun  4  2020 /usr/lib/aarch64-linux-gnu/libthread_db.so -> /lib/aarch64-linux-gnu/libthread_db.so.1

You could copy the actual file to your cross-compile sysroot path, renamed to whatever name the linker expects (though you'd want the file which provides the symbol “ps_pdwrite”, not the thread library in my example…you already have the thread library; it is here just for illustration). One can use “nm” on a library to see what symbols it provides, if the symbols are not stripped. It is easier to provide a full environment from the Jetson than to fix things one file at a time for every build case.

Instead of putting just a few files in your “-L” library path for the cross linker, you could clone the Jetson, loopback mount the clone, and point the search path at the loopback-mounted library locations. So long as the development content was complete on the Jetson, you know you have everything you'll ever need for any user space program to compile in the cross-compile environment. The problem is not the ld version; the problem is a missing sysroot, and a clone (which is also good to have for backup reasons) is sort of the biggest-hammer solution.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.