Build Jetson Deepstream on x86

I am trying to build an image for the NVIDIA Jetson architecture (linux/arm64/v8) using the build-push action.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name - for which plugin or which sample application - and the function description.)

I wanted to take advantage of Docker's cross-compilation features, but it is not working as expected.

This simple Dockerfile works when I build it on an NVIDIA Jetson, but it fails when I build it from another system:

# syntax=docker/dockerfile:experimental

FROM nvcr.io/nvidia/deepstream-l4t:5.0.1-20.09-samples as build

WORKDIR / 

RUN --mount=type=cache,id=apt-build,target=/var/cache/apt \
    apt update && apt install -y \
        git \
        wget \
        cmake \
        g++ && \
    rm -rf /var/lib/apt/lists/*

RUN git clone --depth=1 --single-branch --branch patch-1 https://github.com/mmeendez8/amirstan_plugin.git && \
    cd /amirstan_plugin && \
    git submodule update --init --progress --depth=1 && \
    mkdir build && \
    cd build && \
    cmake .. -DWITH_DEEPSTREAM=true && \
    make -j10

Jetson build command:

docker buildx build -f Dockerfile -t deepstream:jetson .

x86 build command:

docker buildx build --platform linux/arm64 -f Dockerfile -t deepstream:jetson .
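For reference, the x86 build only runs at all because QEMU emulation is registered for aarch64. This is a sketch of the one-time setup I used (it assumes Docker 19.03+ with buildx available; the builder name is arbitrary):

```shell
# Register QEMU binfmt handlers so the x86 host can emulate aarch64
# (one-time setup; assumes Docker >= 19.03 with buildx)
docker run --privileged --rm tonistiigi/binfmt --install arm64

# Create and select a builder that supports multi-platform builds
docker buildx create --name crossbuilder --use

# Then retry the arm64 build, loading the result into the local daemon
docker buildx build --platform linux/arm64 -f Dockerfile -t deepstream:jetson --load .
```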

The error output on x86 shows:

#11 38.05 -- Found TensorRT headers at TENSORRT_INCLUDE_DIR-NOTFOUND
#11 38.06 -- Find TensorRT libs at TENSORRT_LIBRARY_INFER-NOTFOUND;TENSORRT_LIBRARY_PARSERS-NOTFOUND;TENSORRT_LIBRARY_INFER_PLUGIN-NOTFOUND
#11 38.07 -- Could NOT find TENSORRT (missing: TENSORRT_INCLUDE_DIR TENSORRT_LIBRARY) 
#11 38.07 ERROR: Cannot find TensorRT library.
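A quick check inside the image should show whether the TensorRT files are present at all (the paths below are my assumption based on the standard JetPack layout):

```shell
# Run a throwaway container from the base image on the x86 host and look
# for TensorRT headers and libraries at the usual JetPack locations.
docker run --rm --platform linux/arm64 \
    nvcr.io/nvidia/deepstream-l4t:5.0.1-20.09-samples \
    sh -c 'ls -l /usr/include/aarch64-linux-gnu/NvInfer.h /usr/lib/aarch64-linux-gnu/libnvinfer*'
# On a Jetson host these files appear because the NVIDIA container runtime
# bind-mounts them in at run time; under plain buildx emulation they are
# absent, which would explain the CMake failure above.
```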

I guess the DeepStream Jetson container is built on top of the l4t-base container (NVIDIA NGC). Its documentation says the following:

The platform specific libraries and select device nodes for a particular device are mounted by the NVIDIA container runtime into the l4t-base container from the underlying host, thereby providing necessary dependencies for l4t applications to execute within the container. This approach enables the l4t-base container to be shared between various Jetson devices.

So I guess there is no way to make cross-compilation work with Jetson containers, since most of the dependencies live on the Jetson host system. Is this correct?
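One workaround I am considering, sketched below: harvest the TensorRT headers and libraries from a flashed Jetson into the build context, then bake them into the image so CMake can find them during the emulated build. The ./tensorrt/ layout and the destination paths are hypothetical on my part, not something the base image documents:

```dockerfile
# Hypothetical workaround sketch: before building, copy TensorRT files
# from a real Jetson into the build context, e.g.
#   ./tensorrt/include/  <- /usr/include/aarch64-linux-gnu/ (NvInfer*.h etc.)
#   ./tensorrt/lib/      <- /usr/lib/aarch64-linux-gnu/ (libnvinfer* etc.)
FROM nvcr.io/nvidia/deepstream-l4t:5.0.1-20.09-samples as build

# Bake the host-harvested TensorRT files into the image so they exist
# at build time, instead of relying on the runtime bind mounts.
COPY tensorrt/include/ /usr/include/aarch64-linux-gnu/
COPY tensorrt/lib/     /usr/lib/aarch64-linux-gnu/
```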

Thanks

Hey customer, we will check it and update you ASAP.

Any news about this?

Hey customer, most DeepStream plugins and low-level libraries are not open sourced, so I don't think you can cross-build it inside the container.