Cannot start Triton server (command returned a non-zero code: 126)

• Hardware Platform: Jetson
• DeepStream Version: 6.3
• JetPack Version: 5.1
• TensorRT Version: 5.1
• Issue Type: questions

I cloned the NVIDIA-AI-IOT/tao-toolkit-triton-apps repository on GitHub (sample app code for deploying TAO Toolkit trained models to Triton) and followed the installation instructions. When I run start_server.sh, the Docker build fails at Step 8 with a non-zero code (126).
Full Docker output:

WARNING! Using --password via the CLI is insecure. Use --password-stdin.
WARNING! Your password will be stored unencrypted in /home/testconfig/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            Install the buildx component to build images with BuildKit:
            https://docs.docker.com/go/buildx/

Sending build context to Docker daemon  10.61MB
Step 1/13 : FROM nvcr.io/nvidia/tritonserver:23.02-py3
 ---> ed2b91b8b9dc
Step 2/13 : RUN apt-get update &&     apt-get install -y pkg-config &&     apt-get install -y git &&     apt-get install -y zlib1g &&     apt-get install -y zlib1g-dev
 ---> Using cache
 ---> f7a977925c28
Step 3/13 : RUN cd /tmp     && wget https://github.com/Kitware/CMake/releases/download/v3.14.4/cmake-3.14.4-Linux-x86_64.sh      && chmod +x cmake-3.14.4-Linux-x86_64.sh      && ./cmake-3.14.4-Linux-x86_64.sh --prefix=/usr/local --exclude-subdir --skip-license      && rm ./cmake-3.14.4-Linux-x86_64.sh     && cd -
 ---> Using cache
 ---> 2a936b9bd4d9
Step 4/13 : RUN cd /opt
 ---> Using cache
 ---> a551f160ba7b
Step 5/13 : RUN ln -s /usr/bin/python3 /usr/bin/python
 ---> Using cache
 ---> e9f5a3570e27
Step 6/13 : ENV TRT_TAG "release/8.5"
 ---> Using cache
 ---> 8548bb05c69d
Step 7/13 : ENV TRT_INCLUDE_DIR="/usr/include/x86_64-linux-gnu"
 ---> Using cache
 ---> ac74cc006604
Step 8/13 : RUN mkdir trt_oss_src &&     cd trt_oss_src &&     echo "$PWD Building TRT OSS..." &&     git clone -b $TRT_TAG https://github.com/NVIDIA/TensorRT.git TensorRT &&     cd TensorRT &&     git submodule update --init --recursive &&     mkdir -p build && cd build  &&     cmake .. -DGPU_ARCHS="53 60 61 70 75 80 86 90" -DTRT_LIB_DIR=/usr/lib/x86_64-linux-gnu -DTRT_BIN_DIR=`pwd`/out -DCUDA_VERSION=11.8 -DCUDNN_VERSION=8.7 &&     make -j32 &&     cp libnvinfer_plugin.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so.8.5.3 &&     cp libnvinfer_plugin_static.a /usr/lib/x86_64-linux-gnu/libnvinfer_plugin_static.a &&     cp libnvonnxparser.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvonnxparser.so.8.5.3 &&     cp libnvcaffeparser.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvcaffeparser.so.8.5.3 &&     cp trtexec /usr/local/bin/ &&     cd ../../../ &&     rm -rf trt_oss_src
 ---> Running in 0f852d442a88
/opt/tritonserver/trt_oss_src Building TRT OSS...
Cloning into 'TensorRT'...
Updating files: 100% (1872/1872), done.
Submodule 'parsers/onnx' (https://github.com/onnx/onnx-tensorrt.git) registered for path 'parsers/onnx'
Submodule 'third_party/cub' (https://github.com/NVlabs/cub.git) registered for path 'third_party/cub'
Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/protobuf'
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/parsers/onnx'...
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/third_party/cub'...
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/third_party/protobuf'...
Submodule path 'parsers/onnx': checked out 'fd119fec8565264add819f8edc801066116a32dd'
Submodule 'third_party/onnx' (https://github.com/onnx/onnx.git) registered for path 'parsers/onnx/third_party/onnx'
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/parsers/onnx/third_party/onnx'...
Submodule path 'parsers/onnx/third_party/onnx': checked out 'f7ee1ac60d06abe8e26c9b6bbe1e3db5286b614b'
Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'parsers/onnx/third_party/onnx/third_party/benchmark'
Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'parsers/onnx/third_party/onnx/third_party/pybind11'
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/parsers/onnx/third_party/onnx/third_party/benchmark'...
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/parsers/onnx/third_party/onnx/third_party/pybind11'...
Submodule path 'parsers/onnx/third_party/onnx/third_party/benchmark': checked out '0d98dba29d66e93259db7daa53a9327df767a415'
Submodule path 'parsers/onnx/third_party/onnx/third_party/pybind11': checked out 'ffa346860b306c9bbfb341aed9c14c067751feb8'
Submodule path 'third_party/cub': checked out 'c3cceac115c072fb63df1836ff46d8c60d9eb304'
Submodule path 'third_party/protobuf': checked out 'fb6f8da08b60b6beb5bb360d79dd3feda0147da7'
Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/protobuf/third_party/benchmark'
Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/protobuf/third_party/googletest'
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/third_party/protobuf/third_party/benchmark'...
Cloning into '/opt/tritonserver/trt_oss_src/TensorRT/third_party/protobuf/third_party/googletest'...
Submodule path 'third_party/protobuf/third_party/benchmark': checked out '5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8'
Submodule path 'third_party/protobuf/third_party/googletest': checked out '5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081'
/bin/bash: /usr/local/bin/cmake: cannot execute binary file: Exec format error
The command '/bin/sh -c mkdir trt_oss_src &&     cd trt_oss_src &&     echo "$PWD Building TRT OSS..." &&     git clone -b $TRT_TAG https://github.com/NVIDIA/TensorRT.git TensorRT &&     cd TensorRT &&     git submodule update --init --recursive &&     mkdir -p build && cd build  &&     cmake .. -DGPU_ARCHS="53 60 61 70 75 80 86 90" -DTRT_LIB_DIR=/usr/lib/x86_64-linux-gnu -DTRT_BIN_DIR=`pwd`/out -DCUDA_VERSION=11.8 -DCUDNN_VERSION=8.7 &&     make -j32 &&     cp libnvinfer_plugin.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so.8.5.3 &&     cp libnvinfer_plugin_static.a /usr/lib/x86_64-linux-gnu/libnvinfer_plugin_static.a &&     cp libnvonnxparser.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvonnxparser.so.8.5.3 &&     cp libnvcaffeparser.so.8.5.3 /usr/lib/x86_64-linux-gnu/libnvcaffeparser.so.8.5.3 &&     cp trtexec /usr/local/bin/ &&     cd ../../../ &&     rm -rf trt_oss_src' returned a non-zero code: 126

This issue would be outside of DeepStream. You could try asking in the Triton forum. Thanks!

This GitHub repo can't work on Jetson. The Dockerfile targets x86_64: Step 3 installs cmake-3.14.4-Linux-x86_64.sh and Step 8 copies the TRT OSS libraries into /usr/lib/x86_64-linux-gnu, so the installed CMake binary cannot execute on the aarch64 Jetson, which is exactly what the "cannot execute binary file: Exec format error" in Step 8 reports.
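For reference, a quick way to confirm the mismatch from the Jetson host (a rough sketch; the image tag, installer URL, and flags are copied from the build log above, and it assumes wget and file are available on the host):

# On the Jetson host (expected output: aarch64)
uname -m

# Architecture the Triton base image was pulled for
docker image inspect nvcr.io/nvidia/tritonserver:23.02-py3 --format '{{.Os}}/{{.Architecture}}'

# The CMake installer hard-coded in Step 3 unpacks an x86_64 binary. The
# extraction itself is a plain shell script, so it succeeds, but the
# resulting cmake executable cannot run on aarch64 ("Exec format error",
# which the shell surfaces as exit code 126).
wget -q https://github.com/Kitware/CMake/releases/download/v3.14.4/cmake-3.14.4-Linux-x86_64.sh
mkdir -p /tmp/cmake-check
sh ./cmake-3.14.4-Linux-x86_64.sh --prefix=/tmp/cmake-check --exclude-subdir --skip-license
file /tmp/cmake-check/bin/cmake    # should report: ELF 64-bit LSB executable, x86-64, ...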
