libtritonserver.so and onnxruntime files are missing from CPU build

I built a CPU-based Triton server using build.py from the 22.08 branch, for an RTX A2000 GPU.
Host driver details:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.65.01    Driver Version: 515.65.01    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA RTX A200...  On   | 00000000:9E:00.0 Off |                  Off |
| 30%   40C    P8    12W /  70W |    118MiB / 12282MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

Command I used:
python3 build.py \
    --enable-logging --enable-stats --enable-tracing --enable-metrics \
    --endpoint=grpc \
    --backend=ensemble \
    --backend=python
Build log file attached:
onnx_build_log_2208.txt

The build was successful, but after opening the container and checking tritonserver, some files are missing:
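For reference, I opened the built image with something like the following (build.py tags the final image as "tritonserver" by default; adjust if a different image name was used):

    docker run --rm -it tritonserver:latest /bin/bash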
cmd: ldd tritonserver
root@ffd5ccae94a8:/workspace/bin# ldd tritonserver

    linux-vdso.so.1 (0x00007ffea37d3000)
    librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fc7e9f8e000)
    libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fc7e9f88000)
    libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007fc7e9f6c000)
    libssl.so.1.1 => not found
    libcrypto.so.1.1 => not found
    libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fc7e9f47000)
    libre2.so.5 => not found
    libb64.so.0d => not found
    libtritonserver.so => not found
    libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fc7e9d65000)
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fc7e9c16000)
    libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fc7e9bf9000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fc7e9a07000)
    /lib64/ld-linux-x86-64.so.2 (0x00007fc7ea9c5000)
Similarly, I can't find the supporting files for onnxruntime and Python:

cmd: root@ffd5ccae94a8:/opt# find / -name onnx
/workspace/docs/examples/model_repository/densenet_onnx
/workspace/deploy/mlflow-triton-plugin/examples/onnx_float32_int32_int32
/workspace/deploy/mlflow-triton-plugin/examples/onnx_float32_int32_int32/1/model.onnx
/workspace/tritonbuild/python/examples/preprocessing/onnx_exporter.py
/workspace/qa/L0_onnx_optimization
/workspace/qa/L0_custom_ops/onnx_op_test.py
/workspace/qa/L0_custom_ops/custom_op_test.onnx
/workspace/qa/L0_model_config/autofill_noplatform_success/onnx
/workspace/qa/L0_model_config/autofill_noplatform_success/onnx/no_config/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform_success/onnx/no_config_no_batch/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform_success/onnx/empty_config/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/bad_max_batch_size/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/too_few_inputs/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/unknown_input/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/unknown_output/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/bad_output_dims/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/too_many_inputs/1/model.onnx
/workspace/qa/L0_model_config/autofill_noplatform/onnx/bad_input_dims/1/model.onnx
/workspace/qa/L0_java_resnet/expected_output_data/expected_output_onnx.txt
  1. From the output of "ldd tritonserver", you need to install some missing dependencies; please refer to https://github.com/triton-inference-server/server/blob/main/build.py#L844 (a sketch of the installation is given after this list).
  2. If you need the onnxruntime backend, you need to add "--backend=onnxruntime" to the "python3 build.py" command (see the corrected command below).
  3. Why do you need to build Triton server yourself? Why not use the Triton Docker image? Here is the link: Triton Inference Server | NVIDIA NGC
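For item 1, the libraries reported as "not found" by ldd can be installed inside the container; a minimal sketch, assuming an Ubuntu 20.04 base image (package names may differ on other distributions):

    apt-get update && apt-get install -y libssl1.1 libre2-5 libb64-0d
    # libtritonserver.so comes from the Triton build itself, not from a package;
    # point the loader at the install location (the path below is an assumption,
    # adjust it to wherever the build placed the library):
    export LD_LIBRARY_PATH=/opt/tritonserver/lib:$LD_LIBRARY_PATH

For item 2, the corrected invocation would look something like this (a sketch based on the command in the original post, with only the onnxruntime backend added):

    python3 build.py \
        --enable-logging --enable-stats --enable-tracing --enable-metrics \
        --endpoint=grpc \
        --backend=ensemble \
        --backend=python \
        --backend=onnxruntime

For item 3, pulling the matching prebuilt release from NGC avoids this setup entirely, for example:

    docker pull nvcr.io/nvidia/tritonserver:22.08-py3
    docker run --rm -it -v /path/to/model_repository:/models \
        nvcr.io/nvidia/tritonserver:22.08-py3 \
        tritonserver --model-repository=/models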
