OpenCL Executable will not run in Runtime Container

I am trying to build a container image to run AutoDock-Vina-GPU-2.1. The code compiles and runs fine when I work interactively inside the HPC SDK devel container, and the image itself builds without errors, but the executable fails when I run the built container. For reference, the exact commands:
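
docker run -it nvcr.io/nvidia/nvhpc:23.1-devel-cuda12.0-ubuntu22.04
docker build -t test:vina-gpu -f Dockerfile .
docker run -it test:vina-gpu

The last command produces the following output: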

#################################################################
# If you used AutoDockVina-GPU 2.1 in your work, please cite:   #
#                                                               #
# Ding, Ji, et al. Vina-GPU 2.0: Further Accelerating AutoDock  #
# Vina and Its Derivatives with Graphics Processing Units.      #
# Journal of Chemical Information and Modeling (2023).          #
#                                                               #
# DOI https://doi.org/10.1021/acs.jcim.2c01504                  #
#                                                               #
# Shidi, Tang, Chen Ruiqi, Lin Mengru, Lin Qingde,              #
# Zhu Yanxiang, Wu Jiansheng, Hu Haifeng, and Ling Ming.        #
# Accelerating AutoDock Vina with GPUs.                         #
# Molecules 27.9 (2022): 3041.                                  #
#                                                               #
# DOI https://doi.org/10.3390/molecules27093041                 #
#                                                               #
# And also the origin AutoDock Vina paper:                      #
# O. Trott, A. J. Olson,                                        #
# AutoDock Vina: improving the speed and accuracy of docking    #
# with a new scoring function, efficient optimization and       #
# multithreading, Journal of Computational Chemistry 31 (2010)  #
# 455-461                                                       #
#                                                               #
# DOI 10.1002/jcc.21334                                         #
#                                                               #
#################################################################

Using virtual sreening mode

Output will be in the directory ./test_out
Reading input ... done.
Setting up the scoring function ... done.
Using heuristic search_depth
Analyzing the binding site ... done.

Err-1001:CL_PLATFORM_NOT_FOUND_KHR

I have the NVIDIA GPU Operator and the NVIDIA Device Plugin for Kubernetes installed on this node, but as far as I understand those only affect pods scheduled by Kubernetes, not a plain docker run, so I don't think they are involved here.
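
In case the container runtime itself is the problem, this is the check I plan to use to see which OCI runtimes the Docker daemon has configured and which one is the default (my assumption being that nvidia only matters here if it is actually the one selected):

# List the daemon's configured runtimes and the default one
docker info | grep -i runtime

Here's what my Dockerfile looks like: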

# Build Stage
FROM nvcr.io/nvidia/nvhpc:23.1-devel-cuda12.0-ubuntu22.04 AS build

# Define and use build arguments
ARG HPCSDK_VERSION=23.1

# Set the NVHPC environment variables per the NVIDIA modulefile
ENV nvhome=/opt/nvidia/hpc_sdk
ENV target=Linux_x86_64
ENV version=${HPCSDK_VERSION}
ENV cuda_version=12

# Define derived paths using the base variables
ENV nvcudadir=${nvhome}/${target}/${version}/cuda
ENV nvcompdir=${nvhome}/${target}/${version}/compilers
ENV nvmathdir=${nvhome}/${target}/${version}/math_libs
ENV nvcommdir=${nvhome}/${target}/${version}/comm_libs

# Set environment variables related to the NVIDIA HPC SDK
ENV NVHPC=${nvhome}
ENV NVHPC_ROOT=${nvhome}/${target}/${version}
ENV CC=${nvcompdir}/bin/nvc
ENV CXX=${nvcompdir}/bin/nvc++
ENV FC=${nvcompdir}/bin/nvfortran
ENV F90=${nvcompdir}/bin/nvfortran
ENV F77=${nvcompdir}/bin/nvfortran
ENV CPP=cpp

# Prepend paths to the PATH environment variable
ENV PATH=${nvcudadir}/bin:${nvcompdir}/bin:${nvcompdir}/extras/qd/bin:$PATH

# Prepend paths to the LD_LIBRARY_PATH environment variable
ENV LD_LIBRARY_PATH=${nvcudadir}/lib64:${nvcudadir}/extras/CUPTI/lib64:${nvcompdir}/extras/qd/lib:${nvcompdir}/lib:${nvmathdir}/lib64:${nvcommdir}/nccl/lib:${nvcommdir}/nvshmem/lib:$LD_LIBRARY_PATH

# Prepend paths to the CPATH environment variable
ENV CPATH=${nvmathdir}/include:${nvcommdir}/nccl/include:${nvcommdir}/nvshmem/include:${nvcompdir}/extras/qd/include/qd:$CPATH

# Install the boost libraries
RUN apt-get update && \
    apt-get install -y libboost-all-dev

# Build AutoDock GPU versions
# - see https://github.com/enko-chem/Vina-GPU-2.1
ENV MAKE_OPTS="OPENCL_VERSION=3"

RUN mkdir /source && \
    git clone https://github.com/enko-chem/Vina-GPU-2.1.git /source/Vina-GPU-2.1

# Compile AutoDock with correct environment variables using bash -c to ensure they expand
RUN cd /source/Vina-GPU-2.1/AutoDock-Vina-GPU-2.1 && \
    git checkout debug-build && \
    bash -c "make ${MAKE_OPTS} source && make install"

I tried installing pocl-opencl-icd and nvidia-settings in the container, but neither helped. I confirmed that /dev/nvidia0 is present and that nvidia-smi works, so the container does see the GPU, and every library the binary links against resolves (the ICD check I plan to run next is after the ldd output below):

root@8a5c2289297b:/dev# ldd /opt/AutoDock-Vina-GPU/AutoDock-Vina-GPU-2-1 
        linux-vdso.so.1 (0x00007ffcf71b6000)
        libboost_program_options.so.1.74.0 => /usr/lib/x86_64-linux-gnu/libboost_program_options.so.1.74.0 (0x000075aced0a9000)
        libboost_system.so.1.74.0 => /usr/lib/x86_64-linux-gnu/libboost_system.so.1.74.0 (0x000075aced0a4000)
        libboost_filesystem.so.1.74.0 => /usr/lib/x86_64-linux-gnu/libboost_filesystem.so.1.74.0 (0x000075aced084000)
        libboost_thread.so.1.74.0 => /usr/lib/x86_64-linux-gnu/libboost_thread.so.1.74.0 (0x000075aced060000)
        libOpenCL.so.1 => /usr/lib/x86_64-linux-gnu/libOpenCL.so.1 (0x000075acecc00000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x000075acece34000)
        libm.so.6 => /usr/lib/x86_64-linux-gnu/libm.so.6 (0x000075acecb19000)
        libatomic.so.1 => /usr/lib/x86_64-linux-gnu/libatomic.so.1 (0x000075acece2a000)
        libnvhpcatm.so => /opt/nvidia/hpc_sdk/Linux_x86_64/23.1/compilers/lib/libnvhpcatm.so (0x000075acec600000)
        libnvomp.so => /opt/nvidia/hpc_sdk/Linux_x86_64/23.1/compilers/lib/libnvomp.so (0x000075aceb600000)
        libnvcpumath.so => /opt/nvidia/hpc_sdk/Linux_x86_64/23.1/compilers/lib/libnvcpumath.so (0x000075aceae00000)
        libnvc.so => /opt/nvidia/hpc_sdk/Linux_x86_64/23.1/compilers/lib/libnvc.so (0x000075acea800000)
        libc.so.6 => /usr/lib/x86_64-linux-gnu/libc.so.6 (0x000075acec8f0000)
        libgcc_s.so.1 => /usr/lib/x86_64-linux-gnu/libgcc_s.so.1 (0x000075acece08000)
        libdl.so.2 => /usr/lib/x86_64-linux-gnu/libdl.so.2 (0x000075acec8e9000)
        libpthread.so.0 => /usr/lib/x86_64-linux-gnu/libpthread.so.0 (0x000075acec8e4000)
        /lib64/ld-linux-x86-64.so.2 (0x000075aced24d000)
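
As far as I understand it, libOpenCL.so.1 is only the Khronos ICD loader: it enumerates the *.icd files under /etc/OpenCL/vendors and dlopens the driver libraries they name, and Err-1001:CL_PLATFORM_NOT_FOUND_KHR means it found no usable ICD at all. So the next thing I plan to try inside the running container is the sketch below: check whether any ICD is registered, register the NVIDIA one by hand, and retest. This assumes the NVIDIA container runtime injects the driver's libnvidia-opencl.so.1 into the container; clinfo is not part of my image, so installing it here is just for the test.

# Is any ICD registered at all?
ls -l /etc/OpenCL/vendors/

# Register the NVIDIA OpenCL driver with the ICD loader by hand
mkdir -p /etc/OpenCL/vendors
echo "libnvidia-opencl.so.1" > /etc/OpenCL/vendors/nvidia.icd

# Re-test platform discovery
apt-get update && apt-get install -y clinfo
clinfo | head

Is that the right direction, or am I missing something about how the NVIDIA runtime exposes OpenCL to containers?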