January 27, 2023, 12:29am
Hi! I’m trying to use tensorrt c++ in docker on Jetson Nano. This is my dockerfile:
ENV DEBIAN_FRONTEND noninteractive
ADD . .
RUN apt-get update && apt-get install -y \
    gcc clang cmake make
RUN mkdir build && cd build && cmake .. && make
And I have this error when I build my app:
In file included from /app/main.cpp:4:
/app/src/TRTInference.h:8:10: fatal error: 'NvInfer.h' file not found
For newer TensorRT versions, there is a development variant of the Docker container (the -devel tags of the l4t-tensorrt image) that ships with the headers included.
Maybe you’ll have more luck starting with the l4t-ml container?
@vovinsa, on JetPack 4.x you can set your default Docker runtime to "nvidia"; then, when you build your Dockerfile, the TensorRT development headers are mounted into the build from your host device (assuming you have TensorRT installed on your Nano).
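A minimal sketch of that configuration: setting "default-runtime" in /etc/docker/daemon.json is the standard way to make the NVIDIA Container Runtime the default (the runtime path below assumes the usual JetPack install location; restart the Docker daemon afterwards, e.g. with sudo systemctl restart docker):

```json
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
```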
Then, when you run your container, start it with --runtime nvidia (for example: docker run -it --runtime nvidia <your-image>).
February 21, 2023, 9:27am