This is because the module is not found in your working path or in your PYTHONPATH. You can add it with:
export PYTHONPATH=$PYTHONPATH:/app/tao/tao_deploy/nvidia_tao_deploy
Then you can check that nvidia_tao_deploy.cv.grounding_dino.engine_builder is available.
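To confirm the package is visible without actually importing it, a small sketch like the following can help. The path below is the one mentioned in this thread; adjust it to wherever nvidia_tao_deploy lives in your container.

```python
import importlib.util
import sys

def module_available(name: str, extra_paths: list) -> bool:
    """Return True if `name` can be imported after adding `extra_paths` to sys.path."""
    for p in extra_paths:
        if p not in sys.path:
            sys.path.insert(0, p)
    # find_spec locates the module without executing it
    return importlib.util.find_spec(name) is not None

# Assumed path from this thread; change it to match your container layout.
print(module_available("nvidia_tao_deploy", ["/app/tao/tao_deploy"]))
```

If this prints False, the export above (or the path itself) is wrong for your setup.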
Also, if needed, note that you can modify the existing 5.5 code to make it work in the 5.0 deploy docker.
BTW, for DINO inference on a Jetson device, one existing option is the deepstream_tao_apps GitHub repo. The PeopleNet Transformer model is actually trained with DINO. The config file is at deepstream_tao_apps/configs/nvinfer/peoplenet_transformer_tao/pgie_peoplenet_transformer_tao_config.txt in the NVIDIA-AI-IOT/deepstream_tao_apps repo on GitHub.
The documentation is in "Deploying to DeepStream for DINO" in the NVIDIA Docs.
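For orientation, a DeepStream nvinfer PGIE config generally looks like the sketch below. This is only a minimal illustration of the file format; the values are placeholders, not the actual contents of pgie_peoplenet_transformer_tao_config.txt, so use the file from the repo linked above.

```ini
# Minimal nvinfer PGIE config sketch (placeholder values, not the shipped config)
[property]
gpu-id=0
onnx-file=peoplenet_transformer.onnx   # placeholder model file name
labelfile-path=labels.txt              # placeholder label file
batch-size=1
network-mode=2                         # 0=FP32, 1=INT8, 2=FP16
num-detected-classes=1
gie-unique-id=1
```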