docker.errors.ImageNotFound: 404 Client Error

Please provide the following information when requesting support.

• Hardware: GTX 1050
• Network Type: Detectnet_v2
• TLT Version:
dockers: ['nvidia/tao/tao-toolkit-tf', 'nvidia/tao/tao-toolkit-pyt', 'nvidia/tao/tao-toolkit-lm']
format_version: 2.0
toolkit_version: 3.21.11
published_date: 11/08/2021

Running "tao detectnet_v2 train --help" gives this error:
2022-02-02 12:11:40,364 [INFO] root: Registry: [‘nvcr.io’]
2022-02-02 12:11:40,438 [INFO] tlt.components.instance_handler.local_instance: Running command in container: nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3
Traceback (most recent call last):
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/api/client.py", line 259, in _raise_for_status
response.raise_for_status()
File "/home/karan/anaconda3/lib/python3.7/site-packages/requests/models.py", line 941, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http+docker://localhost/v1.41/images/sha256:8adfc79b143d15eb724e64222220b12020afef639dec2354283d3fc7e8d3c6bc/json

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/karan/anaconda3/bin/tao", line 8, in <module>
sys.exit(main())
File "/home/karan/anaconda3/lib/python3.7/site-packages/tlt/entrypoint/entrypoint.py", line 115, in main
args[1:]
File "/home/karan/anaconda3/lib/python3.7/site-packages/tlt/components/instance_handler/local_instance.py", line 319, in launch_command
docker_handler.run_container(command)
File "/home/karan/anaconda3/lib/python3.7/site-packages/tlt/components/docker_handler/docker_handler.py", line 279, in run_container
if not self._check_image_exists():
File "/home/karan/anaconda3/lib/python3.7/site-packages/tlt/components/docker_handler/docker_handler.py", line 135, in _check_image_exists
image_inspection_content = self._api_client.inspect_image(image.attrs["Id"])
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/utils/decorators.py", line 19, in wrapped
return f(self, resource_id, *args, **kwargs)
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/api/image.py", line 246, in inspect_image
self._get(self._url("/images/{0}/json", image)), True
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/api/client.py", line 265, in _result
self._raise_for_status(response)
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/api/client.py", line 261, in _raise_for_status
raise create_api_error_from_http_exception(e)
File "/home/karan/anaconda3/lib/python3.7/site-packages/docker/errors.py", line 31, in create_api_error_from_http_exception
raise cls(e, response=response, explanation=explanation)
docker.errors.ImageNotFound: 404 Client Error: Not Found ("no such image: sha256:8adfc79b143d15eb724e64222220b12020afef639dec2354283d3fc7e8d3c6bc: No such image: sha256:8adfc79b143d15eb724e64222220b12020afef639dec2354283d3fc7e8d3c6bc")
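
The 404 here comes from an inspect-by-ID call: the launcher believes an image exists locally but the daemon cannot resolve its sha256 ID, which usually points at a stale or partially removed local image. A quick way to see what is actually present locally, sketched as a hypothetical helper wrapping `docker images` (not part of the TAO launcher):

```python
import shutil
import subprocess

def local_images(substring):
    """List local Docker image tags containing `substring`.

    Hypothetical helper: returns [] when docker is not installed or
    the daemon is unreachable, so it degrades gracefully.
    """
    if shutil.which("docker") is None:
        return []
    proc = subprocess.run(
        ["docker", "images", "--format", "{{.Repository}}:{{.Tag}}"],
        capture_output=True, text=True,
    )
    if proc.returncode != 0:
        return []
    return [line for line in proc.stdout.splitlines() if substring in line]

print(local_images("tao-toolkit"))
```

If the expected `nvcr.io/nvidia/tao/tao-toolkit-tf` tag is missing or dangling, removing it with `docker rmi` and re-pulling is a reasonable first step.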

Did you set up the environment inside a docker? In other words, are you running docker inside a docker?

No.

I set up everything on the laptop's main OS.

Can you try running the command below again?
tao detectnet_v2

BTW, which OS is on your laptop?

I am using Ubuntu.
Same error with "tao detectnet_v2".

Have you logged in to the NGC docker registry (nvcr.io)?
$ docker login nvcr.io

  1. Username: "$oauthtoken"
  2. Password: "YOUR_NGC_API_KEY"

Similar topic: TLT mask rcnn error: Tlt.components.docker_handler.docker_handler: Stopping container - #20 by Morganh

Yes.

To narrow down, can you try pulling the tao docker directly?
docker pull nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3

Then, try to log in to it:
$ docker run --runtime=nvidia -it --rm nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3 /bin/bash
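
If the pull succeeds but the launcher still fails, it is worth double-checking that the image reference is exactly what you expect. Below is a minimal, hypothetical parser for the registry/repository/tag parts of a reference like the one above (a sketch, not Docker's full reference grammar, and not part of the TAO launcher):

```python
def parse_image_ref(ref):
    """Split a Docker image reference into (registry, repository, tag).

    Sketch only: assumes the registry is the first '/'-separated
    component when it contains a dot (e.g. nvcr.io), and does not
    handle port-bearing registries or digests.
    """
    name, _, tag = ref.partition(":")
    first, _, rest = name.partition("/")
    if "." in first and rest:
        registry, repository = first, rest
    else:
        # No explicit registry: Docker defaults to docker.io
        registry, repository = "docker.io", name
    return registry, repository, tag or "latest"

print(parse_image_ref("nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3"))
```

For the image above this yields `('nvcr.io', 'nvidia/tao/tao-toolkit-tf', 'v3.21.11-tf1.15.4-py3')`, confirming the pull targets the NGC registry rather than Docker Hub.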

What next?

Then,

detectnet_v2 train --help

and run training

detectnet_v2 train xxx

Using TensorFlow backend.
2022-02-04 11:55:56,269 [WARNING] modulus.export._tensorrt: Failed to import TRT and/or CUDA. TensorRT optimization and inference will not be available.
Traceback (most recent call last):
File "/usr/local/bin/detectnet_v2", line 8, in <module>
sys.exit(main())
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/entrypoint/detectnet_v2.py", line 12, in main
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/entrypoint/entrypoint.py", line 256, in launch_job
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/entrypoint/entrypoint.py", line 47, in get_modules
File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/export.py", line 8, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/export/exporter.py", line 12, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/export/keras_exporter.py", line 22, in <module>
ImportError: cannot import name 'ONNXEngineBuilder'

Please open a new terminal and run the following checks on your laptop:
$ nvidia-smi
$ dpkg -l |grep cuda
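
For completeness, the same driver check can be scripted. This is a hedged convenience wrapper around `nvidia-smi` (the `--query-gpu` and `--format` flags are standard nvidia-smi query options) that returns None when no working driver is found:

```python
import shutil
import subprocess

def gpu_driver_version():
    """Return the NVIDIA driver version reported by nvidia-smi, or None.

    Sketch: handles the 'driver not installed' case by checking for
    the binary first instead of letting subprocess raise.
    """
    if shutil.which("nvidia-smi") is None:
        return None
    proc = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    if proc.returncode != 0:
        return None
    return proc.stdout.strip() or None

print(gpu_driver_version())
```

A None result means the container warning above ("Failed to import TRT and/or CUDA") is expected, since the GPU is not visible to the toolkit at all.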

It's working. Thanks @Morganh
You are a star.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.