Using tao detectnet_v2 returns python/json argument errors in functions

Please provide the following information when requesting support.

• Hardware - Xavier NX
• Network Type (Detectnet_v2, Yolo_v4, and LPRnet)
• TAO Toolkit - 3.21.11, docker_registry: nvcr.io

When trying to use the dataset_convert tool and the training tool, the process returns an error suggesting that arguments are missing somewhere in the process.

The command I’m attempting to execute:
tao detectnet_v2 dataset_convert -d /workspace/openalpr/SPECS_tfrecord.txt -o /workspace/openalpr/lpd_tfrecord/lpd

output:
tao detectnet_v2 dataset_convert -d /workspace/openalpr/SPECS_tfrecord.txt -o /workspace/openalpr/lpd_tfrecord/lpd
2021-12-23 18:33:43,397 [INFO] root: Registry: ['nvcr.io']
2021-12-23 18:33:43,744 [INFO] tlt.components.instance_handler.local_instance: Running command in container: nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.4-py3
Traceback (most recent call last):
  File "/home/vetted/.local/bin/tao", line 8, in <module>
    sys.exit(main())
  File "/home/vetted/.local/lib/python3.6/site-packages/tlt/entrypoint/entrypoint.py", line 115, in main
    args[1:]
  File "/home/vetted/.local/lib/python3.6/site-packages/tlt/components/instance_handler/local_instance.py", line 319, in launch_command
    docker_handler.run_container(command)
  File "/home/vetted/.local/lib/python3.6/site-packages/tlt/components/docker_handler/docker_handler.py", line 284, in run_container
    mount_data, env_vars, docker_options = self._get_mount_env_data()
  File "/home/vetted/.local/lib/python3.6/site-packages/tlt/components/docker_handler/docker_handler.py", line 92, in _get_mount_env_data
    data = self._load_mounts_file(self._docker_mount_file)
  File "/home/vetted/.local/lib/python3.6/site-packages/tlt/components/docker_handler/docker_handler.py", line 77, in _load_mounts_file
    data = json.load(mfile)
  File "/usr/lib/python3.6/json/__init__.py", line 299, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/usr/lib/python3.6/json/__init__.py", line 354, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.6/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.6/json/decoder.py", line 355, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 6 column 7 (char 127)

I hope I'm simply missing something and this is an easy fix.
This is the guide I'm following:
Creating a Real-Time License Plate Detection and Recognition App | NVIDIA Developer Blog

Thanks, any help is appreciated.

Did you create ~/.tao_mounts.json?
If yes, please share it.

{
    "Mounts": [
        {
         "source": "/home/vetted/tao-experiments",
         "destination": "/workspace/tao-experiments"
        {
        {
        "source": "/home/vetted/openalpr",
            "destination": "/workspace/openalpr"                                  }
]
}

The source directories exist but are empty. I couldn’t find a resource that explicitly said what to populate the directory with.

The source directories should be your local directories.
The .json file maps your local directories to the Docker container's directories.
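For reference, the JSONDecodeError above points at mismatched braces in the file shared earlier: the first mount entry is closed with `{` instead of `},`, and a stray `{` precedes the second entry. A corrected version of that file, keeping the same source and destination paths, would look like:

```json
{
    "Mounts": [
        {
            "source": "/home/vetted/tao-experiments",
            "destination": "/workspace/tao-experiments"
        },
        {
            "source": "/home/vetted/openalpr",
            "destination": "/workspace/openalpr"
        }
    ]
}
```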

See TAO Toolkit Launcher - NVIDIA Docs for more info.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.