TAO Toolkit: Problem with dataset conversion using COCO format

Please provide the following information when requesting support.

• Hardware (NVIDIA GeForce GTX 1650)
• Network Type (Detectnet_v2)
• TLT Version (nvcr.io/nvidia/tao/tao-toolkit:4.0.0-tf1.15.5)

I am trying to convert the COCO dataset to TFRecords using the following command (inside the container):

detectnet_v2 dataset_convert -d convert_params.txt -o ouput_tf 

The contents of convert_params.txt are:

coco_config {
  root_directory_path: "/local_dir/BACKUP/COCO_DATASET/raw-data"
  img_dir_names: ["val2017", "train2017"]
  annotation_files: ["annotations/instances_val2017.json", "annotations/instances_train2017.json"]
  num_partitions: 2
  num_shards: [32, 256]
}
image_directory_path: "/local_dir/BACKUP/COCO_DATASET/raw-data"

All the paths are valid inside the container! The problem is that when I execute the command I get this:

root@0ce4d6419a27:/local_dir# detectnet_v2 dataset_convert -d convert_params.txt -o ouput_tf 

Using TensorFlow backend.
2023-01-26 03:03:19.047572: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.11.0
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
/usr/local/lib/python3.6/dist-packages/requests/__init__.py:91: RequestsDependencyWarning: urllib3 (1.26.5) or chardet (3.0.4) doesn't match a supported version!
  RequestsDependencyWarning)
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
/usr/local/lib/python3.6/dist-packages/requests/__init__.py:91: RequestsDependencyWarning: urllib3 (1.26.5) or chardet (3.0.4) doesn't match a supported version!
  RequestsDependencyWarning)
Using TensorFlow backend.
2023-01-26 03:03:24,564 [INFO] iva.detectnet_v2.dataio.build_converter: Instantiating a coco converter
2023-01-26 03:03:24,564 [INFO] iva.detectnet_v2.dataio.dataset_converter_lib: Creating output directory 
Traceback (most recent call last):
  File "</usr/local/lib/python3.6/dist-packages/iva/detectnet_v2/scripts/dataset_convert.py>", line 3, in <module>
  File "<frozen iva.detectnet_v2.scripts.dataset_convert>", line 135, in <module>
  File "<frozen iva.detectnet_v2.scripts.dataset_convert>", line 124, in <module>
  File "<frozen iva.detectnet_v2.scripts.dataset_convert>", line 118, in main
  File "<frozen iva.detectnet_v2.dataio.build_converter>", line 122, in build_converter
  File "<frozen iva.detectnet_v2.dataio.coco_converter_lib>", line 54, in __init__
  File "<frozen iva.detectnet_v2.dataio.dataset_converter_lib>", line 60, in __init__
  File "/usr/lib/python3.6/os.py", line 220, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: ''
Telemetry data couldn't be sent, but the command ran successfully.
[WARNING]: <urlopen error [Errno -2] Name or service not known>
Execution status: FAIL
root@0ce4d6419a27:/local_dir# 
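For what it's worth, the traceback bottoms out in os.makedirs(name) being handed an empty string. In plain Python that happens when you take os.path.dirname() of an output path with no directory component, such as the bare "ouput_tf" in my command (just a guess from the traceback; the converter modules are frozen, so I can't confirm which path ends up empty):

```python
import os

# A bare output name has an empty dirname, and makedirs('') fails
# with exactly the error shown in the log above
# (an illustration of the failure mode, not the confirmed root cause):
print(repr(os.path.dirname("ouput_tf")))   # ''

try:
    os.makedirs(os.path.dirname("ouput_tf"))
except FileNotFoundError as e:
    print(e)   # [Errno 2] No such file or directory: ''
```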

I don’t know if I am missing something. Here (DetectNet_v2 — TAO Toolkit 4.0 documentation) it says that I can use a spec file for the COCO format, and there is an example of the spec file; however, all the examples of dataset conversion are related to KITTI.

Please refer to Tao unet inference would stop at at around 80% of the process and no output result - #9 by Morganh

May I know the latest result? I find that your latest comment is hidden.

Sorry, I don’t know why the comment is hidden. But I will close the issue: I added a “target_class_mapping” for every category and it just worked! Thank you again.
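For anyone who lands on this thread with the same error: the spec file I posted above only had the coco_config block. The working version also contains a target_class_mapping entry for every category in the dataset, along the lines of the sketch below (only two of the COCO categories shown here as examples; each key must match a category name from the annotations JSON):

```
target_class_mapping {
  key: "person"
  value: "person"
}
target_class_mapping {
  key: "car"
  value: "car"
}
```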
