How to organize a COCO dataset

It’s not clear how to run FasterRCNN training on a COCO dataset.
The only thing that is clear from the documentation is that one can train with a COCO dataset.

I will use the kitti_config specs to clarify the question:
kitti_config {
  root_directory_path: "/workspace/tao-experiments/data/training"
  image_dir_name: "image_2"
  label_dir_name: "label_2"
  image_extension: ".png"
  partition_mode: "random"
  num_partitions: 2
  val_split: 14
  num_shards: 10
}
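On the kitti_config fields above: with partition_mode "random" and num_partitions 2, val_split is (as I understand the docs) the percentage of images that go to the validation partition. A rough sketch of that split in plain Python (illustration only, not TAO's actual code):

```python
import random

def random_split(image_ids, val_split_percent, seed=42):
    """Randomly partition IDs into (train, val); roughly
    val_split_percent percent land in the validation set."""
    ids = list(image_ids)
    random.Random(seed).shuffle(ids)
    n_val = int(len(ids) * val_split_percent / 100)
    return ids[n_val:], ids[:n_val]

train, val = random_split(range(100), 14)
# with val_split 14, 14 of the 100 ids go to validation
```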

Does it work the same way for COCO?
How can I make it work with a COCO annotation file that contains image names (not paths)?
Can I keep all images in one folder and the annotations in another folder, without train/test/val subdirectories, and have TAO split them randomly according to the validation split?

You can refer to DetectNet_v2 - NVIDIA Docs

coco_config {
  root_directory_path: "/workspace/tao-experiments/data/coco"
  img_dir_names: ["val2017", "train2017"]
  annotation_files: ["annotations/instances_val2017.json", "annotations/instances_train2017.json"]
  num_partitions: 2
  num_shards: [32, 256]
}
image_directory_path: "/workspace/tao-experiments/data/coco"
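To see how COCO annotation files (which store only bare image file names) can work with this layout: presumably the converter joins root_directory_path, the matching img_dir_names entry, and the file_name from the annotation file. Here is a plain-Python sketch of that join; the exact order is my assumption, not TAO's actual implementation:

```python
import json
import os

def resolve_image_paths(root_dir, img_dir_name, annotation_file):
    """Join the config's directories with the bare file names
    stored in a COCO annotation file (assumed join order)."""
    with open(os.path.join(root_dir, annotation_file)) as f:
        coco = json.load(f)
    # COCO "images" entries carry only a file_name, e.g. "000000397133.jpg"
    return [
        os.path.join(root_dir, img_dir_name, img["file_name"])
        for img in coco["images"]
    ]
```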

I am having trouble running the data conversion:
!tao model detectnet_v2 dataset_convert -d $LOCAL_SPECS_DIR/data_convert_specs.txt -o output

I get an error:

However, running cat from the same notebook shows that the file does exist!
!cat $LOCAL_SPECS_DIR/data_convert_specs.txt

coco_config {
  root_directory_path: "/mnt/dpdata/RumexData/_CURATED_DATA_SETS/20220901_HaldenNord_S_10_F_50_O_sama_ID1/"
  img_dir_names: ["1_images/Chunks", "1_images/Chunks", "1_images/Chunks"]
  annotation_files: ["3_annotations/train_chuncks.json", "3_annotations/val_chuncks.json", "3_annotations/test_chuncks.json"]
  num_partitions: 2
  num_shards: [10, 10, 10]
}
image_directory_path: "/mnt/dpdata/RumexData/_CURATED_DATA_SETS/20220901_HaldenNord_S_10_F_50_O_sama_ID1/"

My ~/.tao_mounts.json is the following:

{
    "Mounts": [
        {
            "source": "/home/naro/tao_exp",
            "destination": "/workspace/tao-experiments"
        },
        {
            "source": "/root/rumex/specs",
            "destination": "/workspace/tao-experiments/detectnet/specs"
        }
    ],
    "Envs": [
        {
            "variable": "CUDA_VISIBLE_DEVICES",
            "value": "0"
        }
    ]
}

According to the ~/.tao_mounts.json file, $LOCAL_SPECS_DIR/data_convert_specs.txt in the command line should be changed to /workspace/tao-experiments/detectnet/specs/data_convert_specs.txt.

Thanks. It works now.

It also made the role of ~/.tao_mounts.json clearer for me.

At first, I thought that the path arguments given in tao commands correspond to directories on the local file system, not to paths inside the Docker container.

However, tao commands treat the paths as if they are inside the container, and ~/.tao_mounts.json acts as a lookup table that tells Docker which local source directory to mount at each destination, so the container can fetch what it needs (specs, data, …).
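That lookup can be sketched in a few lines of plain Python. This is an illustration of the mapping only (TAO itself simply bind-mounts each source onto its destination); the longest-prefix rule and function name are my own:

```python
def to_container_path(host_path, mounts):
    """Translate a host path to the path seen inside the container,
    using the longest matching "source" prefix from the mounts list."""
    best = None
    for m in mounts:
        src = m["source"].rstrip("/")
        if host_path == src or host_path.startswith(src + "/"):
            if best is None or len(src) > len(best["source"].rstrip("/")):
                best = m
    if best is None:
        raise ValueError(f"{host_path} is not under any mounted source")
    src = best["source"].rstrip("/")
    return best["destination"].rstrip("/") + host_path[len(src):]

mounts = [
    {"source": "/home/naro/tao_exp", "destination": "/workspace/tao-experiments"},
    {"source": "/root/rumex/specs", "destination": "/workspace/tao-experiments/detectnet/specs"},
]
```

For example, a spec file at /root/rumex/specs/data_convert_specs.txt on the host maps to /workspace/tao-experiments/detectnet/specs/data_convert_specs.txt inside the container, which is exactly the substitution that fixed the command above.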
