Error: Ground truth kitti labels should only have 15 fields

We are using the TLT sample yolo notebook to retrain the model with our custom dataset. We converted our dataset to the KITTI format, referencing this page. We then ran this command to generate tfrecords:

!tlt-dataset-convert -d $SPECS_DIR/yolo_tfrecords_kitti_trainval.txt -o $DATA_DOWNLOAD_DIR/tfrecords/kitti_trainval/kitti_trainval

And then got this error:

2021-02-02 17:49:41.748521: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
Using TensorFlow backend.
2021-02-02 17:49:44,461 - iva.detectnet_v2.dataio.build_converter - INFO - Instantiating a kitti converter
2021-02-02 17:49:44,462 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Creating output directory /workspace/maskrcnn_experiment/yolo/Shoe_Data/tfrecords/kitti_trainval
2021-02-02 17:49:44,467 - iva.detectnet_v2.dataio.kitti_converter_lib - INFO - Num images in
Train: 114	Val: 18
2021-02-02 17:49:44,467 - iva.detectnet_v2.dataio.kitti_converter_lib - INFO - Validation data in partition 0. Hence, while choosing the validationset during training choose validation_fold 0.
2021-02-02 17:49:44,467 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Writing partition 0, shard 0
WARNING:tensorflow:From /home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/dataset_converter_lib.py:142: The name tf.python_io.TFRecordWriter is deprecated. Please use tf.io.TFRecordWriter instead.

2021-02-02 17:49:44,467 - tensorflow - WARNING - From /home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/dataset_converter_lib.py:142: The name tf.python_io.TFRecordWriter is deprecated. Please use tf.io.TFRecordWriter instead.

/usr/local/lib/python3.6/dist-packages/iva/detectnet_v2/dataio/kitti_converter_lib.py:273: VisibleDeprecationWarning: Reading unicode strings without specifying the encoding argument is deprecated. Set the encoding, use None for the system default.
2021-02-02 17:49:44,482 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Writing partition 0, shard 1
2021-02-02 17:49:44,488 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Writing partition 0, shard 2
2021-02-02 17:49:44,493 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Writing partition 0, shard 3
2021-02-02 17:49:44,498 - iva.detectnet_v2.dataio.dataset_converter_lib - INFO - Writing partition 0, shard 4
Traceback (most recent call last):
  File "/usr/local/bin/tlt-dataset-convert", line 8, in <module>
    sys.exit(main())
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/dataset_convert.py", line 64, in main
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/dataset_converter_lib.py", line 71, in convert
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/dataset_converter_lib.py", line 105, in _write_partitions
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/dataset_converter_lib.py", line 146, in _write_shard
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/kitti_converter_lib.py", line 176, in _create_example_proto
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/dataio/kitti_converter_lib.py", line 279, in _add_targets
AssertionError: Ground truth kitti labels should have only 15 fields.

However, our label text files follow this format (matching the format described in the NVIDIA link above), and they do contain exactly 15 fields:

Air_Force_1 0.00 0 0.0 105.00 128.00 683.00 440.00 0.0 0.0 0.0 0.0 0.0 0.0 0.0

We also verified this for every file in the label_2 directory with a Python script that looped through the files, split the text on spaces into a list, and printed the resulting length, which was 15 in each case.
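For reference, here is how the 15 fields of a single KITTI label line break down. The field names below follow the KITTI object-detection devkit naming (they are not part of our files, just labels for illustration), and the snippet simply splits the sample line above:

```python
# Field names per the KITTI object-detection label format (devkit naming).
KITTI_FIELDS = [
    "type", "truncated", "occluded", "alpha",
    "bbox_left", "bbox_top", "bbox_right", "bbox_bottom",
    "height", "width", "length",
    "loc_x", "loc_y", "loc_z",
    "rotation_y",
]

line = "Air_Force_1 0.00 0 0.0 105.00 128.00 683.00 440.00 0.0 0.0 0.0 0.0 0.0 0.0 0.0"
fields = line.split(" ")
assert len(fields) == len(KITTI_FIELDS) == 15
print(dict(zip(KITTI_FIELDS, fields)))
```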

Would appreciate any advice on how to solve this issue.

Please double-check all of your label files to make sure each of them has only 15 fields.

Thanks for your response. Do you have a recommendation for how we can check that all the files have 15 fields? This is the code we are currently using:

import os

rootdir = "/workspace/maskrcnn_experiment/yolo/Shoe_Data/training/label_2"
for subdir, dirs, files in os.walk(rootdir):
    for file in files:
        if file.endswith('.txt'):
            path = os.path.join(subdir, file)
            with open(path, 'r') as f:
                # note: this joins all lines of the file into one string
                data = f.read().replace('\n', '')
            x = data.split(" ")
            print(x)
            print(len(x))

Please run the script below to check each line of your label files. Also make sure there are no empty lines.

import glob

rootdir = "./label"

for i in glob.glob(rootdir + "/*.txt"):
    with open(i, 'r') as j:
        for line in j.readlines():
            label = line.strip()
            length = len(label.split(" "))
            print("This label has {} fields".format(length))
            assert length == 15, 'Ground truth kitti labels should have only 15 fields, and there should be no empty lines. Please check the labels in the file %s' % i
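If the assertion fires and you want to pinpoint every offending line at once instead of stopping at the first one, a small variant (a sketch assuming the same `./label` directory layout; `find_bad_lines` is just an illustrative helper name) can report the file, line number, and field count:

```python
import glob

def find_bad_lines(rootdir, expected=15):
    """Return (path, lineno, n_fields, text) for every line that does not
    have exactly `expected` space-separated fields. An empty line strips
    to "" and splits to a single field, so empty lines are flagged too."""
    bad = []
    for path in glob.glob(rootdir + "/*.txt"):
        with open(path, "r") as f:
            for lineno, line in enumerate(f, start=1):
                stripped = line.strip()
                n = len(stripped.split(" "))
                if n != expected:
                    bad.append((path, lineno, n, stripped))
    return bad

for path, lineno, n, text in find_bad_lines("./label"):
    print("{}:{}: expected 15 fields, found {} -> {!r}".format(path, lineno, n, text))
```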

That worked - thanks so much!