@Morganh
Thank you for your reply and advice.
I am sorry, I made a mistake: it was not classification_tf2 but detectnet_v2.
So, here is what I did and the error I encountered.
I followed the detectnet_v2.ipynb notebook and ran several cells.
# I only changed this one.
os.environ["LOCAL_PROJECT_DIR"] = "/home/ym7/tao-jupyter/getting_started_v5.3.0/notebooks/tao_launcher_starter_kit/detectnet_v2_test"
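In case it helps, this is how I understand the notebook deriving the other local paths from LOCAL_PROJECT_DIR (a sketch based on my reading of the detectnet_v2.ipynb cells; the exact variable names in your copy may differ):

```python
import os

# Sketch of the notebook's path setup: the only value I changed is
# LOCAL_PROJECT_DIR; the data/specs paths are derived from it.
os.environ["LOCAL_PROJECT_DIR"] = "/home/ym7/tao-jupyter/getting_started_v5.3.0/notebooks/tao_launcher_starter_kit/detectnet_v2_test"
os.environ["LOCAL_DATA_DIR"] = os.path.join(os.environ["LOCAL_PROJECT_DIR"], "data")
os.environ["LOCAL_SPECS_DIR"] = os.path.join(os.environ["LOCAL_PROJECT_DIR"], "specs")

print(os.environ["LOCAL_DATA_DIR"])
```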
But when I ran the following cell, a permission error occurred.
!tao model detectnet_v2 dataset_convert -d $SPECS_DIR/spec_tfrecords_kitti.txt \
-o $LOCAL_DATA_DIR/tfrecords_aikata
2024-06-01 23:35:09,257 [TAO Toolkit] [INFO] root 160: Registry: ['nvcr.io']
2024-06-01 23:35:09,318 [TAO Toolkit] [INFO] nvidia_tao_cli.components.instance_handler.local_instance 360: Running command in container: nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5
2024-06-01 23:35:09,549 [TAO Toolkit] [INFO] nvidia_tao_cli.components.docker_handler.docker_handler 301: Printing tty value True
2024-06-01 14:35:10.459957: I tensorflow/stream_executor/platform/default/dso_loader.cc:50] Successfully opened dynamic library libcudart.so.12
2024-06-01 14:35:10,495 [TAO Toolkit] [WARNING] tensorflow 40: Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
Using TensorFlow backend.
2024-06-01 14:35:11,562 [TAO Toolkit] [WARNING] tensorflow 43: TensorFlow will not use sklearn by default. This improves performance in some cases. To enable sklearn export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:11,591 [TAO Toolkit] [WARNING] tensorflow 42: TensorFlow will not use Dask by default. This improves performance in some cases. To enable Dask export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:11,595 [TAO Toolkit] [WARNING] tensorflow 43: TensorFlow will not use Pandas by default. This improves performance in some cases. To enable Pandas export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:12,635 [TAO Toolkit] [WARNING] matplotlib 500: Matplotlib created a temporary config/cache directory at /tmp/matplotlib-fbqv3xf1 because the default path (/.config/matplotlib) is not a writable directory; it is highly recommended to set the MPLCONFIGDIR environment variable to a writable directory, in particular to speed up the import of Matplotlib and to better support multiprocessing.
2024-06-01 14:35:12,807 [TAO Toolkit] [INFO] matplotlib.font_manager 1633: generated new fontManager
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
Using TensorFlow backend.
WARNING:tensorflow:TensorFlow will not use sklearn by default. This improves performance in some cases. To enable sklearn export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:14,027 [TAO Toolkit] [WARNING] tensorflow 43: TensorFlow will not use sklearn by default. This improves performance in some cases. To enable sklearn export the environment variable TF_ALLOW_IOLIBS=1.
WARNING:tensorflow:TensorFlow will not use Dask by default. This improves performance in some cases. To enable Dask export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:14,053 [TAO Toolkit] [WARNING] tensorflow 42: TensorFlow will not use Dask by default. This improves performance in some cases. To enable Dask export the environment variable TF_ALLOW_IOLIBS=1.
WARNING:tensorflow:TensorFlow will not use Pandas by default. This improves performance in some cases. To enable Pandas export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:14,056 [TAO Toolkit] [WARNING] tensorflow 43: TensorFlow will not use Pandas by default. This improves performance in some cases. To enable Pandas export the environment variable TF_ALLOW_IOLIBS=1.
2024-06-01 14:35:14,415 [TAO Toolkit] [INFO] nvidia_tao_tf1.cv.detectnet_v2.dataio.build_converter 87: Instantiating a kitti converter
2024-06-01 14:35:14,416 [TAO Toolkit] [INFO] nvidia_tao_tf1.cv.detectnet_v2.dataio.dataset_converter_lib 71: Creating output directory /home/ym7/tao-jupyter/getting_started_v5.3.0/notebooks/tao_launcher_starter_kit/detectnet_v2_test/data
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/scripts/dataset_convert.py", line 168, in <module>
raise e
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/scripts/dataset_convert.py", line 137, in <module>
main()
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/scripts/dataset_convert.py", line 131, in main
converter = build_converter(dataset_export_config, args.output_filename, args.validation_fold)
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/dataio/build_converter.py", line 91, in build_converter
converter = KITTIConverter(**constructor_kwargs)
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/dataio/kitti_converter_lib.py", line 87, in __init__
super(KITTIConverter, self).__init__(
File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/detectnet_v2/dataio/dataset_converter_lib.py", line 72, in __init__
os.makedirs(output_dir)
File "/usr/lib/python3.8/os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
File "/usr/lib/python3.8/os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
File "/usr/lib/python3.8/os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
[Previous line repeated 3 more times]
File "/usr/lib/python3.8/os.py", line 223, in makedirs
mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: '/home/ym7'
Execution status: FAIL
2024-06-01 23:35:18,806 [TAO Toolkit] [INFO] nvidia_tao_cli.components.docker_handler.docker_handler 363: Stopping container.
How can I solve this PermissionError?
I think I followed the DetectNet_v2 documentation, but did I do something wrong?
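To show my understanding of what might be going on: I believe the launcher only makes the directories listed under "Mounts" in ~/.tao_mounts.json visible inside the container, so a host path outside those mounts is passed through unchanged, and the container-side process would then try to create /home/ym7/... itself. This is my own reasoning, not the actual launcher code, and the mount entry below is hypothetical:

```python
import posixpath

def to_container_path(host_path, mounts):
    """Map a host path to its in-container path using TAO-style mounts.

    mounts: list of {"source": host_dir, "destination": container_dir},
    mirroring the "Mounts" section of ~/.tao_mounts.json (my assumption).
    A path under no mount is returned unchanged, which is exactly when a
    container-side mkdir on a host path can fail with Errno 13.
    """
    for m in mounts:
        src = m["source"].rstrip("/")
        if host_path == src:
            return m["destination"]
        if host_path.startswith(src + "/"):
            return posixpath.join(m["destination"], host_path[len(src) + 1:])
    return host_path  # unmapped: the container sees the raw host path

mounts = [{
    "source": "/home/ym7/tao-jupyter/getting_started_v5.3.0/notebooks/tao_launcher_starter_kit/detectnet_v2_test",
    "destination": "/workspace/tao-experiments",
}]

# A path under the mount is remapped to the container side...
print(to_container_path(mounts[0]["source"] + "/data/tfrecords_aikata", mounts))
# ...but a path outside it is not, so the converter would attempt
# os.makedirs() on the literal host path inside the container.
print(to_container_path("/home/ym7/somewhere_else", mounts))
```

If this picture is right, the fix would be to make sure the output path given to dataset_convert falls under a mounted source directory.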
[FYI]
Terminal
$ pwd
/home/ym7/tao-jupyter/getting_started_v5.3.0/notebooks/tao_launcher_starter_kit/detectnet_v2_test
$ ls -lh
drwxrwxrwx 9 ym7 ym7 4.0K Jun 1 22:34 data
-rw-rw-r-- 1 ym7 ym7 68M May 29 00:03 detectnet_v2_MyTest.ipynb
drwxrwxr-x 3 ym7 ym7 4.0K Jun 1 22:42 detectnet_v2_test
drwxrwxr-x 2 ym7 ym7 4.0K May 31 15:22 specs
...
$ ls -lh data
drwxrwxrwx 6 ym7 ym7 108K May 30 17:49 images_aikata
drwxrwxrwx 5 ym7 ym7 88K May 30 17:48 kitti_labels_aikata
drwxrwxrwx 2 ym7 ym7 4.0K Jun 1 22:34 tfrecords_aikata
...
$ ls -lh specs
-rw-rw-r-- 1 ym7 ym7 670 May 31 12:18 spec_tfrecords_kitti.txt
-rw-rw-r-- 1 ym7 ym7 9.7K May 31 15:22 spec_train_kitti.txt
...
$ jupyter notebook --ip 0.0.0.0 --port 8888 --allow-root
spec_tfrecords_kitti.txt
kitti_config {
root_directory_path: "/workspace/tao-experiments/data/training"
image_dir_name: "images_aikata"
label_dir_name: "labels_aikata"
image_extension: ".jpg"
partition_mode: "random"
num_partitions: 2
val_split: 20
num_shards: 10
}
image_directory_path: "/workspace/tao-experiments/data/training"
target_class_mapping {
key: "barcode"
value: "barcode"
}