When I try to run `tlt detectnet_v2 dataset_convert`,
I get this error:
FileNotFoundError: [Errno 2] No such file or directory: '/workspace/tlt-experiments/detectnet_v2/specs/detectnet_v2_tfrecords_kitti_trainval.txt'
But the file exists in that folder.
Configuration:
To run the docker container I use:
sudo docker run --gpus all -it -v /tmp/.X11-unix:/tmp/.X11-unix -v /var/run/docker.sock:/var/run/docker.sock -v /TLT:/tlt -e DISPLAY=$DISPLAY --network host nvcr.io/nvidia/tlt-streamanalytics:v3.0-py3
Inside /tlt I have these 2 folders:
/tlt_cv_samples_vv1.0.2
/workspace
I run:
nohup jupyter notebook --ip 0.0.0.0 --port 8888 --allow-root
I'm using the /tlt_cv_samples_vv1.0.2/detectnet_v2/detectnet_v2.ipynb notebook.
Env:
# Setting up env variables for cleaner command line commands.
import os
%env KEY=tlt_encode
%env NUM_GPUS=1
%env USER_EXPERIMENT_DIR=/workspace/tlt-experiments/detectnet_v2
%env DATA_DOWNLOAD_DIR=/workspace/tlt-experiments/data
%env LOCAL_PROJECT_DIR=/tlt/workspace/tlt-experiments
# Set this path if you don't run the notebook from the samples directory.
%env NOTEBOOK_ROOT=/tlt/tlt_cv_samples_vv1.0.2/detectnet_v2
# Please define this local project directory that needs to be mapped to the TLT docker session.
# The dataset expected to be present in $LOCAL_PROJECT_DIR/data, while the results for the steps
# in this notebook will be stored at $LOCAL_PROJECT_DIR/detectnet_v2
# !PLEASE MAKE SURE TO UPDATE THIS PATH!.
#os.environ["LOCAL_PROJECT_DIR"] = FIXME
os.environ["LOCAL_DATA_DIR"] = os.path.join(
    os.getenv("LOCAL_PROJECT_DIR", os.getcwd()),
    "data"
)
os.environ["LOCAL_EXPERIMENT_DIR"] = os.path.join(
    os.getenv("LOCAL_PROJECT_DIR", os.getcwd()),
    "detectnet_v2"
)
# The sample spec files are present in the same path as the downloaded samples.
os.environ["LOCAL_SPECS_DIR"] = os.path.join(
    os.getenv("NOTEBOOK_ROOT", os.getcwd()),
    "specs"
)
%env SPECS_DIR=/workspace/tlt-experiments/detectnet_v2/specs
# Showing list of specification files.
!ls -rlt $LOCAL_SPECS_DIR
!ls -rlt $LOCAL_EXPERIMENT_DIR
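To make the cell above easier to follow, here is a minimal sketch of the paths it resolves to, assuming LOCAL_PROJECT_DIR and NOTEBOOK_ROOT have the values set in my env cell (the hardcoded strings below are just those values spelled out):

```python
import os

# Values taken from the %env cell above.
local_project_dir = "/tlt/workspace/tlt-experiments"
notebook_root = "/tlt/tlt_cv_samples_vv1.0.2/detectnet_v2"

# What the os.path.join calls in the notebook cell produce.
local_data_dir = os.path.join(local_project_dir, "data")
local_experiment_dir = os.path.join(local_project_dir, "detectnet_v2")
local_specs_dir = os.path.join(notebook_root, "specs")

print(local_specs_dir)
# -> /tlt/tlt_cv_samples_vv1.0.2/detectnet_v2/specs
```

So LOCAL_SPECS_DIR points under /tlt (a path inside the streamanalytics container), while SPECS_DIR is the path the `tlt` launcher expects inside its own task container.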
Mount:
# Mapping up the local directories to the TLT docker.
import json
mounts_file = os.path.expanduser("~/.tlt_mounts.json")
# Define the dictionary with the mapped drives
drive_map = {
    "Mounts": [
        # Mapping the data directory
        {
            "source": os.environ["LOCAL_PROJECT_DIR"],
            "destination": "/workspace/tlt-experiments"
        },
        # Mapping the specs directory.
        {
            "source": os.environ["LOCAL_SPECS_DIR"],
            "destination": os.environ["SPECS_DIR"]
        },
    ]
}
# Writing the mounts file.
with open(mounts_file, "w") as mfile:
    json.dump(drive_map, mfile, indent=4)
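To double-check the mounts, here is a small helper I could run in a notebook cell (a sketch only; `check_mounts` is my own name, and the `tlt` launcher resolves the "source" paths on the machine where the `tlt` command is invoked):

```python
import os

def check_mounts(drive_map):
    """For each mount, report whether the source directory exists
    on this side of the docker boundary."""
    results = []
    for mount in drive_map["Mounts"]:
        results.append((mount["source"],
                        mount["destination"],
                        os.path.isdir(mount["source"])))
    return results

# The same dictionary written to ~/.tlt_mounts.json above,
# with the env variables expanded to their values.
drive_map = {
    "Mounts": [
        {"source": "/tlt/workspace/tlt-experiments",
         "destination": "/workspace/tlt-experiments"},
        {"source": "/tlt/tlt_cv_samples_vv1.0.2/detectnet_v2/specs",
         "destination": "/workspace/tlt-experiments/detectnet_v2/specs"},
    ]
}
for src, dst, ok in check_mounts(drive_map):
    print(f"{src} -> {dst} (source exists here: {ok})")
```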
!tlt info
Configuration of the TLT Instance
dockers: ['nvidia/tlt-streamanalytics', 'nvidia/tlt-pytorch']
format_version: 1.0
tlt_version: 3.0
published_date: 04/16/2021
print("TFrecords conversion spec file for kitti training")
!cat $LOCAL_SPECS_DIR/detectnet_v2_tfrecords_kitti_trainval.txt
TFrecords conversion spec file for kitti training
kitti_config {
  root_directory_path: "/workspace/tlt-experiments/data/training"
  image_dir_name: "uimages"
  label_dir_name: "ulabel"
  image_extension: ".png"
  partition_mode: "random"
  num_partitions: 2
  val_split: 14
  num_shards: 10
}
image_directory_path: "/workspace/tlt-experiments/data/training"
# Creating a new directory for the output tfrecords dump.
print("Converting Tfrecords for kitti trainval dataset")
!tlt detectnet_v2 dataset_convert \
-d $SPECS_DIR/detectnet_v2_tfrecords_kitti_trainval.txt \
-o $DATA_DOWNLOAD_DIR/tfrecords/kitti_trainval/kitti_trainval
Converting Tfrecords for kitti trainval dataset
2021-06-28 17:44:22,351 [INFO] root: Registry: ['nvcr.io']
2021-06-28 17:44:22,406 [WARNING] tlt.components.docker_handler.docker_handler:
Docker will run the commands as root. If you would like to retain your
local host permissions, please add the "user":"UID:GID" in the
DockerOptions portion of the ~/.tlt_mounts.json file. You can obtain your
users UID and GID by using the "id -u" and "id -g" commands on the
terminal.
Using TensorFlow backend.
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
Using TensorFlow backend.
Traceback (most recent call last):
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/dataset_convert.py", line 104, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/dataset_convert.py", line 93, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/dataset_convert.py", line 84, in main
FileNotFoundError: [Errno 2] No such file or directory: '/workspace/tlt-experiments/detectnet_v2/specs/detectnet_v2_tfrecords_kitti_trainval.txt'
2021-06-28 17:44:27,122 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
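As a sanity check, I also confirmed that the missing path is covered by the mount destinations from my ~/.tlt_mounts.json (a minimal sketch; the destination list is copied from the mounts cell above):

```python
# Path reported by the FileNotFoundError.
missing = "/workspace/tlt-experiments/detectnet_v2/specs/detectnet_v2_tfrecords_kitti_trainval.txt"

# Destinations defined in ~/.tlt_mounts.json.
destinations = [
    "/workspace/tlt-experiments",
    "/workspace/tlt-experiments/detectnet_v2/specs",
]

# Which mounts should make the missing file visible in the task container.
covered = [d for d in destinations if missing.startswith(d + "/")]
print(covered)
# -> ['/workspace/tlt-experiments', '/workspace/tlt-experiments/detectnet_v2/specs']
```

Both mounts cover the path, so the file should be visible inside the container the `tlt` command launches; yet the error says it is not there.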