Please provide the following information when requesting support.
• Hardware (T4/V100/Xavier/Nano/etc): Laptop with GPU - GTX 1650
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc): Unet
• TLT Version (Please run "tlt info --verbose" and share "docker_tag" here): toolkit_version: 4.0.1 and 4.0.0-tf1.15.5
• Training spec file (if you have one, please share here): None
• How to reproduce the issue? (This is for errors. Please share the command line and the detailed log here.)
Hello, I am trying to run the Binary Semantic Segmentation using TAO UNet notebook from /unet/tao_isbi, but I have run into Docker mapping issues. In section 2 of the notebook, I tried converting my own COCO dataset to UNet format with !tao unet dataset_convert -f [] -r [], but no masks are produced and I also get the following error:
2023-03-07 18:33:38,138 - root - INFO - Starting Semantic Segmentation Dataset to VOC Convert.
loading annotations into memory...
2023-03-07 18:33:38,138 - root - INFO - Conversion failed with following error: [Errno 2] No such file or directory: './unet/tao_isbi/data/isbi/images/trainval.json'.
Telemetry data couldn’t be sent, but the command ran successfully.
[WARNING]: <urlopen error [Errno -5] No address associated with hostname>
Execution status: PASS
2023-03-07 11:33:39,209 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
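If I understand the launcher correctly, dataset_convert runs inside the container and can only see paths that fall under a mount "destination", which would explain the [Errno 2] above. Here is a minimal sketch of that path translation; to_container_path is my own illustrative helper (not part of the TAO launcher), using the project mount from my mounts file:

```python
# Sketch: the container only sees host paths that sit under a configured
# mount, remapped from "source" to "destination". A host-relative path
# such as "./unet/tao_isbi/..." never matches any mount.
def to_container_path(host_path, mounts):
    """Return the in-container path for an absolute host path, or None."""
    for m in mounts:
        src = m["source"].rstrip("/")
        if host_path == src or host_path.startswith(src + "/"):
            return m["destination"].rstrip("/") + host_path[len(src):]
    return None  # path is not visible inside the container

# The single project mount from my ~/.tao_mounts.json:
mounts = [{"source": "/home/dari/PycharmProjects/conda/tao-experiments",
           "destination": "/workspace/tao-experiments"}]

# An absolute host path under the mount maps cleanly:
print(to_container_path(
    "/home/dari/PycharmProjects/conda/tao-experiments/data/isbi/images/trainval.json",
    mounts))  # -> /workspace/tao-experiments/data/isbi/images/trainval.json

# A host-relative path like the one in the error maps to nothing:
print(to_container_path(
    "./unet/tao_isbi/data/isbi/images/trainval.json", mounts))  # -> None
```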
The path to the JSON file is correct, but it is not being read. I figured it is because of the Docker mapping: when I run !cat ~/.tao_mounts.json a few cells above, the only entry in "DockerOptions" is "user": "0:0", which I think is not supposed to happen. So I guess my environment variables have been wrongly defined, but I do not understand the file hierarchy. Can I get some help with this, please? I attach the env variable definition cell and the mount output here:
First cell:
# Setting up env variables for cleaner command line commands.
import os
%set_env KEY=nvidia_tlt
%set_env GPU_INDEX=0
%env NUM_GPUS=1
# %set_env USER_EXPERIMENT_DIR=/workspace/tao-experiments/unet
# %set_env DATA_DOWNLOAD_DIR=/workspace/tao-experiments/data
# %set_env USER_EXPERIMENT_DIR=/unet/tao_isbi/experiments/unet
# %set_env DATA_DOWNLOAD_DIR=/unet/tao_isbi/experiments/data
%set_env USER_EXPERIMENT_DIR=/home/dari/PycharmProjects/conda/tao-experiments/unet
%set_env DATA_DOWNLOAD_DIR=/home/dari/PycharmProjects/conda/tao-experiments/data
# Set this path if you don't run the notebook from the samples directory.
# %env NOTEBOOK_ROOT=~/tao-samples/unet
# %env NOTEBOOK_ROOT= ./unet/tao_isbi
# Please define this local project directory that needs to be mapped to the TAO docker session.
# The dataset expected to be present in $LOCAL_PROJECT_DIR/data, while the results for the steps
# in this notebook will be stored at $LOCAL_PROJECT_DIR/unet
# !PLEASE MAKE SURE TO UPDATE THIS PATH!.
# os.environ["LOCAL_PROJECT_DIR"] = './unet/tao_isbi'
os.environ["LOCAL_PROJECT_DIR"] = '/home/dari/PycharmProjects/conda/tao-experiments'
# !PLEASE MAKE SURE TO UPDATE THIS PATH!.
# Point to the 'deps' folder in samples from where you are launching notebook inside unet folder.
# %env PROJECT_DIR=/workspace/iva/ngc-collaterals/cv/samples
%env PROJECT_DIR=/home/dari/PycharmProjects/conda/getting_started_v4.0.0/notebooks/tao_launcher_starter_kit
os.environ["LOCAL_DATA_DIR"] = os.path.join(
os.getenv("LOCAL_PROJECT_DIR", os.getcwd()),
"data"
)
os.environ["LOCAL_EXPERIMENT_DIR"] = os.path.join(
os.getenv("LOCAL_PROJECT_DIR", os.getcwd()),
"unet"
)
# The sample spec files are present in the same path as the downloaded samples.
os.environ["LOCAL_SPECS_DIR"] = os.path.join(
os.getenv("NOTEBOOK_ROOT", os.getcwd()),
"specs"
)
%set_env SPECS_DIR=/home/dari/PycharmProjects/conda/tao-experiments/unet/specs
!ls -l $LOCAL_DATA_DIR
!ls -rlt $LOCAL_SPECS_DIR
Output:
env: KEY=nvidia_tlt
env: GPU_INDEX=0
env: NUM_GPUS=1
env: USER_EXPERIMENT_DIR=/home/dari/PycharmProjects/conda/tao-experiments/unet
env: DATA_DOWNLOAD_DIR=/home/dari/PycharmProjects/conda/tao-experiments/data
env: PROJECT_DIR=/home/dari/PycharmProjects/conda/getting_started_v4.0.0/notebooks/tao_launcher_starter_kit
env: SPECS_DIR=/home/dari/PycharmProjects/conda/tao-experiments/unet/specs
total 0
total 12
-rw-rw-r-- 1 dari dari 1394 Dec 14 20:37 unet_train_resnet_unet_isbi.txt
-rw-rw-r-- 1 dari dari 1255 Dec 14 20:37 unet_train_resnet_unet_isbi_retrain.txt
-rw-rw-r-- 1 dari dari 1274 Dec 14 20:37 unet_train_resnet_unet_isbi_retrain_qat.txt
Mount cell:
# Mapping up the local directories to the TAO docker.
import json
mounts_file = os.path.expanduser("~/.tao_mounts.json")
# Define the dictionary with the mapped drives
drive_map = {
    "Mounts": [
        # Mapping the data directory
        {
            "source": os.environ["LOCAL_PROJECT_DIR"],
            "destination": "/workspace/tao-experiments"
            # "destination": "/unet/tao_isbi/"
        },
        # Mapping the specs directory.
        {
            "source": os.environ["LOCAL_SPECS_DIR"],
            "destination": os.environ["SPECS_DIR"]
        },
    ],
    "DockerOptions": {
        # Preserving the same permissions in the docker as on the host machine.
        "user": "{}:{}".format(os.getuid(), os.getgid())
    }
}

# Writing the mounts file.
with open(mounts_file, "w") as mfile:
    json.dump(drive_map, mfile, indent=4)
Output of !cat ~/.tao_mounts.json:
{
    "Mounts": [
        {
            "source": "/home/dari/PycharmProjects/conda/tao-experiments",
            "destination": "/workspace/tao-experiments"
        },
        {
            "source": "/home/dari/PycharmProjects/conda/getting_started_v4.0.0/notebooks/tao_launcher_starter_kit/unet/tao_isbi/specs",
            "destination": "/home/dari/PycharmProjects/conda/tao-experiments/unet/specs"
        }
    ],
    "DockerOptions": {
        "user": "0:0"
    }
}
Thank you in advance.
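In case it helps, this is the small check I have been using on the generated mounts file: every "source" should exist on the host and every "destination" should be an absolute container path. check_mounts is purely my own illustrative helper, assuming the ~/.tao_mounts.json layout shown above:

```python
# Sanity-check a tao mounts file: sources must exist on the host,
# destinations must be absolute paths inside the container.
import json
import os

def check_mounts(mounts_file):
    """Return a list of human-readable problems found in a mounts file."""
    with open(os.path.expanduser(mounts_file)) as f:
        cfg = json.load(f)
    problems = []
    for m in cfg.get("Mounts", []):
        if not os.path.isdir(m["source"]):
            problems.append("missing source: " + m["source"])
        if not m["destination"].startswith("/"):
            problems.append("non-absolute destination: " + m["destination"])
    return problems

# Only run against the real file if it exists on this machine:
if os.path.exists(os.path.expanduser("~/.tao_mounts.json")):
    for problem in check_mounts("~/.tao_mounts.json"):
        print(problem)
```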