Run tlt detectnet_v2 with test driven development

Hello,

I am trying to generate the TFRecords using this command

tlt detectnet_v2 dataset_convert -d /workspace/tlt-experiments/config/kitti_trainval.txt -o /workspace/tlt-experiments/data/tfrecords/kittiTrain/

in a Python script developed with TDD (Test Driven Development), but I get this error:

the input device is not a TTY
2021-04-21 13:21:14,980 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.

When I check the tfrecords directory I can't find anything!
Note that I do manage to generate them when I run the Python script without TDD.
What additional configuration is needed to get the expected results?

THANK YOU

Can you share more details about TDD? Is it a specific docker?

For the error you mentioned, please double check the path/env_variable.

TDD (Test Driven Development) is a development technique in which the tests are written before the source code itself.
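For example, the test for the conversion step is written first, before the code that performs it exists. A minimal pytest-style sketch of what such a test looks like (the converter module, the run_dataset_convert function and the paths are only placeholders to illustrate the idea, not my actual project code):

# test_dataset_convert.py -- the test is written first (TDD), so this import
# fails until the converter module is actually implemented ("red" phase).
from converter import run_dataset_convert

def test_tfrecords_are_generated(tmp_path):
    out_dir = tmp_path / "tfrecords"
    run_dataset_convert(
        spec="/workspace/tlt-experiments/config/kitti_trainval.txt",
        output_dir=str(out_dir),
    )
    # The test pins down the expected behaviour: at least one record file appears.
    assert out_dir.exists() and any(out_dir.iterdir())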

Regarding the error, I checked the path carefully, but still without success.

In your TDD environment, can you log in to the TLT 3.0 docker and then run the command detectnet_v2 dataset_convert ... ?

I tried to do

os.system("/home/sylia/.local/bin/tlt detectnet_v2 dataset_convert -d /workspace/tlt-experiments/config/kitti_trainval.txt -o /workspace/tlt-experiments/data/tfrecords/kittiTrain/")

in my Python script (TDD), but it gives me this error

the input device is not a TTY
2021-04-22 15:49:03,926 [INFO] tlt.components.docker_handler.docker_handler: Stopping container

I checked the path carefully, but without success.
When I do os.system("tlt --help") I can see the output of tlt, but with tlt detectnet_v2 --help I always get the same error.
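As a side note, one way to see the full launcher output from the script, instead of only the exit status that os.system returns, would be to call the launcher through subprocess and capture stdout/stderr; a minimal sketch, assuming the same launcher path:

import subprocess

# Capture everything the launcher prints so the full error is visible
# from inside the test run (os.system only returns an exit status).
result = subprocess.run(
    ["/home/sylia/.local/bin/tlt", "detectnet_v2", "--help"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True,
)
print("return code:", result.returncode)
print(result.stdout)
print(result.stderr)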

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

In your TDD environment, can you log in to the TLT 3.0 docker?

@sylia
Do you still meet the error?

yes

Can you share the full log of the following:

  • how you set up the TDD environment
  • how you installed the tlt launcher in the TDD environment
  • how you run the tlt command in the TDD environment

For the TDD, it is test driven development in Python under VSCode.
I have a local project, I created the .tlt_mounts.json, and I pulled the image with

docker pull nvcr.io/nvidia/tlt-streamanalytics:v3.0-dp-py3

then ran this command in the shell

docker run -it -v /var/run/docker.sock:/var/run/docker.sock -v ~/antt/:/workspace/tlt-experiments nvcr.io/nvidia/tlt-streamanalytics:v3.0-dp-py3 /bin/bash

The installation of the tlt launcher was done in the shell.
To run tlt under TDD I do

os.system("/home/sylia/.local/bin/tlt --help")

With this last call I manage to get the output of tlt.
But the problem is when I run

os.system("/home/sylia/.local/bin/tlt detectnet_v2 --help")

I receive the error

the input device is not a TTY
2021-05-07 14:50:14,602 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
Under VSCode I can see Docker and the current container.
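For completeness, the conversion itself is exercised from the TDD side roughly like the following pytest-style sketch (pytest, the launcher path and the container-side paths are taken from the commands above; the final check would have to use the host-side directory that ~/antt is mounted to):

# test_kitti_tfrecords.py -- rough sketch of the failing test case.
import pathlib
import subprocess

TLT = "/home/sylia/.local/bin/tlt"
SPEC = "/workspace/tlt-experiments/config/kitti_trainval.txt"
OUT = "/workspace/tlt-experiments/data/tfrecords/kittiTrain/"

def test_dataset_convert_writes_tfrecords():
    proc = subprocess.run(
        [TLT, "detectnet_v2", "dataset_convert", "-d", SPEC, "-o", OUT],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True,
    )
    # Surface the launcher output in the test report if the command fails.
    assert proc.returncode == 0, proc.stdout + proc.stderr
    # OUT is the container-side path; when the test runs on the host this
    # should point at the corresponding directory under ~/antt instead.
    assert any(pathlib.Path(OUT).glob("*"))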

See the log below; I cannot reproduce the error.

(venv_3.0) morganh@dl:~/venv_3.0$ python
Python 3.6.9 (default, Jan 26 2021, 15:33:00)
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import os

>>> os.system("/home/morganh/venv_3.0/bin/tlt --help")
usage: tlt [-h]
{list,stop,info,augment,classification,detectnet_v2,dssd,emotionnet,faster_rcnn,fpenet,gazenet,gesturenet,heartratenet,intent_slot_classification,lprnet,mask_rcnn,punctuation_and_capitalization,question_answering,retinanet,speech_to_text,ssd,text_classification,tlt-converter,token_classification,unet,yolo_v3,yolo_v4}

Launcher for TLT

optional arguments:
-h, --help show this help message and exit

tasks:
{list,stop,info,augment,classification,detectnet_v2,dssd,emotionnet,faster_rcnn,fpenet,gazenet,gesturenet,heartratenet,intent_slot_classification,lprnet,mask_rcnn,punctuation_and_capitalization,question_answering,retinanet,speech_to_text,ssd,text_classification,tlt-converter,token_classification,unet,yolo_v3,yolo_v4}
0
>>> os.system("/home/morganh/venv_3.0/bin/tlt detectnet_v2 --help")
2021-05-09 16:01:00,697 [WARNING] tlt.components.docker_handler.docker_handler:
Docker will run the commands as root. If you would like to retain your
local host permissions, please add the "user":"UID:GID" in the
DockerOptions portion of the ~/.tlt_mounts.json file. You can obtain your
users UID and GID by using the "id -u" and "id -g" commands on the
terminal.
Using TensorFlow backend.
usage: detectnet_v2 [-h] [--gpus GPUS] [--gpu_index GPU_INDEX [GPU_INDEX ...]]
[--use_amp] [--log_file LOG_FILE]
{calibration_tensorfile,dataset_convert,evaluate,export,inference,prune,train}

Transfer Learning Toolkit

optional arguments:
-h, --help show this help message and exit
--gpus GPUS The number of GPUs to be used for the job.
--gpu_index GPU_INDEX [GPU_INDEX ...]
The indices of the GPU's to be used.
--use_amp Flag to enable Auto Mixed Precision.
--log_file LOG_FILE Path to the output log file.

tasks:
{calibration_tensorfile,dataset_convert,evaluate,export,inference,prune,train}
2021-05-09 16:01:09,098 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
0
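As a follow-up to the warning in that log: the "user" option it mentions goes into the DockerOptions section of ~/.tlt_mounts.json. A minimal sketch of writing such a file from Python, assuming the Mounts/DockerOptions layout referenced by the warning and the ~/antt mount used earlier in this thread:

import json
import os

# Sketch of a ~/.tlt_mounts.json with the local-permission option the
# launcher warning suggests; adjust the mount to your own directories.
mounts = {
    "Mounts": [
        {
            "source": os.path.expanduser("~/antt"),
            "destination": "/workspace/tlt-experiments",
        }
    ],
    "DockerOptions": {
        # Equivalent of "id -u" and "id -g" from the warning above.
        "user": "{}:{}".format(os.getuid(), os.getgid()),
    },
}

with open(os.path.expanduser("~/.tlt_mounts.json"), "w") as f:
    json.dump(mounts, f, indent=4)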