Please provide the following information when requesting support.
• Hardware (T4/V100/Xavier/Nano/etc) : Any
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc) : Any
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here) : “nvidia/tao/tao-toolkit-tf: v3.21.11-tf1.15.5-py3”
• Training spec file (if you have one, please share it here) : None
• How to reproduce the issue? (This is for errors. Please share the command line and the detailed log here.) : None
I have various .etlt models and serialized engine models (which were previously converted with tao-converter).
I need to know the input names and dimensions of the .etlt models because tao-converter requires the -p option for dynamic-batch models.
I also need to know the input/output names and shapes of the serialized engine models for the Triton configuration.
Is there any standard way to get the input/output names and dimension info of .etlt / serialized-engine models?
trtexec --loadEngine=a.engine --exportOutput=abc.json prints some info, but:
• this approach can be applied only to a serialized engine file;
• the reported dimensions sometimes include the batch dimension and sometimes do not, depending on the model.
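For what it's worth, the batch-dimension inconsistency (implicit-batch engines omit it, explicit-batch engines carry it) can be smoothed over with a small helper. This is a hypothetical sketch, not part of trtexec: the `normalize_shape` function and its arguments are my own names, and it assumes you already know whether the engine was built in implicit-batch mode.

```python
# Hypothetical helper: normalize engine binding shapes so they always
# carry an explicit batch dimension in front. `implicit_batch` should be
# True for engines built in implicit-batch mode (their reported shapes
# omit the batch dimension); -1 marks a dynamic batch size.
def normalize_shape(shape, implicit_batch, batch=-1):
    shape = tuple(shape)
    if implicit_batch:
        # Implicit-batch engines report only (C, H, W); prepend the batch dim.
        return (batch,) + shape
    # Explicit-batch shapes already include the batch dimension.
    return shape

# Implicit-batch engine reporting (3, 544, 960):
print(normalize_shape((3, 544, 960), implicit_batch=True))    # (-1, 3, 544, 960)
# Explicit-batch engine already reporting (-1, 3, 544, 960):
print(normalize_shape((-1, 3, 544, 960), implicit_batch=False))
```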
Thank you very much.
I confirmed that I can get the name, shape, and explicitBatch/implicitBatch information for serialized engine models. I would have liked JSON output if possible, but that is fine because the text output can be parsed.
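Since the text output can be parsed, here is a hedged sketch of converting it to the JSON I was after. The line format shown (`name: (d0, d1, ...)`) is an assumption on my part; the regex would need to be adapted to whatever the tool actually prints.

```python
import json
import re

# Hypothetical parser: turn lines of the form "name: (d0, d1, ...)"
# into a {name: [dims]} mapping, then dump it as JSON.
# The exact text format is an assumption -- adjust the regex as needed.
LINE_RE = re.compile(r"^\s*(\S+)\s*:\s*\(([^)]*)\)\s*$")

def parse_bindings(text):
    bindings = {}
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if m:
            name = m.group(1)
            dims = [int(d) for d in m.group(2).split(",") if d.strip()]
            bindings[name] = dims
    return bindings

# Example with made-up binding names in the assumed format:
sample = """\
input_1: (-1, 3, 544, 960)
output_bbox/BiasAdd: (-1, 16, 34, 60)
"""
print(json.dumps(parse_bindings(sample), indent=2))
```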
I still have two questions.
Is there any way to do this for .etlt models?
This is a minor issue for me, but the output dtype seems to be float32 for an INT8 model. Is that expected behavior? The model was converted with the trtexec -t int8 command.