Please provide the following information when requesting support.
• Hardware (T4/V100/Xavier/Nano/etc)
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc) Classification_tf2
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here) v5.0.0
• Training spec file (if you have one, please share it here)
• How to reproduce the issue? (This is for errors. Please share the command line and the detailed log here.)
I would like to access the model file before it is internally converted to a .tlt file, or alternatively to decode the .tlt file back to .hdf5 or .pb. Can you tell me how to do that? My current deployment path (exporting the .tlt model to .etlt, converting that to ONNX, and finally converting the ONNX model to tflite) introduces a lot of non-native ops and sometimes fails outright. I can see that code for this decoding exists in the source, but I am not sure how to use it.
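For context, this is roughly what I was hoping to run inside the TAO container, based on my reading of the source. It is only a sketch under my own assumptions: I am guessing that the .tlt file is an encrypted .hdf5, that the encoding helper lives at `nvidia_tao_tf1.encoding`, and that `decode()` takes file objects plus the training key. Please correct the module path, function name, or arguments if they are wrong.

```python
# Sketch only, not verified: assumes the .tlt is an encrypted .hdf5 and that
# the TAO source tree exposes a decode() helper at this (guessed) module path.
from nvidia_tao_tf1.encoding import encoding

TLT_MODEL = "model.tlt"    # encrypted model produced by training
OUT_MODEL = "model.hdf5"   # plain Keras weights I want to recover
KEY = "my_encoding_key"    # placeholder: the key used when training the model

with open(TLT_MODEL, "rb") as f_in, open(OUT_MODEL, "wb") as f_out:
    # Guessed signature: decode(input_file, output_file, key_bytes)
    encoding.decode(f_in, f_out, KEY.encode())
```

If something along these lines is supported, a pointer to the correct entry point would already solve my problem, since I could then load the .hdf5 directly and convert to tflite without the ONNX round trip.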