Thanks so much for replying to my questions so far. Your assistance is greatly appreciated.
In the Transfer Learning Toolkit (TLT), I am trying to use tlt-converter to convert a .etlt file into a .trt engine file. Below is the command I ran and the error I get. How do I solve this problem? A quick reply would be much appreciated, since our project is currently blocked on this issue (picture is attached):
sudo ./tlt-converter /opt/nvidia/deepstream/deepstream-5.0/samples/export_to_tx2/final_model.etlt -k NTI3ZTQ1azE0Yjc0bWFmcW81cHRtaXA1OXE6ZDdjNDlkOWYtZjgxMS00ZTI2LTkxMWYtMTAzYmI5ODljYzNj -c /opt/nvidia/deepstream/deepstream-5.0/samples/export_to_tx2/final_model_int8_cache.bin -o predictions/Softmax -d 3,224,224 -i nchw -m 64 -t int8 -e /opt/nvidia/deepstream/deepstream-5.0/samples/export_to_tx2/out_classification.trt -b 4
./tlt-converter: error while loading shared libraries: libnvinfer.so.5: cannot open shared object file: No such file or directory
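In case it helps with diagnosis, these are the commands I can run on the device to check which TensorRT runtime libraries are actually installed (the package listing assumes a Debian-based JetPack install; that part is my assumption). My understanding is that libnvinfer.so.5 belongs to TensorRT 5.x, so I suspect this tlt-converter build expects a different TensorRT major version than the one on my device:

```shell
# List the libnvinfer shared libraries visible to the dynamic linker
ldconfig -p | grep libnvinfer || true

# List installed TensorRT packages (assumes a Debian-based JetPack install)
dpkg -l | grep -i tensorrt || true
```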
I also have a second question:
I exported a .etlt file from a classification model and tried to integrate it into DeepStream using option 1, but I don't know how to write the configuration files for a .etlt classification model with that option. Could you please send me sample configuration files for a classification model (not detection) for integration into DeepStream using option 1?
In particular, I am not sure how to reference the required files inside the configuration files.
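For context, here is the kind of nvinfer config I have been trying to adapt from the detection samples. The property names are standard DeepStream nvinfer keys, but the specific values below (the label file name, the input blob name, the thresholds) are my own guesses for the classification case, which is exactly what I would like checked:

```ini
[property]
gpu-id=0
net-scale-factor=1.0
# My exported classification model and its INT8 calibration cache
tlt-encoded-model=/opt/nvidia/deepstream/deepstream-5.0/samples/export_to_tx2/final_model.etlt
tlt-model-key=NTI3ZTQ1azE0Yjc0bWFmcW81cHRtaXA1OXE6ZDdjNDlkOWYtZjgxMS00ZTI2LTkxMWYtMTAzYmI5ODljYzNj
int8-calib-file=/opt/nvidia/deepstream/deepstream-5.0/samples/export_to_tx2/final_model_int8_cache.bin
# Guess: a plain text file with one class name per line
labelfile-path=labels.txt
# Input dims matching the -d 3,224,224 I passed to tlt-converter
uff-input-dims=3;224;224;0
# Guess: the input blob name of the exported classification model
uff-input-blob-name=input_1
output-blob-names=predictions/Softmax
batch-size=4
# network-mode: 1 = INT8
network-mode=1
# network-type: 1 = classifier
network-type=1
process-mode=1
classifier-threshold=0.2
```

Is this the right set of keys for option 1, and are the guessed values above the ones I need to correct?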