Using the TAO Toolkit and the DetectNet_v2 notebook, I am training the unpruned TrafficCamNet model on my own dataset of car images. The unpruned TrafficCamNet model is a .tlt model. At each model checkpoint, training produces a .hdf5 file and a .ckzip file. From reading the TAO documentation, I would expect a .tlt file at each checkpoint. This is critical because I would like to produce a .etlt model that I can feed to tao-converter after training is done. How can I produce .tlt files from training?
The .tlt file is actually an encrypted .hdf5 file.
The .etlt file is actually an encrypted .onnx file.
Before TAO 5.0, a .tlt file was generated during training and an .etlt file was generated during export.
Since TAO 5.0, an .hdf5 file is generated during training and an .onnx file is generated during export.
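Since the export step in TAO 5.0 writes a plain .onnx file, you can sanity-check it with the standard onnx Python package before handing it to tao deploy or Triton. A minimal sketch, assuming the exported file is named my_model.onnx (placeholder path, not a file from this thread):

```python
# Sketch: verify an exported TAO 5.0 .onnx model with the onnx package.
# "my_model.onnx" is a placeholder path.
import onnx

model = onnx.load("my_model.onnx")   # parse the protobuf
onnx.checker.check_model(model)      # raises if the graph is malformed

# Print the graph's input/output names; useful later for the Triton config.pbtxt.
print("inputs: ", [i.name for i in model.graph.input])
print("outputs:", [o.name for o in model.graph.output])
```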
Also, my end goal is to run a Triton server with my generated TensorRT engine. I've created a TensorRT engine with tao deploy and attempted to run it on a Triton server, but received this error: Error Code 1: Serialization (Serialization assertion safeVersionRead == safeSerializationVersion failed. Version tag does not match. Note: Current Version: 43, Serialized Engine Version: 0)
For Error Code 1: Serialization (Serialization assertion safeVersionRead == safeSerializationVersion failed. Version tag does not match. Note: Current Version: 43, Serialized Engine Version: 0), this is a common error when the TensorRT version used to build the engine is not the same as the one used to run it.
If you build the engine with tao deploy, it uses the docker image nvcr.io/nvidia/tao/tao-toolkit:5.0.0-deploy. Its TensorRT version is likely different from the version inside nvcr.io/nvidia/tritonserver:21.10-py3.
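One way to confirm the mismatch is to print the TensorRT version inside each container and, in the Triton container, try to deserialize the engine directly. A minimal sketch, assuming the engine was saved as model.plan (placeholder path) and that the Python TensorRT bindings are available in the container:

```python
# Sketch: print the TensorRT version and attempt to deserialize an engine.
# "model.plan" is a placeholder path; run this inside both the build
# (tao deploy) and runtime (Triton) containers and compare the versions.
import tensorrt as trt

print("TensorRT version:", trt.__version__)

logger = trt.Logger(trt.Logger.WARNING)
with open("model.plan", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
    # A None engine (or the same serialization assertion in the log)
    # means this TensorRT version cannot load the serialized engine.
    print("engine loaded:", engine is not None)
```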
I am seeing this error:
Traceback (most recent call last):
  File "decode_eff.py", line 65, in <module>
    main()
  File "decode_eff.py", line 60, in main
    decode_eff(args.model, args.output, args.key)
  File "decode_eff.py", line 21, in decode_eff
    eff_art = Archive.restore_artifact(
  File "", line 519, in restore_artifact
  File "/usr/lib/python3.8/tarfile.py", line 1621, in open
    return func(name, filemode, fileobj, **kwargs)
  File "/usr/lib/python3.8/tarfile.py", line 1678, in gzopen
    raise ReadError("not a gzip file")
tarfile.ReadError: not a gzip file
What is the correct input? Is this the right code to use for decoding an .etlt model to .onnx?
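The tarfile.ReadError: not a gzip file in that traceback means the file handed to the decoder is not the gzip archive it expects (which is what an encrypted .tlt/.etlt should look like, judging by the tarfile.gzopen call), but something else, e.g. a plain .hdf5 checkpoint or an .onnx file. A quick way to see what you actually passed in is to look at the file's first bytes; a minimal sketch, with "my_model.etlt" as a placeholder path:

```python
# Sketch: sniff a model file's magic bytes to see what format it really is.
# "my_model.etlt" is a placeholder path, not a file from this thread.
MAGIC = {
    b"\x1f\x8b": "gzip archive (what the decoder expects for a .tlt/.etlt)",
    b"\x89HDF\r\n\x1a\n": "HDF5 file (TAO 5.0 training checkpoint)",
    b"PK\x03\x04": "zip archive (e.g. a .ckzip checkpoint)",
}

def sniff(path: str) -> str:
    with open(path, "rb") as f:
        head = f.read(8)
    for magic, desc in MAGIC.items():
        if head.startswith(magic):
            return desc
    return f"unknown format (first bytes: {head!r})"

print(sniff("my_model.etlt"))
```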
I'm closing this topic since there has been no update from you for a while, assuming this issue was resolved.
If you still need support, please open a new topic. Thanks.