Hi @nasserha,
Yes, inference can only be run with the .hdf5 model or the TensorRT engine.
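For illustration, a minimal sketch of launching inference through the `tao` CLI with either artifact. The sub-command, spec file, flags, and paths below are placeholders based on typical TAO usage (not taken from this thread), so adapt them to your network and TAO version:

```python
# Minimal sketch (assumptions: the `tao` launcher is installed and the spec/paths
# below are replaced with your own). It only shows that the same inference entry
# point can be pointed at the trained .hdf5 model or at the exported TRT engine.
import subprocess

model_path = "results/weights/model.hdf5"   # hypothetical path to the trained .hdf5
# model_path = "export/model.trt"           # ...or the exported TensorRT engine

subprocess.run(
    [
        "tao", "model", "lprnet", "inference",   # sub-command varies per network
        "-e", "specs/infer_spec.txt",            # hypothetical experiment spec
        "-m", model_path,                         # .hdf5 or TRT engine
        "-i", "data/test_images",                # hypothetical input directory
    ],
    check=True,
)
```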