TensorFlow automatically invokes tf.convert_to_tensor() to convert feed data to the placeholder's dtype. Does TensorRT maintain the same behavior?
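As far as I understand, TensorRT does not perform this implicit conversion: the host buffer you copy to the device must already match the engine binding's dtype (e.g. float32), so any casting has to be done explicitly before inference. A minimal NumPy-only sketch of that explicit cast (the shape is a hypothetical example):

```python
import numpy as np

# TensorRT expects input buffers to already match the binding dtype;
# unlike tf.convert_to_tensor(), it will not convert float64 feed data.
batch = np.random.rand(1, 3, 224, 224)   # NumPy defaults to float64
assert batch.dtype == np.float64

# Cast explicitly (and keep the buffer contiguous for the device copy).
batch = np.ascontiguousarray(batch, dtype=np.float32)
assert batch.dtype == np.float32
```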