SZ1
April 24, 2019, 8:54am
1
Linux distro and version: Ubuntu 18.04
GPU type: TITAN Xp
nvidia driver version: 410.78
CUDA version: 10.0
CUDNN version: 7.5.1
Python version: 3.6
Tensorflow version: 1.13.1
TensorRT version: 5.0.2.6
Hello,
I am using tf-trt to do INT8 calibration with my U-Net model.
from tensorflow.contrib import tensorrt as tftrt
following the calibration sample from here:
https://developer.download.nvidia.com/devblogs/tftrt_sample.tar.xz?spm=a2c4e.11153940.blogcont579985.9.2c9030d0Z0Lock&file=tftrt_sample.tar.xz
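For reference, this is roughly the calibration flow from that sample that I am following (just a sketch; the .pb path, frozen_graph_def, and the output node name below are placeholders for my own model, not the sample's exact code):

import tensorflow as tf
from tensorflow.contrib import tensorrt as tftrt

# Load my frozen FP32 U-Net graph (path is a placeholder).
with tf.gfile.GFile("unet_frozen.pb", "rb") as f:
    frozen_graph_def = tf.GraphDef()
    frozen_graph_def.ParseFromString(f.read())

# Step 1: build the INT8 calibration graph.
calib_graph = tftrt.create_inference_graph(
    input_graph_def=frozen_graph_def,
    outputs=["output/Sigmoid"],          # placeholder output node name
    max_batch_size=1,
    max_workspace_size_bytes=1 << 30,
    precision_mode="INT8")

# Step 2: run representative batches through calib_graph in a tf.Session
# so TensorRT can collect activation ranges (omitted here).

# Step 3: convert the calibrated graph to an INT8 inference graph.
# This is the call that fails with "TensorRT is not enabled!".
infer_graph = tftrt.calib_graph_to_infer_graph(calib_graph)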
ERROR Message:
/lib/python3.6/site-packages/tensorflow/contrib/tensorrt/python/trt_convert.py", line 428, in calib_graph_to_infer_graph
int(msg[0]))
tensorflow.python.framework.errors_impl.FailedPreconditionError: TensorRT is not enabled!
When I print
compiled_version = get_linked_tensorrt_version()
loaded_version = get_loaded_tensorrt_version()
inside ~/anaconda3/envs/tf_gpu/lib/python3.6/site-packages/tensorflow/contrib/tensorrt/python/trt_convert.py, I get:
compiled_version: (0, 0, 0)
loaded_version: (0, 0, 0)
Could you please tell me how to enable TensorRT?
Thanks,
SZ
NVESJ
April 24, 2019, 10:02pm
3
Hello,
You are either using a non-GPU build of TensorFlow or one that was not compiled with TensorRT. Make sure you enable TensorRT when you build TensorFlow.
Thanks.
SZ1
April 25, 2019, 12:48pm
4
Hello,
I am using tensorflow-gpu. Could you please elaborate on how to “enable TensorRT when I build TensorFlow” or how to “compile with TensorRT”? Isn’t importing tf-trt (or TensorRT alongside it) enough?
PyCUDA is also installed, in case that is relevant.
Thanks,
SZ
NVESJ
April 25, 2019, 8:26pm
5
Hello,
If you build your TensorFlow from source, enable TensorRT in your configure step. See: Build from source | TensorFlow
Click on “View sample configuration session”; you can see there is an option to enable TensorRT.
If you’re using pip install for tensorflow-gpu, you should have TF-TRT out of the box.
See: https://docs.nvidia.com/deeplearning/dgx/tf-trt-user-guide/index.html#prereqs
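As a quick check (a rough sketch, assuming the TF 1.13 contrib layout your traceback points at, where those version helpers live in trt_convert.py), the linked and loaded TensorRT versions should be non-zero on a TensorRT-enabled build:

from tensorflow.contrib.tensorrt.python import trt_convert as tc

# On a TensorRT-enabled build both should report a real version, e.g. (5, 0, 2).
# (0, 0, 0) means the TensorFlow binary was not linked against TensorRT.
print(tc.get_linked_tensorrt_version())
print(tc.get_loaded_tensorrt_version())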
Thanks.
NVESJ
SZ1
April 26, 2019, 8:49am
6
Hello,
I installed TensorFlow through conda:
conda install -c anaconda tensorflow-gpu
and I think TF-TRT also just comes with it, since I can run:
from tensorflow.contrib import tensorrt as tftrt
Is there any configuration that I should set in the conda environment?
Thanks,
SZ
NVESJ
April 26, 2019, 9:30pm
7
Hello,
I tried to reproduce the problem by installing tensorflow-gpu with conda and running the sample code from
https://developer.download.nvidia.com/devblogs/tftrt_sample.tar.xz?spm=a2c4e.11153940.blogcont579985.9.2c9030d0Z0Lock&file=tftrt_sample.tar.xz
It works fine for me.
Are you sure the tftrt module you import is the one you got from the conda install of tensorflow-gpu?
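For example, something like this (a rough sketch, not from the sample) would show where the imports actually resolve and whether the build is a GPU one:

import tensorflow as tf
from tensorflow.contrib import tensorrt as tftrt

print(tf.__version__)
print(tf.__file__)       # should point into your conda env's site-packages
print(tftrt.__file__)    # should live under that same tensorflow installation
print(tf.test.is_built_with_cuda())  # True for a GPU build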
Also, to avoid dependency issues, we recommend you use our NGC containers.
SZ1
May 24, 2019, 10:13am
8
Yes, the tftrt that I import is the one from the conda-installed tensorflow-gpu.
I didn’t resolve this TF-TRT problem, but TensorRT works for me. Thanks!
I have the same problem. How can I solve it?