No module named tensorrt

I am migrating from JetPack 3.2 and TensorRT3 to JetPack 3.3 and TensorRT4.

The "Generating TensorRT Engines from TensorFlow" example in the TensorRT 4.0 documentation, which shows how to convert a TensorFlow model so it is usable by TensorRT 4, does not work for me.

I am running on a TX2 with JetPack 3.3. The TensorFlow 1.9 wheel is installed.
Python reports the following:

$ python
Python 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>> print(tf.GIT_VERSION, tf.VERSION)
('18.08-stage-0-ge269373', '1.9.0')
>>> import tensorrt as trt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named tensorrt
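
As a quick sanity check (a minimal sketch, using only the standard library on Python 2.7), you can ask the interpreter whether any tensorrt package is visible at all:

import pkgutil
# find_loader() returns None when no "tensorrt" package is on sys.path,
# which distinguishes a missing install from a broken one.
print(pkgutil.find_loader("tensorrt"))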

What is the proper method for converting a frozen model to a plan?
Is that even the current terminology?

That is what I did for TensorRT3 with an inception-v3 model that was retrained with my images.
The results were very good.
I am told TensorRT 4 is faster.
I would like to take advantage of that improvement.

Hello,

The TensorRT Python API is not supported on the Jetson platform:
http://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#tensorrtworkflow

Are you seeing this error in an x86 Linux environment?

Please follow this page to install TensorRT:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-install-guide/index.html#installing

No, I am not reporting this error for x86. As stated, my environment is the TX2, which has the NVIDIA chip and the GPU.

This appears to be a regression. Previously, in JetPack 3.2 / TensorRT 3, there were Python scripts in a tree headed by tf_to_trt_image_classification; one of them, convert_plan.py, contained frozenToPlan. Is there a version of that tree for JetPack 3.3 / TensorRT 4?

Should that tree work for JetPack 3.3 / TensorRT 4?

Hi,

I think you are referring to this: https://github.com/NVIDIA-Jetson/tf_to_trt_image_classification

The repo is out of date (TRT3), but you can replace step 4 by downloading the TRT4 tar (TensorRT-4.0.1.6.Ubuntu-16.04.4.x86_64-gnu.cuda-9.2.cudnn7.1) and then pip installing the TRT4-specific whl (for example, tensorrt-4.0.1.6-cp35-cp35m-linux_x86_64.whl for Python 3.5).
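
Once that wheel is installed (on an x86 host, given the note above that the Python API is not shipped for Jetson), the frozen-graph-to-plan conversion looks roughly like this with the TRT4-era Python API (a sketch based on the "Generating TensorRT Engines from TensorFlow" workflow; the file names, input shape, and node names are placeholders for your own model):

import uff
import tensorrt as trt
from tensorrt.parsers import uffparser

# Convert the frozen TensorFlow graph to UFF, naming the output node(s).
uff_model = uff.from_tensorflow_frozen_model("frozen_model.pb", ["scores"])

# Describe the model's input and output tensors to the UFF parser.
G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)
parser = uffparser.create_uff_parser()
parser.register_input("input", (3, 299, 299), 0)  # CHW shape, e.g. Inception v3
parser.register_output("scores")

# Build the engine; its serialized form is the "plan" file.
engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser,
                                     1,        # max batch size
                                     1 << 25)  # max workspace size in bytes
trt.utils.write_engine_to_file("model.plan", engine.serialize())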

Let me be sure I understand your suggestion.
I should install the x86 wheel on the TX2.