TensorRT 4 not found for user

The module is found for root, but root's Python is missing other dependencies.
Is there a reasonable way to fix this without installing the whole environment for root as well?

user@ML-UBUNTU:~/packages-source/FastMaskRCNN$ python
Python 2.7.14 |Anaconda, Inc.| (default, Dec  7 2017, 17:05:42)
[GCC 7.2.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorrt as trt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named tensorrt
>>>
user@ML-UBUNTU:~/packages-source/FastMaskRCNN$ sudo python
[sudo] password for user:
Python 2.7.12 (default, Dec  4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorrt as trt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/dist-packages/tensorrt/__init__.py", line 77, in <module>
    from tensorrt import infer, parsers, utils, lite, plugins
  File "/usr/lib/python2.7/dist-packages/tensorrt/infer/__init__.py", line 54, in <module>
    from ._infer_enums import *
  File "/usr/lib/python2.7/dist-packages/tensorrt/infer/_infer_enums.py", line 55, in <module>
    from enum import IntEnum
ImportError: No module named enum
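
So the Anaconda Python under my user does not see the tensorrt module at all, while the system Python that sudo uses finds it but then fails on the missing enum backport. That second error is Python 2.7 specific: enum only became part of the standard library in Python 3.4, so the system interpreter presumably needs the enum34 backport, e.g.:

sudo pip install enum34
(or: sudo apt-get install python-enum34)

That only patches the root environment, though; the actual goal is to import TensorRT as a regular user.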

Update - a solution is described here: https://devtalk.nvidia.com/default/topic/1028460/jetson-tx2/error-importing-tensorrt/2

“Unfortunately, you can’t use anaconda with TensorRT but you can still have anaconda on your system. To make tensorRT work I think it’s better to figure out what version of Python the tensorRT has been installed under and then make that version your default version and install other libraries under that version (tensorflow, …). You don’t have control where to install tensorRT but you’ve control on everything else.”
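
To find out which Python the TensorRT bindings were actually installed under (assuming they came in via the NVIDIA .deb packages), something like this should show it:

dpkg -l | grep -i tensorrt
find /usr -maxdepth 5 -type d -name tensorrt 2>/dev/null

In my case they sit in /usr/lib/python2.7/dist-packages/tensorrt (see the traceback above), i.e. under the system Python 2.7 rather than Anaconda.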

Update and solution:
Configuring the paths makes TensorRT work without root.
Check that you have all of these variables set (e.g. in your ~/.bashrc):

export PATH=/usr/local/cuda-9.0/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/lib:/usr/local/cuda-9.0/lib64:/usr/local/cuda/lib64:/usr/local/cuda-9.0/extras/CUPTI/lib64/:$LD_LIBRARY_PATH

# added by Anaconda2 installer
export PATH="/home/$USER/anaconda2/bin:$PATH"

export PYTHONPATH=/usr/lib/python2.7/dist-packages/:/usr/local/lib/python2.7/site-packages/:$PYTHONPATH
export CUDA_HOME=/usr/local/cuda-9.0
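
The PYTHONPATH entry is the one that matters for the original error: it makes the Anaconda interpreter also search /usr/lib/python2.7/dist-packages, which is where the TensorRT bindings live. A quick sanity check after reloading the shell config (nothing TensorRT-specific assumed here beyond the module importing cleanly):

source ~/.bashrc
python -c "import tensorrt; print(tensorrt.__file__)"

If that prints a path under /usr/lib/python2.7/dist-packages, TensorRT is importable without sudo.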
