How to "install" TensorRT when building from source


I need to build TensorRT with custom plugins. I followed the steps described in the NVIDIA/TensorRT GitHub repository (TensorRT is a C++ library for high-performance inference on NVIDIA GPUs and deep learning accelerators).

After I ran the last command in the instructions:

make -j$(nproc)

the out folder contains the following files:

libnvcaffeparser_static.a        libnvinfer_plugin_static.a
sample_algorithm_selector        sample_char_rnn
sample_dynamic_reshape           sample_fasterRCNN
sample_googlenet                 sample_int8
sample_int8_api                  sample_mlp
sample_mnist                     sample_mnist_api
sample_movielens                 sample_movielens_mps
sample_nmt                       sample_onnx_mnist
sample_onnx_mnist_coord_conv_ac  sample_plugin
sample_reformat_free_io          sample_ssd
sample_uff_fasterRCNN            sample_uff_maskRCNN
sample_uff_mnist                 sample_uff_plugin_v2_ext
sample_uff_ssd                   trtexec

And I do not know what to do with these files next.
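For context, the OSS build also produces shared libraries (for example libnvinfer_plugin.so.* in the build output) alongside the samples and static archives. One common approach, sketched here under the assumption of a default build output directory and a standard Ubuntu library location (both paths are assumptions, adjust them to your setup), is to replace the stock plugin library with the freshly built one:

```shell
# Assumed paths: adjust BUILD_OUT to your actual build output directory and
# TRT_LIB to wherever the stock TensorRT libraries are installed.
BUILD_OUT=build/out
TRT_LIB=/usr/lib/x86_64-linux-gnu

# Back up the stock plugin library before overwriting it.
sudo cp "$TRT_LIB/libnvinfer_plugin.so.8" "$TRT_LIB/libnvinfer_plugin.so.8.bak"

# Install the plugin library built from source (the exact version suffix
# depends on the TensorRT release you built).
sudo cp "$BUILD_OUT"/libnvinfer_plugin.so.8* "$TRT_LIB/"

# Refresh the dynamic linker cache so the new library is picked up.
sudo ldconfig
```

With the system copy of libnvinfer_plugin replaced, anything that links against it (including the Python bindings) will see the custom plugins.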

Hi, the UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so please try the ONNX parser instead.
Please check the link below for details.


Maybe my question was unclear; sorry for that. I meant that I can install TensorRT for Python by following the official instructions:

sudo apt-get install libnvinfer8=${version} libnvonnxparsers8=${version} libnvparsers8=${version} libnvinfer-plugin8=${version} libnvinfer-dev=${version} libnvonnxparsers-dev=${version} libnvparsers-dev=${version} libnvinfer-plugin-dev=${version} python3-libnvinfer=${version}
sudo apt-mark hold libnvinfer8 libnvonnxparsers8 libnvparsers8 libnvinfer-plugin8 libnvinfer-dev libnvonnxparsers-dev libnvparsers-dev libnvinfer-plugin-dev python3-libnvinfer

But I want to be able to work with TensorRT in Python together with my custom plugin.
So my question is: how do I do this?
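One pattern I have seen for this (a minimal sketch, not an official recipe; the library path below is a placeholder for wherever your build put the plugin .so) is to load the custom plugin library into the process with ctypes before building or deserializing an engine, and then register the plugin creators with TensorRT:

```python
import ctypes

# Hypothetical path to the plugin library built from source -- adjust to
# your actual build output location.
PLUGIN_LIB = "build/out/libnvinfer_plugin.so"

def load_plugins(lib_path=PLUGIN_LIB):
    """Load a plugin .so so its creators become visible to TensorRT.

    RTLD_GLOBAL makes the library's symbols available to TensorRT when it
    resolves plugin creators at engine build/deserialization time.
    """
    handle = ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)

    # Import inside the function so the module can be loaded on machines
    # without the tensorrt package installed.
    import tensorrt as trt
    logger = trt.Logger(trt.Logger.WARNING)
    # Register all plugin creators (empty string = default namespace).
    trt.init_libnvinfer_plugins(logger, "")
    return handle
```

After calling load_plugins(), the custom plugin should be found by name when the ONNX parser or the engine deserializer encounters a node that maps to it.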