tiny-tensorrt: a simple, efficient, easy-to-use TensorRT wrapper for CNNs, supporting C++ and Python


a simple, efficient, easy-to-use NVIDIA TensorRT wrapper for CNNs with C++ and Python APIs, supporting Caffe, UFF and ONNX model formats. You will be able to use tiny-tensorrt to deploy your model with just a few lines of code!

// create the engine
trt.CreateEngine(prototxt, caffemodel, engineFile, outputBlob, calibratorData, maxBatchSize, runMode);
// transfer your input data to the TensorRT engine
trt.DataTransfer(input, 0, true);
// run inference
trt.Forward();
// retrieve the network output
trt.DataTransfer(output, outputIndex, false); // you can get outputIndex in the CreateEngine phase

Features
Custom plugin tutorial and well-commented sample! — 2019.12.11 🔥🔥🔥
Custom ONNX model output node — 2019.10.18
Upgrade with TensorRT 6.0.1.5 — 2019.9.29
Support for ONNX, Caffe and TensorFlow models
Support for more models and layers (in progress)
PReLU and upsample plugins
Engine serialization and deserialization
INT8 support for Caffe models
Python API support
Set device

It is an open-source project; the code style is clean and easy to understand and integrate. Feel free to ask me any questions or offer advice, I want to make it better. Thanks!!!

Here is the project's address: GitHub - zerollzeng/tiny-tensorrt: Deploy your model with TensorRT quickly.

Thank you for sharing your work with the TRT community.

Thanks