Steps for TensorRT implementation of MobileNet-SSD on Jetson TX2

Hi, I’m new to the deep-learning field, but I have worked with MobileNet-SSD and created a custom model to detect persons only (and it’s working successfully), so my caffemodel and deploy.prototxt are ready. I have a Jetson TX2 board and have installed TensorRT on it, but I don’t know how to run this MobileNet-SSD Caffe model (or any Caffe model, though I only need MobileNet-SSD) with TensorRT. I followed the usual method for model creation. I’ve seen some posts on MobileNet-SSD here, but each of them starts from the middle or from a specific issue; I haven’t found a complete step-by-step procedure for it.

Sorry for asking like this; I’m a fresher in this field and have been stuck on this for days.

Please help me…


I’d recommend starting in the Jetson forum, specifically:

Thanks, thanks a lot. I have gone through it and reached till here - . I installed DIGITS and am now planning to train my custom dataset with DetectNet (by following the steps mentioned in the link for label creation, etc.). So am I following it right? My main aim is to get the MobileNet-SSD Caffe model running in TensorRT.
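For reference on the label-creation step: DIGITS’ DetectNet workflow expects KITTI-format annotations, i.e. one text file per image with one line per object (15 whitespace-separated fields; the 3D fields can be zeroed for 2D detection). A hypothetical label line for a person class might look like this — the file name and box coordinates below are made up for illustration:

```
# 000001.txt  (labels for image 000001.png)
# fields: class truncated occluded alpha xmin ymin xmax ymax dims(3) loc(3) rot_y
person 0.0 0 0.0 45.0 120.0 210.0 380.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
```

The class string here (`person`) must match the class name you configure in DIGITS when creating the object-detection dataset.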


You are on the right path. Also, consult the Jetson forum if you have specific mobile implementation questions.
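To give a concrete starting point: with the TensorRT C++ API, importing a Caffe model into an engine generally follows the pattern below. This is a minimal sketch under assumptions, not a tested implementation — the file names and the output blob name (`detection_out`) are taken from the standard MobileNet-SSD deploy.prototxt and may differ in your model. Also note that SSD-specific layers (e.g. PriorBox, DetectionOutput) are not natively supported by TensorRT’s Caffe parser, so they typically need plugin implementations before the parse step succeeds.

```cpp
#include <iostream>
#include "NvInfer.h"
#include "NvCaffeParser.h"

using namespace nvinfer1;
using namespace nvcaffeparser1;

// Minimal logger required by the TensorRT builder.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    // Create a builder and an empty network definition.
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();

    // Parse the Caffe files into the network.
    // File names are assumptions -- substitute your own.
    ICaffeParser* parser = createCaffeParser();
    const IBlobNameToTensor* blobs = parser->parse(
        "deploy.prototxt", "mobilenet_ssd.caffemodel",
        *network, DataType::kFLOAT);

    // "detection_out" is the usual SSD output blob name (assumption).
    network->markOutput(*blobs->find("detection_out"));

    // Build the optimized inference engine.
    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(16 << 20); // 16 MB of scratch space
    ICudaEngine* engine = builder->buildCudaEngine(*network);

    // The engine can be serialized once and reloaded at runtime,
    // which avoids rebuilding it on every launch.
    IHostMemory* serialized = engine->serialize();
    // ... write serialized->data(), serialized->size() to a file ...

    serialized->destroy();
    engine->destroy();
    network->destroy();
    parser->destroy();
    builder->destroy();
    return 0;
}
```

Compiling this requires linking against `nvinfer` and `nvparsers` on the TX2; it cannot run on a machine without TensorRT and CUDA installed.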

This is my code,

I hope this can help you.