I can really use some help.
To my understanding, I can't use:
- jetson-inference if I've created my own model or used a detection model other than DetectNet.
- The Python API for creating a TensorRT engine from a UFF model.
So the only option I can think of is creating the engine using the C++ API.
Is there any code available?
Please check out https://github.com/dusty-nv/jetson-inference. It has a sample C++ DetectNet implementation for Jetson.
I'm sorry if I wasn't clear enough, but I meant that I don't want to use the DetectNet model. The problem is that only a DetectNet implementation is available for detection models.
I made my own model in Keras with the TensorFlow backend, exported the network to a UFF file, and now I would like to create an engine from that UFF file.
Sorry about the misunderstanding. I think you are looking for an example of creating a TensorRT engine from a UFF file using the C++ API?
Please take a look at ~/samples/sampleUffMNIST/sampleUffMNIST.cpp (it comes with your TensorRT installation).
For a more general sample of inference with the TensorRT C++ API, see this repository (some parts are outdated, but it still serves as a good reference): http://github.com/dusty-nv/jetson-inference
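In case it helps, here is a minimal sketch of the engine-building step that sampleUffMNIST walks through, using the UFF-parser workflow of the TensorRT C++ API from that era (the UFF parser was later deprecated). The file names, input/output tensor names, and input dimensions (`model.uff`, `input_1`, `predictions/Softmax`, 3x224x224) are placeholders and must be replaced with the names from your own Keras model's UFF graph:

```cpp
#include "NvInfer.h"
#include "NvUffParser.h"
#include <fstream>
#include <iostream>

using namespace nvinfer1;
using namespace nvuffparser;

// TensorRT requires a logger implementation to report build diagnostics.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
            std::cerr << msg << std::endl;
    }
} gLogger;

int main()
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork(); // implicit-batch network, as required by UFF
    IUffParser* parser = createUffParser();

    // Register the graph's input and output tensors by name.
    // These names are placeholders -- use the ones from your own model.
    parser->registerInput("input_1", Dims3(3, 224, 224), UffInputOrder::kNCHW);
    parser->registerOutput("predictions/Softmax");

    if (!parser->parse("model.uff", *network, DataType::kFLOAT))
    {
        std::cerr << "Failed to parse UFF file" << std::endl;
        return 1;
    }

    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 28); // 256 MB of build scratch space

    ICudaEngine* engine = builder->buildCudaEngine(*network);
    if (!engine)
    {
        std::cerr << "Failed to build engine" << std::endl;
        return 1;
    }

    // Serialize the engine so it can be reloaded later without rebuilding.
    IHostMemory* serialized = engine->serialize();
    std::ofstream out("model.engine", std::ios::binary);
    out.write(static_cast<const char*>(serialized->data()), serialized->size());

    serialized->destroy();
    engine->destroy();
    parser->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}
```

Serializing the engine to disk is worth the extra lines: building can take minutes on a Jetson, while deserializing a saved engine at startup takes seconds.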