Source code for TensorRT on Jetson

Hi,

We have problems with the Caffe parser (PReLU layer) in TensorRT. This bug was reported in "parsePReLU function in caffeParser/opeParsers seems to have a bug (does not work) · Issue #355 · NVIDIA/TensorRT · GitHub". I have also submitted a merge request that fixes the problem. I can build a patched version of TensorRT on a PC-based system (Ubuntu 18.04) and use the resulting libnvcaffeparser.so together with the official distribution of TensorRT 7.1.3 (build 4).

However, it seems the source code of TensorRT for the Jetson TX2 (JetPack 4.5) is not available. The TensorRT 7.1 branch on GitHub is not compatible with the one distributed in JetPack 4.5: the header files differ, which breaks the interface to the .so libraries.

Is it possible to make the version of TensorRT used in JetPack 4.5 available on GitHub (the same as for the TensorRT release for PC)?

Thanks
Alexey

Hi,

Please make sure you have checked out the release/7.1 branch:

$ git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git

We have compared the header file on GitHub with /usr/include/aarch64-linux-gnu/NvCaffeParser.h in JetPack 4.5.
There are only some minor differences in the comments, so it should be compatible.
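For reference, a quick way to check this on the device is to diff the two copies of the header. This is a sketch; the `TensorRT/include` path is an assumption based on the clone command above:

```shell
# Compare the JetPack header with the one in the release/7.1 checkout;
# the TensorRT/include path assumes the default clone location.
diff -u /usr/include/aarch64-linux-gnu/NvCaffeParser.h \
        TensorRT/include/NvCaffeParser.h

# Strip // comment lines on both sides to see whether any
# functional (non-comment) difference remains.
diff -u <(grep -v '^[[:space:]]*//' /usr/include/aarch64-linux-gnu/NvCaffeParser.h) \
        <(grep -v '^[[:space:]]*//' TensorRT/include/NvCaffeParser.h)
```

The same comparison can be run against NvInfer.h and the other headers to find any that differ beyond comments.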

Would you mind checking it again?
Or could you share the detailed error (merge conflict?) with us?

Thanks.

Hi,

The problem is in building NvCaffeParser against the NvInfer library from JetPack 4.5. There are headers in TensorRT.git (NvInfer.h in particular) which get picked up during the build of the parser library, and they differ from the ones supplied in JetPack. As a result, calls to some functions have no effect (for example, creating optimisation profiles); I would expect this or even worse behaviour. To work around the problem I copied JetPack's headers (except NvCaffeParser.h) into the include directory of the cloned copy of TensorRT.git and then rebuilt. The resulting libnvcaffeparser.so seems to work.
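The workaround described above looks roughly like this. It is a sketch, not a verified recipe: the header paths assume the default JetPack layout, and the CMake variables and target name are assumptions based on the TensorRT OSS build documentation:

```shell
# Clone the matching release branch of the OSS components
git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git
cd TensorRT

# Overwrite the repo headers with the ones shipped in JetPack 4.5,
# keeping the patched NvCaffeParser.h from the repo untouched
for h in /usr/include/aarch64-linux-gnu/Nv*.h; do
    [ "$(basename "$h")" = "NvCaffeParser.h" ] && continue
    cp "$h" include/
done

# Rebuild the parser against the JetPack libraries
# (TRT_LIB_DIR / TRT_OUT_DIR as in the OSS README; target name assumed)
mkdir -p build && cd build
cmake .. -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu -DTRT_OUT_DIR="$(pwd)/out"
make -j"$(nproc)" nvcaffeparser
```

The rebuilt libnvcaffeparser.so can then be used in place of the stock one alongside the unmodified JetPack libnvinfer.so.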

However, there is now another issue with switching optimisation profiles in the execution context. I will log it in a separate ticket.

Thanks

Thanks for sharing the status.
Let's track the profile-switching issue in the other topic.