Hi, I have a problem when using the NVIDIA Jetson AGX Xavier to run my deep learning model.
I have confirmed that all the necessary models run on the NVIDIA Jetson AGX Xavier, but the GPU is not used when the model is called from Java (only the CPU is used). I do not know what is causing this.
TensorRT (TRT) only supports C++ and Python APIs.
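Since there is no official Java binding, the usual workaround is to wrap the TensorRT C++ API in a small native library and call it from Java via JNI. Below is a minimal sketch of that pattern; the library name `trtjavabridge` and the native method are hypothetical placeholders, not an NVIDIA API — the real wrapper would have to be written in C++ against the TensorRT headers.

```java
// Sketch of a JNI bridge pattern for reaching TensorRT from Java.
// "trtjavabridge" is a hypothetical native wrapper library, not an NVIDIA product.
public class TrtBridge {
    // A hypothetical native entry point that a C++ wrapper around TensorRT
    // would implement (e.g. deserialize an engine and run inference):
    // public static native float[] infer(float[] input);

    /** Returns true only if the native wrapper is on java.library.path. */
    public static boolean nativeWrapperAvailable() {
        try {
            System.loadLibrary("trtjavabridge"); // hypothetical wrapper library
            return true;
        } catch (UnsatisfiedLinkError e) {
            // Wrapper not built/installed: plain Java cannot reach TensorRT,
            // so inference would silently fall back to a CPU-only path.
            return false;
        }
    }

    public static void main(String[] args) {
        if (nativeWrapperAvailable()) {
            System.out.println("Native TensorRT wrapper loaded; GPU inference is possible.");
        } else {
            System.out.println("No native wrapper found; Java alone cannot use TensorRT.");
        }
    }
}
```

If your Java code is calling the model through some framework that does not bundle such a native GPU backend for the Jetson's arm64 architecture, that would explain why only the CPU is used.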
Could you please elaborate more on the use case and how TRT is used here?
When was the last time you monitored the CPU utilized by your Java application? Perhaps never? It is important to keep an eye on the CPU used by your application. Your application runs on a host (or hosts) with one or more CPUs managed by the operating system, and CPU resources are not unlimited (even though some developers and administrators wish that were true).
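One way to watch this from inside the application itself is the `com.sun.management` extension of `OperatingSystemMXBean`, available on HotSpot-based JVMs. A minimal sketch:

```java
import java.lang.management.ManagementFactory;

// Sketch: read this JVM process's CPU load using the com.sun.management
// extension of OperatingSystemMXBean (available on HotSpot-based JVMs).
public class CpuMonitor {
    /** CPU load of this process in [0.0, 1.0], or negative if not yet available. */
    public static double processCpuLoad() {
        com.sun.management.OperatingSystemMXBean os =
            (com.sun.management.OperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean();
        return os.getProcessCpuLoad();
    }

    public static void main(String[] args) {
        System.out.printf("process CPU load: %.3f%n", processCpuLoad());
        // On a Jetson, pair this with `sudo tegrastats` in a terminal to see
        // whether the GPU (the GR3D_FREQ field) is actually busy during inference.
    }
}
```

If the process CPU load is high while `tegrastats` shows the GPU idle, inference is running on the CPU path.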
Can you please share the script & model file to reproduce the issue?