Inference for Custom Classification Model

Please provide the following info (check/uncheck the boxes after clicking “+ Create Topic”):

Software Version
[*] DRIVE OS Linux 5.2.0
[ ] DRIVE OS Linux 5.2.0 and DriveWorks 3.5
[ ] NVIDIA DRIVE™ Software 10.0 (Linux)
[ ] NVIDIA DRIVE™ Software 9.0 (Linux)
[ ] other DRIVE OS version
[ ] other

Target Operating System
Linux
QNX
other

Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other

SDK Manager Version
1.4.1.7402
other

Host Machine Version
[*] native Ubuntu 18.04
[ ] other

Hi,

I have a few questions regarding inference with a custom classification model.

1. I have trained a custom model and converted it to a TensorRT model on the host PC.
2. How do I run inference in C++ on the custom model using the DriveWorks API?
3. Are there any examples of this?

Thank you,
Daniel.

Dear @daniel.vadranapu1,
You need to generate a DriveWorks-supported TensorRT model using the tensorRT_optimization tool. The generated TensorRT model can then be loaded into DriveWorks, and inference can be performed through the DNN module APIs (DriveWorks SDK Reference: DNN).
You can take a look at the sample_dnn_tensor and sample_object_detector_tracker samples for more details.
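For reference, below is a rough C++ sketch of that flow, modeled on the sample_dnn_tensor pattern. The model filename is a placeholder, error checking is omitted, and include paths and function signatures vary between DriveWorks releases, so treat this as an outline and follow the DNN module reference and the sample code shipped with your SDK version.

```cpp
#include <dw/core/Context.h>
#include <dw/dnn/DNN.h>

int main()
{
    // Initialize the DriveWorks context.
    dwContextHandle_t context = DW_NULL_HANDLE;
    dwContextParameters sdkParams{};
    dwInitialize(&context, DW_VERSION, &sdkParams);

    // Load the model generated by the tensorRT_optimization tool.
    // "classifier.bin" is a placeholder for your optimized model file.
    dwDNNHandle_t dnn = DW_NULL_HANDLE;
    dwDNN_initializeTensorRTFromFile(&dnn, "classifier.bin",
                                     nullptr /*plugins*/, DW_PROCESSOR_TYPE_GPU, context);

    // Query the network's tensor properties and allocate input/output tensors.
    dwDNNTensorProperties inProps{};
    dwDNNTensorProperties outProps{};
    dwDNN_getInputTensorProperties(&inProps, 0U, dnn);
    dwDNN_getOutputTensorProperties(&outProps, 0U, dnn);

    dwDNNTensorHandle_t inputTensor  = DW_NULL_HANDLE;
    dwDNNTensorHandle_t outputTensor = DW_NULL_HANDLE;
    dwDNNTensor_create(&inputTensor, &inProps, context);
    dwDNNTensor_create(&outputTensor, &outProps, context);

    // Fill inputTensor with a preprocessed image here (sample_dnn_tensor uses
    // dwDataConditioner for resizing / mean subtraction), then run inference.
    dwConstDNNTensorHandle_t inputs[1] = {inputTensor};
    dwDNN_infer(&outputTensor, 1U, inputs, 1U, dnn);

    // Read the class scores back from outputTensor (e.g. by streaming it to
    // the CPU with dwDNNTensorStreamer) and take the arg-max as the label.

    // Cleanup.
    dwDNNTensor_destroy(inputTensor);
    dwDNNTensor_destroy(outputTensor);
    dwDNN_release(dnn);
    dwRelease(context);
    return 0;
}
```

The sketch assumes a single-input, single-output classifier; sample_dnn_tensor shows the complete version, including image preprocessing and copying the output tensor back to the CPU.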