Running sample TensorRT model on JetPack 4.2.2 (both Python and C++)

Q1:
I need to run the ResNet50_fp32.caffemodel from the sample with:

  1. a Python program (already there)
    AND
  2. a C++ program

Is this possible?

Q2:
Is there a TensorRT sample that can be called for inference from
both C++ and Python? If so, can anyone share a GitHub URL?

Hello @minidb7,
Please refer to the link below, which might help answer your questions:

https://docs.nvidia.com/deeplearning/tensorrt/archives/tensorrt-710-ea/sample-support-guide/index.html#introductory_parser_samples
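
A quick note on Q1: TensorRT exposes its Caffe parser from both the C++ API (nvcaffeparser1) and the Python API, so the same ResNet50_fp32.caffemodel can be consumed from either language. Below is a minimal C++ sketch of the build flow using the TensorRT 5.x API that JetPack 4.2.x ships with; the deploy prototxt path and the output blob name "prob" are assumptions, so adjust them to the sample's actual files:

```cpp
// build_resnet50_engine.cpp -- minimal sketch, TensorRT 5.x Caffe-parser API.
// File names and the output blob name "prob" are assumptions, not the sample's exact paths.
#include <fstream>
#include <iostream>

#include "NvInfer.h"
#include "NvCaffeParser.h"

// Simple logger required by the TensorRT builder.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    // Create the builder and an empty network definition.
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);
    nvinfer1::INetworkDefinition* network = builder->createNetwork();

    // Parse the Caffe prototxt + caffemodel into the network.
    nvcaffeparser1::ICaffeParser* parser = nvcaffeparser1::createCaffeParser();
    const nvcaffeparser1::IBlobNameToTensor* blobs = parser->parse(
        "ResNet50_deploy.prototxt",   // assumed placeholder for the deploy file
        "ResNet50_fp32.caffemodel",   // weights file from the question
        *network, nvinfer1::DataType::kFLOAT);

    // Mark the output blob ("prob" is the usual ResNet-50 softmax output).
    network->markOutput(*blobs->find("prob"));

    // Build the engine.
    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 28); // 256 MiB scratch space
    nvinfer1::ICudaEngine* engine = builder->buildCudaEngine(*network);

    // Serialize the engine so it can be reloaded later from C++ or Python.
    nvinfer1::IHostMemory* serialized = engine->serialize();
    std::ofstream out("resnet50.engine", std::ios::binary);
    out.write(static_cast<const char*>(serialized->data()), serialized->size());

    // Clean up (TensorRT 5/6 objects are released with destroy()).
    serialized->destroy();
    engine->destroy();
    parser->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}
```

Once the engine is serialized to disk, a Python script can load the same file with tensorrt.Runtime.deserialize_cuda_engine(), so one built engine can serve inference from both C++ and Python, provided it is used on the same device and TensorRT version it was built with.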

Thanks!