Running own model in DNN framework

Please provide the following info (tick the boxes after creating this topic):
Software Version
DRIVE OS 6.0.10.0
[1] DRIVE OS 6.0.8.1
DRIVE OS 6.0.6
DRIVE OS 6.0.5
DRIVE OS 6.0.4 (rev. 1)
DRIVE OS 6.0.4 SDK
other

Target Operating System
[1] Linux
QNX
other

Hardware Platform
DRIVE AGX Orin Developer Kit (940-63710-0010-300)
DRIVE AGX Orin Developer Kit (940-63710-0010-200)
DRIVE AGX Orin Developer Kit (940-63710-0010-100)
DRIVE AGX Orin Developer Kit (940-63710-0010-D00)
DRIVE AGX Orin Developer Kit (940-63710-0010-C00)
[1] DRIVE AGX Orin Developer Kit (not sure of its number)
other

SDK Manager Version
2.1.0
[1] other

Host Machine Version
native Ubuntu Linux 20.04 Host installed with SDK Manager
native Ubuntu Linux 20.04 Host installed with DRIVE OS Docker Containers
native Ubuntu Linux 18.04 Host installed with DRIVE OS Docker Containers
other

Issue Description
Hi team, I have a few queries.
I have converted a mnist.pt file to a mnist.engine file.

  1. I want to develop an application using the TensorRT C++ API that feeds an input to mnist.engine and runs inference.

  2. Alternatively, you can suggest how to do the same thing in Python.

Please help!

Dear @akshay.tupkar,
Could you check whether /usr/src/tensorrt/samples/sampleOnnxMNIST and /usr/src/tensorrt/samples/python/network_api_pytorch_mnist in the Docker container help?
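For reference, below is a minimal Python sketch of the deserialize-and-run flow those samples demonstrate. It assumes the TensorRT 8.x Python bindings and pycuda are available on the target, that mnist.engine was built with static shapes, and it uses a random array in place of a real 28x28 MNIST image; the file name and shapes are placeholders, not a verified setup.

```python
import numpy as np
import tensorrt as trt
import pycuda.driver as cuda
import pycuda.autoinit  # creates a CUDA context for this process

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Deserialize the prebuilt engine file (assumed name: mnist.engine).
with open("mnist.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

context = engine.create_execution_context()

# Allocate host and device buffers for every binding (TensorRT 8.x binding API).
bindings, host_bufs, dev_bufs = [], [], []
for i in range(engine.num_bindings):
    shape = engine.get_binding_shape(i)
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.empty(trt.volume(shape), dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    bindings.append(int(dev))
    host_bufs.append(host)
    dev_bufs.append(dev)

# Identify input/output bindings (single-input, single-output model assumed).
input_index = 0 if engine.binding_is_input(0) else 1
output_index = 1 - input_index

# Fill the input buffer; random data stands in for a preprocessed MNIST image.
host_bufs[input_index][:] = np.random.rand(host_bufs[input_index].size).astype(
    host_bufs[input_index].dtype)

# Copy input to device, run synchronous inference, copy output back.
cuda.memcpy_htod(dev_bufs[input_index], host_bufs[input_index])
context.execute_v2(bindings)
cuda.memcpy_dtoh(host_bufs[output_index], dev_bufs[output_index])

print("predicted digit:", int(np.argmax(host_bufs[output_index])))
```

The C++ path in sampleOnnxMNIST follows the same pattern: deserialize the engine with nvinfer1::createInferRuntime, create an execution context, copy the input to device memory, and call executeV2.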
