Xavier NX inference speed

I’m using an NVIDIA Jetson Xavier NX Developer Kit for image classification with a ResNet-18 model, but the classification frame rate is slower than I expected.
The program basically uses the code from dusty’s example: https://github.com/dusty-nv/jetson-inference/blob/master/docs/imagenet-example-python-2.md
My image size is 224x224x3. To measure the inference frames per second, I just load an image and the model, run the inference 2000 times, and calculate the FPS:

import time
import jetson.inference
import jetson.utils

# load the network and a test image
net = jetson.inference.imageNet("resnet-50")
img = jetson.utils.loadImage("image_0.jpg")

start = time.time()
for _ in range(2000):
    class_idx, confidence = net.Classify(img)
end = time.time()
print("FPS: {:.1f}".format(2000 / (end - start)))
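
For comparison, the jetson-inference bindings can also report how long TensorRT itself spends in the network, which separates the per-call Python overhead and pre-processing from pure GPU time. A minimal addition, assuming your build exposes GetNetworkFPS() the way the bundled examples do:

# network-only FPS: reflects just the TensorRT execution time, excluding the
# Python binding overhead and CUDA pre-processing
print("network-only FPS: {:.1f}".format(net.GetNetworkFPS()))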

The resulting frame rate is ~250 FPS for ResNet-18 and ~160 FPS for ResNet-50. This is a lot slower than the 824 FPS reported for ResNet-50.

Using the benchmark program, I did get 824 FPS.

Do any other parameters need to be changed to increase the speed?

Software:
JetPack UNKNOWN (L4T 32.5.2)
CUDA: 10.2.89
TensorRT: 7.1.3.0

Did you boost the system with nvpmodel and jetson_clocks?

sudo nvpmodel -m 2
sudo jetson_clocks
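
To confirm the settings took effect, here is a small sketch that queries the active power mode and the current clock state through the stock nvpmodel / jetson_clocks tools (run as root or adjust the sudo usage to your setup):

import subprocess

# print the active nvpmodel power profile and the current clock settings;
# both tools ship with L4T
for cmd in (["sudo", "nvpmodel", "-q"], ["sudo", "jetson_clocks", "--show"]):
    print(subprocess.check_output(cmd).decode())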

Hi,

You can find the detailed configuration used in the Jetson Benchmarks sample below:

The ResNet-50 model runs on the GPU together with 2 DLAs in multi-batch mode.

ModelName           Framework   Devices   BatchSizeGPU   BatchSizeDLA   WS_GPU   WS_DLA   ...
ResNet50_224x224    caffe       3         4              2              2048     1024     ...
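
In other words, the published 824 FPS is the combined throughput of three TensorRT engines running concurrently: one on the GPU with batch 4 and one on each DLA with batch 2. The trtexec runs below approximate the individual engines; the prototxt path, output blob name, and precision flags are placeholders rather than the exact benchmark settings:

import subprocess

TRTEXEC = "/usr/src/tensorrt/bin/trtexec"    # default location on JetPack
PROTOTXT = "ResNet50_224x224.prototxt"       # placeholder for the caffe model above

# GPU engine: batch 4 with a 2048 MB workspace (BatchSizeGPU / WS_GPU columns)
subprocess.check_call([TRTEXEC, "--deploy=" + PROTOTXT, "--output=prob",
                       "--batch=4", "--int8", "--workspace=2048"])

# one of the two DLA engines: batch 2 with a 1024 MB workspace; layers the DLA
# cannot run fall back to the GPU
subprocess.check_call([TRTEXEC, "--deploy=" + PROTOTXT, "--output=prob",
                       "--batch=2", "--fp16", "--useDLACore=0",
                       "--allowGPUFallback", "--workspace=1024"])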

Thanks.

Thanks, kayccc. I tried the two commands, but the FPS is about the same, ~250 FPS for ResNet-18. I also updated the Xavier NX Developer Kit to JetPack 4.6.

Thanks for the help, AastaLLL.
