Different classification result between DIGITS and jetson-inference

Hi, recently I trained a classifier using DIGITS (LeNet) to classify license plate characters (31 classes of letters and digits). I got a good accuracy of 98% and a loss of 0.1.

When I test the model I got different prediction between DIGITS and jetson-inference on my Jetson Nano.

I resized the images to 28x28 and converted them to grayscale with OpenCV before testing.
With this image I got 67.6% for class "2" and 31.91% for class "X" running on DIGITS, but the jetson-inference example gives me 81.108% for class "X".

I tried switching between FP16 and FP32, and training with no mean subtraction, but nothing changed.

What could the problem be here?
In the preprocessing phase before prediction, what does DIGITS do?

Thanks!


Please attach the model and a test sample. I also need some screenshots showing the result from the DIGITS web UI and the result from the example provided by jetson-inference.


Hi,

A common issue comes from image pre-processing.
Please make sure the input image goes through the same pre-processing as the training data in DIGITS.
For example: color format, input data type, and mean subtraction.
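To see why this matters, here is a minimal numpy sketch of how a mean-subtraction mismatch alone shifts the input tensor (the mean value of 33 and the [0, 1] scaling are hypothetical, not taken from this thread):

```python
import numpy as np

# Hypothetical 28x28 grayscale input, values 0-255 as uint8
img = np.full((28, 28), 128, dtype=np.uint8)

# Pipeline 1: subtract a per-dataset mean, then scale to [0, 1]
mean = 33.0  # hypothetical dataset mean pixel value
x1 = (img.astype(np.float32) - mean) / 255.0

# Pipeline 2: no mean subtraction, same scaling
x2 = img.astype(np.float32) / 255.0

# The two tensors differ by a constant offset (mean / 255),
# which a trained network does NOT compensate for automatically
print(np.abs(x1 - x2).max())
```

Even a constant offset like this is enough to move the softmax scores between two similar classes.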

If the issue persists, please share a reproducible sample so we can check it for you.
Thanks.


Thank you for your response.

I followed the same process for training and testing. I converted the images to grayscale and resized them to 15x30 with OpenCV.
Therefore my training data shared the same color format and size before going into DIGITS. And I trained with no mean subtraction (Subtract mean = None).
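The grayscale conversion itself is one place where two pipelines can silently diverge: OpenCV loads images as BGR, and if another component assumes RGB, the grayscale weights land on the wrong channels. A minimal numpy re-implementation of the BT.601 conversion (the same weights cv2.cvtColor uses; the saturated-red pixel is just an illustrative example):

```python
import numpy as np

# BT.601 luma weights, as used by cv2.cvtColor(..., cv2.COLOR_BGR2GRAY)
def bgr_to_gray(img_bgr):
    b, g, r = img_bgr[..., 0], img_bgr[..., 1], img_bgr[..., 2]
    return 0.114 * b + 0.587 * g + 0.299 * r

# A saturated red pixel, once as BGR and once (wrongly) as RGB
red_bgr = np.array([[[0, 0, 255]]], dtype=np.float32)
red_rgb = np.array([[[255, 0, 0]]], dtype=np.float32)

print(bgr_to_gray(red_bgr))  # ~76  (correct channel order)
print(bgr_to_gray(red_rgb))  # ~29  (channel order swapped)
```

If the two frameworks disagree on channel order before grayscale conversion, the network sees a genuinely different image.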

Here is the link to my model, some sample images, and the output logs of the jetson-inference example:

Please let me know if I am missing anything.

Thanks.


Hi,

Thanks, we are going to reproduce this in our environment.
May I first know which JetPack version you are using?
Thanks.


Thanks a lot. I ran: sudo apt-cache show nvidia-jetpack

Here is what I got

Package: nvidia-jetpack
Version: 4.3-b134
Architecture: arm64
Maintainer: NVIDIA Corporation
Installed-Size: 194


Hi,

There was a new release last week (4.3-b186).
Would you mind reproducing this issue on our latest version first?

Thanks.

Good morning,

I ran the same test on my PC as well and the problem still persists, so I do not think the JetPack version will solve my issue here.
I will upgrade to your latest JetPack and run it again. In the meantime, could you please reproduce this in your environment?
I use jetson-inference and DIGITS on my PC as well.

Thanks.


Hi,

Have you checked the result with our latest JetPack version?

We tried to check this issue but found that the link only contains the model and a test image.
Would you mind sharing the source you used to generate the output from your model?
If you are using jetson-inference, would you mind sharing the exact command?

Thanks.

Thank you for your reply!

I burned your latest release to a new SD card and the result differences still remain.

I use the jetson-inference example command; here it is:

imagenet-console --model=/path/to/caffe/model/file --prototxt=/path/to/deploy/prototxt/file --labels=/path/to/labels.txt --input_blob=data --output_blob=softmax /input/img/path /output/img/path

Thanks.

Hi,

May I know what kind of pre-processing you have applied in the DIGITS pipeline?

Have you applied any mean-subtraction or normalization steps before feeding the image into inference?
It looks like there is no equivalent pre-processing in the jetson-inference flow, and this may cause the difference.
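One way to rule this out is to apply an explicit, identical transform chain on both sides before comparing the scores. A minimal sketch of such a helper, assuming a grayscale input (the mean value of 128 is hypothetical, only to show the divergence):

```python
import numpy as np

def preprocess(img, mean=None, scale=1.0):
    """Apply one explicit transform chain so both frameworks
    can be fed byte-identical tensors.

    img   : HxW grayscale array, uint8
    mean  : scalar or per-pixel mean image; None means no
            subtraction (matching 'Subtract mean = None' in DIGITS)
    scale : multiplier applied after mean subtraction
    """
    x = img.astype(np.float32)
    if mean is not None:
        x = x - mean
    return x * scale

img = np.full((30, 15), 200, dtype=np.uint8)  # 15x30 plate character

# With no mean subtraction the tensor keeps its raw 0-255 range...
a = preprocess(img)
# ...but if one side silently subtracts a mean, the inputs diverge:
b = preprocess(img, mean=128.0)
print(a.max(), b.max())  # 200.0 72.0
```

Feeding the same preprocessed tensor into both frameworks isolates whether the discrepancy is in the pre-processing or in the inference engine itself.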

Thanks.