Exporting DetectNet_v2 to INT8 using the dataloader

Please provide the following information when requesting support.

• Hardware: 3090
• Network Type: Detectnet_v2
• TLT Version: 3.22.05

In the TAO object detection (DetectNet_v2) documentation I noticed that there are three ways to convert a model to INT8. I’ve been using option 1 (generating a calibration tensorfile), but it will be deprecated. The documentation states for option 3:

Option 3: Using the training data loader directly to load the training images for INT8 calibration. This option is now the recommended approach as it helps to generate multiple random samples. This also ensures two important aspects of the data during calibration:

• Data pre-processing in the INT8 calibration step is the same as in the training process.

• The data batches are sampled randomly across the entire training dataset, thereby improving the accuracy of the INT8 model.

Calibration occurs as a one-step process, with the data batches being generated on the fly.

Can you give me any direction on how to use option 3?

You can refer to the command below from the DetectNet_v2 — TAO Toolkit 3.22.05 documentation:

tao detectnet_v2 export \
    -m $USER_EXPERIMENT_DIR/detectnet_v2/model.tlt \
    -o $USER_EXPERIMENT_DIR/detectnet_v2/model.int8.etlt \
    -e $SPECS_DIR/detectnet_v2_kitti_retrain_spec.txt \
    --key $KEY \
    --cal_image_dir $USER_EXPERIMENT_DIR/data/KITTI/val/image_2 \
    --data_type int8 \
    --batch_size 8 \
    --batches 10 \
    --cal_data_file $USER_EXPERIMENT_DIR/data/detectnet_v2/cal.tensorfile \
    --cal_cache_file $USER_EXPERIMENT_DIR/data/detectnet_v2/cal.bin \
    --engine_file $USER_EXPERIMENT_DIR/data/detectnet_v2/detection.trt
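
Note that with option 3 the calibration batches are pulled through the training dataloader defined in the spec file passed via -e, so preprocessing matches training, and here --batches 10 with --batch_size 8 samples 80 images across the dataset. As a rough illustration only (the paths and class mapping below are placeholders, not taken from this thread), the relevant dataset_config section of a DetectNet_v2 spec looks something like this:

dataset_config {
  data_sources {
    # training tfrecords and images; calibration samples are drawn from here
    tfrecords_path: "/workspace/tao-experiments/data/tfrecords/kitti_trainval/*"
    image_directory_path: "/workspace/tao-experiments/data/training"
  }
  image_extension: "png"
  # map each label in the tfrecords to a target class
  target_class_mapping {
    key: "car"
    value: "car"
  }
  validation_fold: 0
}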

Thanks Morganh, will try!

I noticed the export specifications differ between a non-QAT and a QAT export regarding batches and batch size.

With non-QAT the batch specs are:
--batches 10
--batch_size 4
--max_batch_size 4

With QAT the batch specs are:
--batch_size 64
--max_batch_size 64

Can I use the command you stated for both non-QAT and QAT exports, or do I need to change the batch size?

Yes, you can.
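
For reference, with a QAT-trained model the export can be leaner: the INT8 scale factors are typically taken from the quantization nodes learned during training rather than from calibration images, so the calibration-image flags (--cal_image_dir, --batches, --cal_data_file) are not needed. A minimal sketch with placeholder file names (not from this thread):

tao detectnet_v2 export \
    -m $USER_EXPERIMENT_DIR/detectnet_v2/model_qat.tlt \
    -o $USER_EXPERIMENT_DIR/detectnet_v2/model_qat.int8.etlt \
    -e $SPECS_DIR/detectnet_v2_kitti_retrain_spec.txt \
    --key $KEY \
    --data_type int8 \
    --batch_size 64 \
    --max_batch_size 64 \
    --cal_cache_file $USER_EXPERIMENT_DIR/data/detectnet_v2/cal_qat.bin \
    --engine_file $USER_EXPERIMENT_DIR/data/detectnet_v2/detection_qat.trt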
