Biased results when running classification on images using a Python script

Hi @Morganh,

I am testing a multi-task classification model trained with TAO. I get good (unbiased) results with the DeepStream application, but when I run inference with a Python script I always get a biased result.

Please help me find where I have made a mistake.
Below is my script.

import os
import time

import cv2
#import matplotlib.pyplot as plt
import numpy as np
import pycuda.autoinit
import pycuda.driver as cuda
import tensorrt as trt
from PIL import Image
import pdb
import codecs
import glob
import datetime
import shutil

input_shape = (3,224,224)


class HostDeviceMem(object):
    def __init__(self, host_mem, device_mem):
        self.host = host_mem
        self.device = device_mem

    def __str__(self):
        return "Host:\n" + str(self.host) + "\nDevice:\n" + str(self.device)

    def __repr__(self):
        return self.__str__()


def load_engine(trt_runtime, engine_path):
    with open(engine_path, "rb") as f:
        engine_data = f.read()
    engine = trt_runtime.deserialize_cuda_engine(engine_data)
    return engine

# Allocates all buffers required for an engine, i.e. host/device inputs/outputs.
# def allocate_buffers(engine, batch_size=-1):
def allocate_buffers(engine, batch_size=1):
    inputs = []
    outputs = []
    bindings = []
    stream = cuda.Stream()
    for binding in engine:
        # pdb.set_trace()
        size = trt.volume(engine.get_binding_shape(binding)) * batch_size
        dtype = trt.nptype(engine.get_binding_dtype(binding))
        # Allocate host and device buffers
        host_mem = cuda.pagelocked_empty(size, dtype)
        device_mem = cuda.mem_alloc(host_mem.nbytes)
        # Append the device buffer to device bindings.
        bindings.append(int(device_mem))
        # Append to the appropriate list.
        if engine.binding_is_input(binding):
            inputs.append(HostDeviceMem(host_mem, device_mem))
            # print(f"input: shape:{engine.get_binding_shape(binding)} dtype:{engine.get_binding_dtype(binding)}")
        else:
            outputs.append(HostDeviceMem(host_mem, device_mem))
            # print(f"output: shape:{engine.get_binding_shape(binding)} dtype:{engine.get_binding_dtype(binding)}")
    return inputs, outputs, bindings, stream



def do_inference(context, bindings, inputs, outputs, stream, batch_size=1):
    # Transfer input data to the GPU.
    [cuda.memcpy_htod_async(inp.device, inp.host, stream) for inp in inputs]
    # Run inference.
    context.execute_async(
        batch_size=batch_size, bindings=bindings, stream_handle=stream.handle
    )
    # Transfer predictions back from the GPU.
    [cuda.memcpy_dtoh_async(out.host, out.device, stream) for out in outputs]
    # Synchronize the stream
    stream.synchronize()
    # Return only the host outputs.
    return [out.host for out in outputs]

def model_loading(trt_engine_path):
    # TensorRT logger singleton
    os.environ["CUDA_VISIBLE_DEVICES"] = "1"
    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    # trt_engine_path = "/opt/smarg/surveillance_gateway_prod/surveillance_ai_model/x86_64/Secondary_NumberPlateClassification/lpr_us_onnx_b16.engine"

    trt_runtime = trt.Runtime(TRT_LOGGER)
    # pdb.set_trace()
    trt_engine = load_engine(trt_runtime, trt_engine_path)
    # Execution context is needed for inference
    context = trt_engine.create_execution_context()
    # NPR input shape
    # input_shape = (3,48,96)
    context.set_binding_shape(0, input_shape)
    # This allocates memory for network inputs/outputs on both CPU and GPU
    inputs, outputs, bindings, stream = allocate_buffers(trt_engine)
    return inputs, outputs, bindings, stream, context

trt_engine_path = "/home/smarg/Documents/Pritam/AGE-GROUP-MODEL-ANALYSIS/DS-APP/AGE-CLASSIFICATION-MULTICLASS/Models/Age-Gender/age-gender-mcls_export_resnet18_epoch_044_int8.etlt_b4_gpu0_fp16.engine"
inputs, outputs, bindings, stream, context = model_loading(trt_engine_path)


# pdb.set_trace()
# image = [cv2.imread("/home/smarg/Downloads/Images/resized/img/IMG_20210719_160022_cropped_batch_code_image_imgGB3_BATO007_.jpg")]

# Run inference on folder 
image_folder_path = "/home/smarg/Documents/Pritam/AGE-GROUP-MODEL-ANALYSIS/INPUT-IMAGES/HYD-IMAGES-2/"
output_folder_path = "/home/smarg/Documents/Pritam/AGE-GROUP-MODEL-ANALYSIS/INPUT-IMAGES/OUTPUT-IMAGE/"
image_count = 0
start_time = datetime.datetime.now()
for image_path in glob.glob(image_folder_path + "*.jpg"):
    
    # print("Image name :",image_path)
    image = [cv2.imread(image_path)]

    # resize and scale to [0, 1]; channel order stays BGR from cv2.imread
    image = np.array([cv2.resize(img, (224, 224)) / 255.0 for img in image], dtype=np.float32)
    image = image.transpose(0, 3, 1, 2)  # NHWC -> NCHW

    np.copyto(inputs[0].host, image.ravel())

    output = do_inference(context, bindings=bindings, inputs=inputs, outputs=outputs, stream=stream)
    print("output : ",output)
    max_index_row_gender = np.argmax(output[0], axis=0)
    max_index_row_age = np.argmax(output[1], axis=0)

    print("gender ID : {} , age ID : {}".format(max_index_row_gender,max_index_row_age))

    image_count += 1
    

end_time = datetime.datetime.now()
total_time = end_time - start_time

print("Total image processed : {} Total Time : {} ".format(image_count,total_time))

Output :

output :  [array([0.49960953, 0.50039047], dtype=float32), array([0.41089946, 0.22001249, 0.18884175, 0.18024634], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49950877, 0.50049126], dtype=float32), array([0.40835795, 0.22062853, 0.19023134, 0.18078218], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957067, 0.50042933], dtype=float32), array([0.40769285, 0.2211882 , 0.19029601, 0.18082291], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49967593, 0.50032413], dtype=float32), array([0.40797126, 0.22099821, 0.19008666, 0.18094388], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995053, 0.5004947], dtype=float32), array([0.408998  , 0.22056976, 0.18986191, 0.18057038], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995604, 0.5004396], dtype=float32), array([0.40855896, 0.22072561, 0.19008332, 0.18063207], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962616, 0.5003739 ], dtype=float32), array([0.40884602, 0.22074693, 0.1900715 , 0.18033548], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49959895, 0.50040096], dtype=float32), array([0.40919894, 0.220416  , 0.1899584 , 0.18042663], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958318, 0.5004169 ], dtype=float32), array([0.40868035, 0.22049114, 0.190077  , 0.1807515 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49951166, 0.5004883 ], dtype=float32), array([0.4120412 , 0.21939355, 0.1884586 , 0.1801067 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49965316, 0.5003469 ], dtype=float32), array([0.41244313, 0.21937549, 0.18839279, 0.17978866], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957895, 0.50042105], dtype=float32), array([0.4108822 , 0.21997873, 0.18907857, 0.1800605 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996056, 0.5003944], dtype=float32), array([0.41016117, 0.22018321, 0.18927687, 0.1803788 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961004, 0.50038993], dtype=float32), array([0.4109024 , 0.22006486, 0.18854018, 0.18049255], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49959737, 0.5004026 ], dtype=float32), array([0.4085397 , 0.22072737, 0.1898635 , 0.18086946], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49970236, 0.5002976 ], dtype=float32), array([0.40785733, 0.22051816, 0.18937816, 0.18224639], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957454, 0.5004255 ], dtype=float32), array([0.40947375, 0.22044067, 0.18974634, 0.18033925], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49971762, 0.50028235], dtype=float32), array([0.40846738, 0.22056414, 0.18996353, 0.18100493], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962837, 0.5003717 ], dtype=float32), array([0.4092561 , 0.22047858, 0.189468  , 0.18079737], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956256, 0.50043744], dtype=float32), array([0.40772066, 0.22116102, 0.19036476, 0.18075357], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958768, 0.50041234], dtype=float32), array([0.41237476, 0.21946265, 0.18806109, 0.18010147], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49965215, 0.50034785], dtype=float32), array([0.40979192, 0.22027785, 0.18916121, 0.18076903], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49966413, 0.5003359 ], dtype=float32), array([0.40971673, 0.22019948, 0.18928427, 0.18079951], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956322, 0.5004367 ], dtype=float32), array([0.40807414, 0.22113046, 0.19004902, 0.18074633], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996033, 0.5003967], dtype=float32), array([0.40716678, 0.22120574, 0.1907194 , 0.18090804], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995459, 0.5004541], dtype=float32), array([0.4086155 , 0.22079527, 0.18985258, 0.18073663], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996546, 0.5003454], dtype=float32), array([0.40899226, 0.22048713, 0.18946923, 0.18105139], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49954152, 0.5004585 ], dtype=float32), array([0.4084932 , 0.22081818, 0.19026162, 0.18042696], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955446, 0.5004456 ], dtype=float32), array([0.40922052, 0.22056195, 0.18980335, 0.18041418], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995486, 0.5004514], dtype=float32), array([0.40770662, 0.2209601 , 0.19005679, 0.18127643], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995646 , 0.50043535], dtype=float32), array([0.41097242, 0.22007923, 0.18923132, 0.17971703], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956563, 0.50043434], dtype=float32), array([0.40679446, 0.22152765, 0.190919  , 0.18075883], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957365, 0.5004264 ], dtype=float32), array([0.40859047, 0.22062038, 0.19028065, 0.18050857], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995463 , 0.50045365], dtype=float32), array([0.41240045, 0.21953811, 0.18930423, 0.17875718], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960852, 0.5003914 ], dtype=float32), array([0.41132417, 0.21958192, 0.18901317, 0.18008062], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49952012, 0.50047994], dtype=float32), array([0.40885416, 0.22062509, 0.19020636, 0.18031445], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955678, 0.5004433 ], dtype=float32), array([0.40953082, 0.22070229, 0.18933779, 0.18042913], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955875, 0.50044125], dtype=float32), array([0.409303  , 0.22042146, 0.18976226, 0.18051331], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49952978, 0.5004702 ], dtype=float32), array([0.40896273, 0.22050269, 0.19015782, 0.18037681], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958858, 0.50041145], dtype=float32), array([0.4083556 , 0.22100738, 0.18998782, 0.1806492 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955964, 0.50044036], dtype=float32), array([0.4066048 , 0.22164063, 0.19093727, 0.18081744], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958214, 0.5004178 ], dtype=float32), array([0.40870684, 0.22075944, 0.18998963, 0.18054417], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957418, 0.5004259 ], dtype=float32), array([0.40771845, 0.2211086 , 0.19032006, 0.18085289], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49959174, 0.5004083 ], dtype=float32), array([0.40500188, 0.22214776, 0.19138916, 0.18146116], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958768, 0.5004123 ], dtype=float32), array([0.40940067, 0.22037283, 0.18926844, 0.18095814], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964648, 0.50035346], dtype=float32), array([0.41404656, 0.21876629, 0.1873001 , 0.17988707], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964383, 0.5003562 ], dtype=float32), array([0.40497217, 0.22204028, 0.19121255, 0.18177496], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958304, 0.50041693], dtype=float32), array([0.40952635, 0.22047792, 0.18940458, 0.18059123], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49971193, 0.50028807], dtype=float32), array([0.40852317, 0.2206008 , 0.18925534, 0.18162075], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958128, 0.5004187 ], dtype=float32), array([0.41131955, 0.21994092, 0.1892559 , 0.17948362], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995714, 0.5004286], dtype=float32), array([0.4107328 , 0.22014077, 0.18842708, 0.1806993 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.499654, 0.500346], dtype=float32), array([0.4120037 , 0.2190446 , 0.1874541 , 0.18149759], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49978405, 0.500216  ], dtype=float32), array([0.40783876, 0.22092651, 0.1893502 , 0.1818845 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49977005, 0.50023   ], dtype=float32), array([0.41040483, 0.22015765, 0.18835992, 0.18107766], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995579 , 0.50044215], dtype=float32), array([0.40952572, 0.22037514, 0.189445  , 0.18065406], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961987, 0.50038016], dtype=float32), array([0.41216925, 0.21949139, 0.18843701, 0.17990233], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995853, 0.5004147], dtype=float32), array([0.40733892, 0.22096737, 0.18999512, 0.18169864], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4994954, 0.5005046], dtype=float32), array([0.40698236, 0.22157785, 0.19069323, 0.18074657], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49947852, 0.5005215 ], dtype=float32), array([0.41192108, 0.21943034, 0.18843159, 0.18021703], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996272 , 0.50037277], dtype=float32), array([0.4075596 , 0.22106482, 0.1896536 , 0.18172193], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955243, 0.5004476 ], dtype=float32), array([0.40969443, 0.22044055, 0.18965916, 0.18020591], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958286, 0.5004171 ], dtype=float32), array([0.4101175 , 0.22033283, 0.18923442, 0.18031526], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49954376, 0.5004563 ], dtype=float32), array([0.40727314, 0.22126704, 0.19073506, 0.18072478], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957597, 0.500424  ], dtype=float32), array([0.41097748, 0.21992284, 0.18902549, 0.18007417], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996834 , 0.50031656], dtype=float32), array([0.40914834, 0.22037096, 0.18910086, 0.1813798 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49951208, 0.5004879 ], dtype=float32), array([0.4090354 , 0.220781  , 0.19005786, 0.1801257 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49963963, 0.5003603 ], dtype=float32), array([0.41211608, 0.21913709, 0.18828838, 0.18045843], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961978, 0.5003803 ], dtype=float32), array([0.4102471 , 0.22014758, 0.18904907, 0.18055624], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49952406, 0.500476  ], dtype=float32), array([0.408578  , 0.22061183, 0.1902715 , 0.18053867], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49954566, 0.5004543 ], dtype=float32), array([0.40850988, 0.22077875, 0.1900523 , 0.18065904], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996542, 0.5003458], dtype=float32), array([0.4086627 , 0.22069465, 0.18966426, 0.18097843], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957192, 0.5004281 ], dtype=float32), array([0.40872014, 0.22082609, 0.19021484, 0.18023899], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956393, 0.500436  ], dtype=float32), array([0.4092138 , 0.22043788, 0.18967342, 0.18067482], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4997353 , 0.50026464], dtype=float32), array([0.40548337, 0.22178698, 0.19076324, 0.18196644], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995818, 0.5004182], dtype=float32), array([0.40956604, 0.2203683 , 0.18965091, 0.1804147 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960622, 0.50039375], dtype=float32), array([0.4116691 , 0.2195908 , 0.18883333, 0.17990671], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961254, 0.50038743], dtype=float32), array([0.40925252, 0.22052236, 0.18952163, 0.18070343], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957305, 0.50042695], dtype=float32), array([0.4120513 , 0.21955806, 0.18854935, 0.17984122], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956498, 0.500435  ], dtype=float32), array([0.4100747 , 0.2201594 , 0.1892678 , 0.18049808], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996847 , 0.50031525], dtype=float32), array([0.4107796 , 0.21975903, 0.18888718, 0.18057418], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962693, 0.50037307], dtype=float32), array([0.409629  , 0.2208133 , 0.1891322 , 0.18042548], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995607, 0.5004393], dtype=float32), array([0.4076931 , 0.2210269 , 0.19034053, 0.18093954], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961016, 0.5003898 ], dtype=float32), array([0.40721533, 0.22140993, 0.19039874, 0.18097602], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49967504, 0.50032496], dtype=float32), array([0.4064367 , 0.2216693 , 0.19058758, 0.1813064 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4997142 , 0.50028586], dtype=float32), array([0.40842587, 0.22091377, 0.18887547, 0.1817849 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.499606 , 0.5003939], dtype=float32), array([0.40995643, 0.22034892, 0.18978076, 0.1799139 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957576, 0.50042427], dtype=float32), array([0.4089625 , 0.22075522, 0.18970697, 0.18057537], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49950403, 0.5004959 ], dtype=float32), array([0.40900645, 0.22090274, 0.19027354, 0.17981721], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49963394, 0.50036603], dtype=float32), array([0.40999842, 0.21992333, 0.18908808, 0.18099013], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955198, 0.50044805], dtype=float32), array([0.41138074, 0.21956424, 0.18895847, 0.18009657], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962056, 0.50037944], dtype=float32), array([0.4088858 , 0.22056046, 0.1898985 , 0.18065524], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49982595, 0.50017405], dtype=float32), array([0.408058  , 0.22043769, 0.18920189, 0.18230245], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964488, 0.50035506], dtype=float32), array([0.4077515 , 0.22097786, 0.19012812, 0.18114258], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49963295, 0.5003671 ], dtype=float32), array([0.40690556, 0.22158886, 0.19036616, 0.18113944], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995492, 0.5004508], dtype=float32), array([0.40977347, 0.22054105, 0.18923934, 0.18044612], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957845, 0.5004216 ], dtype=float32), array([0.4115767 , 0.2197543 , 0.1891437 , 0.17952535], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995751, 0.5004249], dtype=float32), array([0.40926778, 0.22048222, 0.18982022, 0.18042979], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957007, 0.50042987], dtype=float32), array([0.40876976, 0.22061938, 0.18982904, 0.18078175], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961448, 0.5003855 ], dtype=float32), array([0.41026098, 0.2200367 , 0.18957151, 0.18013075], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996596 , 0.50034034], dtype=float32), array([0.4101482 , 0.21982837, 0.18870358, 0.18131982], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960726, 0.50039274], dtype=float32), array([0.41033247, 0.21998982, 0.18897726, 0.18070042], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960667, 0.50039333], dtype=float32), array([0.40671006, 0.22161329, 0.19064055, 0.18103617], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995653 , 0.50043464], dtype=float32), array([0.40821704, 0.22101437, 0.18984097, 0.18092763], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958465, 0.5004154 ], dtype=float32), array([0.4073017 , 0.22118945, 0.19040222, 0.18110664], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964628, 0.5003537 ], dtype=float32), array([0.41084853, 0.21999861, 0.18878722, 0.18036571], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995666, 0.5004333], dtype=float32), array([0.4105153 , 0.21994452, 0.18907587, 0.1804643 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964896, 0.5003511 ], dtype=float32), array([0.41145992, 0.2197132 , 0.188896  , 0.17993087], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49953854, 0.5004614 ], dtype=float32), array([0.40674877, 0.22145715, 0.19052061, 0.18127342], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49973762, 0.5002624 ], dtype=float32), array([0.40981853, 0.21993212, 0.18914254, 0.18110688], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49965554, 0.50034446], dtype=float32), array([0.4109016 , 0.2199191 , 0.18879694, 0.18038236], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49954626, 0.50045377], dtype=float32), array([0.40714347, 0.2213363 , 0.19055608, 0.1809642 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961236, 0.5003876 ], dtype=float32), array([0.4095589 , 0.22060469, 0.18908015, 0.18075629], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996457, 0.5003543], dtype=float32), array([0.41337943, 0.21883045, 0.18739066, 0.18039948], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49959755, 0.50040245], dtype=float32), array([0.40959105, 0.22013445, 0.1893505 , 0.18092401], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960467, 0.5003953 ], dtype=float32), array([0.40654552, 0.22155003, 0.19073625, 0.18116821], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4997539 , 0.50024617], dtype=float32), array([0.41170913, 0.2193569 , 0.1881008 , 0.18083324], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49951777, 0.5004822 ], dtype=float32), array([0.40855038, 0.22072054, 0.1898438 , 0.1808853 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49948296, 0.5005171 ], dtype=float32), array([0.40838775, 0.22110666, 0.19029008, 0.18021548], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49952877, 0.5004712 ], dtype=float32), array([0.40891212, 0.22071348, 0.18977314, 0.18060133], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49952358, 0.5004764 ], dtype=float32), array([0.40552044, 0.22189309, 0.19092464, 0.18166184], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957928, 0.5004207 ], dtype=float32), array([0.41011217, 0.22028714, 0.18960838, 0.17999233], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995889, 0.5004111], dtype=float32), array([0.40837973, 0.22078478, 0.18993823, 0.18089722], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995434, 0.5004566], dtype=float32), array([0.40918994, 0.22052869, 0.19008261, 0.18019879], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961478, 0.50038517], dtype=float32), array([0.41083378, 0.21964636, 0.1892268 , 0.18029307], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49970302, 0.50029695], dtype=float32), array([0.41235062, 0.21913947, 0.18810154, 0.18040834], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49977243, 0.5002276 ], dtype=float32), array([0.4117332 , 0.21934617, 0.188541  , 0.18037966], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960563, 0.50039434], dtype=float32), array([0.41083467, 0.21986349, 0.18901616, 0.18028562], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955308, 0.500447  ], dtype=float32), array([0.40890485, 0.22057337, 0.1897644 , 0.18075733], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956617, 0.5004338 ], dtype=float32), array([0.411371  , 0.21973561, 0.18910098, 0.17979239], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.499583  , 0.50041705], dtype=float32), array([0.41232646, 0.219593  , 0.18838875, 0.17969173], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960816, 0.50039184], dtype=float32), array([0.40991786, 0.22041616, 0.18926431, 0.18040165], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49958465, 0.5004154 ], dtype=float32), array([0.40771243, 0.22125953, 0.19023623, 0.18079181], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962133, 0.5003787 ], dtype=float32), array([0.4090409 , 0.22056141, 0.189719  , 0.18067871], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957383, 0.5004262 ], dtype=float32), array([0.40811187, 0.22093698, 0.18993755, 0.18101358], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960068, 0.50039935], dtype=float32), array([0.4088281 , 0.22070923, 0.18986559, 0.18059708], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996853 , 0.50031465], dtype=float32), array([0.41164792, 0.21965905, 0.18876575, 0.17992729], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49954015, 0.5004599 ], dtype=float32), array([0.41036484, 0.22021726, 0.18877372, 0.18064426], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49966872, 0.50033134], dtype=float32), array([0.40946352, 0.22042346, 0.18958995, 0.18052305], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956912, 0.5004309 ], dtype=float32), array([0.40952337, 0.2207007 , 0.18905193, 0.18072398], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996834, 0.5003166], dtype=float32), array([0.40727025, 0.22104241, 0.18956713, 0.18212017], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955472, 0.5004453 ], dtype=float32), array([0.40916055, 0.22074799, 0.1899456 , 0.18014582], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956685, 0.5004331 ], dtype=float32), array([0.40648818, 0.22165142, 0.19099432, 0.1808661 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962497, 0.50037503], dtype=float32), array([0.41070285, 0.2200665 , 0.18902926, 0.18020138], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995841, 0.5004159], dtype=float32), array([0.41082442, 0.21989322, 0.18920703, 0.18007539], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49963892, 0.500361  ], dtype=float32), array([0.4068872 , 0.2208689 , 0.18964216, 0.18260178], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49953178, 0.5004682 ], dtype=float32), array([0.40728536, 0.22122467, 0.19052269, 0.18096727], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995496 , 0.50045043], dtype=float32), array([0.4092607 , 0.22046284, 0.18966632, 0.1806102 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956802, 0.500432  ], dtype=float32), array([0.40854338, 0.22068316, 0.18988724, 0.18088625], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49964482, 0.5003551 ], dtype=float32), array([0.40860865, 0.22048572, 0.18970956, 0.18119611], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996239, 0.5003761], dtype=float32), array([0.4101805 , 0.22021368, 0.188949  , 0.18065676], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957284, 0.5004271 ], dtype=float32), array([0.40911856, 0.2205796 , 0.18963708, 0.18066476], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962908, 0.500371  ], dtype=float32), array([0.40691003, 0.22143425, 0.19081005, 0.18084565], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961454, 0.5003854 ], dtype=float32), array([0.410688  , 0.21981618, 0.18906997, 0.18042587], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956408, 0.50043595], dtype=float32), array([0.41069978, 0.22018857, 0.18911406, 0.17999756], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49948496, 0.50051504], dtype=float32), array([0.40928552, 0.22061911, 0.18966036, 0.180435  ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49967137, 0.50032866], dtype=float32), array([0.40881363, 0.22072008, 0.18961658, 0.18084969], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995969, 0.5004031], dtype=float32), array([0.41250652, 0.21931075, 0.18870322, 0.1794795 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996436, 0.5003564], dtype=float32), array([0.4087751 , 0.22056381, 0.18994263, 0.18071847], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995475, 0.5004525], dtype=float32), array([0.40870538, 0.22079714, 0.18950339, 0.18099412], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49956653, 0.5004335 ], dtype=float32), array([0.40837163, 0.22084726, 0.18962291, 0.1811582 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996216 , 0.50037843], dtype=float32), array([0.40949008, 0.22040863, 0.18963194, 0.18046941], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.499617, 0.500383], dtype=float32), array([0.40749162, 0.22114244, 0.19012405, 0.18124187], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996031, 0.5003969], dtype=float32), array([0.40831825, 0.22095029, 0.19016798, 0.18056339], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49949136, 0.5005086 ], dtype=float32), array([0.40908265, 0.22085227, 0.19011855, 0.1799465 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996858 , 0.50031424], dtype=float32), array([0.41145116, 0.21943563, 0.18881218, 0.18030103], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996176, 0.5003824], dtype=float32), array([0.40998188, 0.22005974, 0.18950365, 0.18045476], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962682, 0.50037324], dtype=float32), array([0.40870675, 0.22061226, 0.19018796, 0.18049304], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957865, 0.5004213 ], dtype=float32), array([0.4101431 , 0.22035818, 0.18906999, 0.18042867], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995876, 0.5004124], dtype=float32), array([0.40938836, 0.22047637, 0.18980545, 0.18032984], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957803, 0.50042194], dtype=float32), array([0.4126287 , 0.21905403, 0.18864194, 0.17967534], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49955624, 0.5004438 ], dtype=float32), array([0.41064397, 0.22006357, 0.18931885, 0.17997359], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49961683, 0.50038314], dtype=float32), array([0.40873367, 0.22061744, 0.18997027, 0.1806786 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49962327, 0.5003767 ], dtype=float32), array([0.41105926, 0.21959494, 0.18883468, 0.18051107], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960542, 0.5003945 ], dtype=float32), array([0.41011763, 0.22027424, 0.18991992, 0.1796883 ], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996076 , 0.50039244], dtype=float32), array([0.41099063, 0.21992673, 0.18923481, 0.17984776], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49960563, 0.50039434], dtype=float32), array([0.41168737, 0.21983244, 0.18861721, 0.17986295], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49957845, 0.5004216 ], dtype=float32), array([0.4084398 , 0.2207433 , 0.18993436, 0.18088254], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4995912, 0.5004088], dtype=float32), array([0.4082383 , 0.22084796, 0.18985192, 0.18106171], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49968106, 0.50031894], dtype=float32), array([0.40801483, 0.2206995 , 0.19000837, 0.18127726], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.49976873, 0.5002313 ], dtype=float32), array([0.40947932, 0.22017527, 0.189058  , 0.18128747], dtype=float32)]
gender ID : 1 , age ID : 0
output :  [array([0.4996144, 0.5003856], dtype=float32), array([0.40994203, 0.22003256, 0.18936092, 0.18066445], dtype=float32)]
gender ID : 1 , age ID : 0


Thanks.

Could you please elaborate more about “biased result”?

Yes. I have two tasks: 1) gender (female, male) and 2) age (0-15, 16-35, 36-55, 55+). In DeepStream I get results for all classes (male 0-15, male 16-35, male 36-55, male 55+, and the same for female) with more than 50 percent confidence. But with the script above I always get female 0-15. You can see in the output above that the scores are always near 0.50 and 0.40 and are essentially constant across all images.

I have verified the input images too. The same image is classified correctly in the DS app but incorrectly by the custom script. The engine file is also the same in the DS app and in the custom script.
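
One more check I plan to try (a minimal sketch, not run yet, reusing the buffers and do_inference from the script above): feed a zero image and a random-noise image. If the two outputs are still nearly identical, the network is effectively ignoring its input, which would point to the input buffer or preprocessing rather than the engine itself.

import cv2
import numpy as np

results = {}
for name, img in [("zeros", np.zeros((224, 224, 3), dtype=np.uint8)),
                  ("noise", np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8))]:
    blob = (cv2.resize(img, (224, 224)) / 255.0).astype(np.float32)
    blob = blob.transpose(2, 0, 1)[None, ...]   # HWC -> NCHW, add batch dim
    np.copyto(inputs[0].host, blob.ravel())
    out = do_inference(context, bindings=bindings, inputs=inputs,
                       outputs=outputs, stream=stream)
    results[name] = [o.copy() for o in out]     # copy, since the host buffers are reused
    print(name, results[name])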

Thanks.

Hi @Morganh,

Did you get any clue why this is happening, or is there an issue with the Python script?

Thanks.

Can you run tao multitask_classification inference or tlt multitask_classification inference successfully?
https://docs.nvidia.com/tao/tao-toolkit/text/multitask_image_classification.html#running-inference-on-a-model

Actually, I had only run evaluate, then exported the .etlt and converted it into an engine for the NX Xavier, and I am checking the results on the NX. The same engine file gives good results with the DS sample app but very poor results with the script.

I will have to try tao multitask_classification inference on the server and will update you once it runs successfully. But if tao multitask_classification inference gives good results, will that give us any clue? Because evaluate was working fine and gave me a proper confusion matrix.

Thanks.

OK, thanks for the info. For “DS-Sample app”, do you mean https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps ?

Please refer to Inferring resnet18 classification etlt model with python - #41 by Morganh
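
That post covers running a TAO classification engine from Python; one key thing to check against it is the preprocessing. TAO classification models are commonly trained with caffe-style preprocessing (BGR channel order, per-channel mean subtraction, and no division by 255), which differs from the /255.0 scaling in the script above. A minimal sketch is below; the exact channel means and order are assumptions here and should be verified against the linked post and your training spec.

import cv2
import numpy as np

def preprocess_caffe(image_path, input_shape=(3, 224, 224)):
    # Caffe-style preprocessing often used for TAO classification models.
    # Assumption: BGR order, ImageNet per-channel means, no /255 scaling;
    # verify against the training spec and the linked post.
    img = cv2.imread(image_path)                      # cv2 already returns BGR
    img = cv2.resize(img, (input_shape[2], input_shape[1]))
    img = img.astype(np.float32)
    img -= np.array([103.939, 116.779, 123.68], dtype=np.float32)  # B, G, R means
    img = img.transpose(2, 0, 1)                      # HWC -> CHW
    return np.expand_dims(img, axis=0)                # add batch dimension

If this matches how the model was trained, np.copyto(inputs[0].host, preprocess_caffe(image_path).ravel()) would replace the resize-and-divide block in the script above.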

Okay Thanks.

Also, please try to leverage classification inference in the Triton Inference Server. See Integrating TAO CV Models with Triton Inference Server — TAO Toolkit 3.0 documentation.
Please check whether it is useful for you, or whether you expect a pure TensorRT sample.