I downloaded the code from dusty-nv's jetson-inference repo,
captured my input images and annotated them,
and trained the model using SSD-MobilenetV2.
The training loss got down to 5.89, which seemed like a good enough checkpoint for my purposes.
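For reference, the training step above follows the jetson-inference pytorch-ssd workflow, roughly like this (the dataset and model-dir paths are just examples, and batch size / epoch count are assumptions, not my exact values):

```shell
# Train SSD-MobilenetV2 on a custom Pascal-VOC-style dataset.
# train_ssd.py lives in jetson-inference/python/training/detection/ssd
python3 train_ssd.py \
    --dataset-type=voc \
    --data=data/my_dataset \
    --model-dir=models/my_dataset \
    --batch-size=4 \
    --epochs=30
```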
Then I converted the checkpoint to an ONNX file,
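The ONNX conversion uses the onnx_export.py script that ships alongside train_ssd.py; a sketch, with my example model directory:

```shell
# Export the best PyTorch checkpoint in models/my_dataset to ONNX.
# This should produce models/my_dataset/ssd-mobilenet.onnx, and it reads
# the class names from labels.txt in the same directory.
python3 onnx_export.py --model-dir=models/my_dataset
```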
and finally ran detection on the video stream with TensorRT.
This whole process works with the general COCO dataset. With my custom dataset, every step completes and the TensorRT video streams fine, but it does not detect my custom objects.
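For completeness, this is roughly how detection is launched with a custom model (flags per the jetson-inference "Hello AI World" docs; the camera source and paths are examples):

```shell
# Run detectnet with the custom ONNX model instead of the built-in COCO network.
# Without --model/--labels, detectnet falls back to the default pretrained net.
detectnet \
    --model=models/my_dataset/ssd-mobilenet.onnx \
    --labels=models/my_dataset/labels.txt \
    --input-blob=input_0 \
    --output-cvg=scores \
    --output-bbox=boxes \
    /dev/video0
```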
What could be the reason, and can anyone suggest a solution?
Are you still using jetson-inference? I am using labelImg to annotate 'Tea Bottle' and 'Soya Bottle', but after training and exporting to ONNX I ran into the same issue you had: it detects the COCO classes and just relabels them as 'Tea Bottle' and 'Soya Bottle'. May I know how to fix this? I am training a custom dataset and only want to detect the Tea and Soya bottles, not the COCO classes.
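One thing worth double-checking when the network keeps producing COCO-shaped detections under your new names is the labels file you pass to detectnet. A small sanity check, as a sketch: it assumes the labels.txt convention that train_ssd.py writes (first line is the BACKGROUND class, then one custom class per line), and the class names are just this thread's examples.

```python
def check_labels(lines):
    """Return a list of problems found in the label lines (empty list = OK)."""
    problems = []
    labels = [ln.strip() for ln in lines if ln.strip()]
    if not labels:
        problems.append("labels file is empty")
        return problems
    # train_ssd.py's labels.txt starts with the implicit background class
    if labels[0] != "BACKGROUND":
        problems.append("first label should be BACKGROUND, got %r" % labels[0])
    if len(labels) < 2:
        problems.append("no custom classes listed after BACKGROUND")
    if len(set(labels)) != len(labels):
        problems.append("duplicate class names")
    return problems

# Example with the two classes from this thread:
print(check_labels(["BACKGROUND", "Tea Bottle", "Soya Bottle"]))  # prints []
print(check_labels(["Tea Bottle", "Soya Bottle"]))  # flags missing BACKGROUND
```

If the file checks out, the next thing to verify is that detectnet is actually given `--model` and `--labels` pointing at the custom ONNX directory, since otherwise it silently loads the default COCO network.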