If I let it use the default neural net, the system works fine. But if I manually tell it to use the default net (or any other net):
detectnet-camera.py --camera 0 --network ssd-mobilenet-v2
or
./detectnet-camera.py --camera 0 --network ssd-mobilenet-v2
then it fails with the error:
[TRT] detected model format - custom (extension '.')
[TRT] model format 'custom' not supported by jetson-inference
detectNet -- failed to initialize.
jetson.inference -- detectNet failed to load built-in network 'ssd-mobilenet-v2'
PyTensorNet_Dealloc()
Traceback (most recent call last):
  File "./detectnet-camera.py", line 49, in <module>
    net = jetson.inference.detectNet(opt.network, sys.argv, opt.threshold)
Exception: jetson.inference -- detectNet failed to load network
Hi,
It looks like the parameter isn't being parsed correctly:
[TRT] detected model format - custom (extension '.')
Could you try using --network=ssd-mobilenet-v2 (with an equals sign) instead?
Thanks.
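To see why the equals sign matters, here is a minimal sketch of a parser that only recognizes the --key=value token form (an assumption about how jetson-inference's command-line parsing behaves, based on the error above; `parse_flags` is a hypothetical helper, not part of the library). With space-separated arguments, the value never gets attached to its flag:

```python
def parse_flags(argv):
    """Toy parser: only --key=value tokens carry a value; bare --key gets None."""
    flags = {}
    for token in argv:
        if token.startswith("--"):
            key, sep, value = token[2:].partition("=")
            flags[key] = value if sep else None
    return flags

# Space-separated form: "ssd-mobilenet-v2" is a stray token, never parsed
print(parse_flags(["--camera", "0", "--network", "ssd-mobilenet-v2"]))
# {'camera': None, 'network': None}

# Equals form: values land where intended
print(parse_flags(["--camera=0", "--network=ssd-mobilenet-v2"]))
# {'camera': '0', 'network': 'ssd-mobilenet-v2'}
```

This would explain the "extension '.'" in the log: with no value attached to --network, the loader falls back to treating the empty string as a custom model path.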