Custom trained model detectNet jetson_inference

Oddly, when I change the order of parameters to this:

net = jetson.inference.detectNet(argv=['--model=Cones-and-Cells/ssd-mobilenet.onnx --labels=Cones-and-Cells/labels.txt --input-blob=input_0 --output-cvg=scores --output-bbox=boxes'])

The message changes to this:

[TRT] detected model format - custom (extension '.txt --input-blob=input_0 --output-cvg=scores --output-bbox=boxes')

It’s as if the parser didn’t recognize any of the spaces in the string and treated the whole thing as a single argument.
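
That would be consistent with argv being a list where each element is taken as one argument, so a single string containing all the flags is never split on spaces. A minimal sketch of the split form, assuming the same model and label paths from the post:

import jetson.inference

# Each flag is its own argv entry, so the parser receives five separate
# arguments instead of one long string (paths are the ones from the post).
net = jetson.inference.detectNet(argv=[
    '--model=Cones-and-Cells/ssd-mobilenet.onnx',
    '--labels=Cones-and-Cells/labels.txt',
    '--input-blob=input_0',
    '--output-cvg=scores',
    '--output-bbox=boxes'
])

With the arguments separated like this, the model extension should be detected as '.onnx' rather than the trailing text of the combined string.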