Perfectly fine .caffemodel file not being parsed: assert(blob_name_to_tensor) error

Good day!

I am using the TensorRT Python API to create an inference engine. I am working with a Caffe SSD model based on VGGNet, and this is what I have set up:

TensorRTInference.py

...
...
G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)

INPUT_LAYERS = ['data']
OUTPUT_LAYERS = ['detection_out']
INPUT_H = 300
INPUT_W = 300
OUTPUT_SIZE = 37

MODEL_PROTOTXT = './deploy.prototxt'
CAFFE_MODEL = './SSD_Recog300x300_iter_50000.caffemodel'

engine = trt.utils.caffe_to_trt_engine(G_LOGGER,
                                       MODEL_PROTOTXT,
                                       CAFFE_MODEL,
                                       1,            # max batch size
                                       1 << 20,      # max workspace size in bytes
                                       OUTPUT_LAYERS,
                                       trt.infer.DataType.FLOAT)
...
...
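Before calling TensorRT, I sanity-check the same two files in plain pycaffe, roughly like the sketch below (it assumes the SSD fork of Caffe is installed, since the model uses SSD-specific layers such as Normalize and PriorBox):

import caffe

# Load the same deploy/weights pair directly in pycaffe as a sanity check.
# NOTE: requires the SSD fork of Caffe; the stock BVLC build will not know
# the SSD-specific layers in this prototxt.
caffe.set_mode_cpu()
net = caffe.Net('./deploy.prototxt',
                './SSD_Recog300x300_iter_50000.caffemodel',
                caffe.TEST)
print(net.blobs['data'].data.shape)   # expect (1, 3, 300, 300) for SSD300
print(list(net.blobs.keys())[-3:])    # last blobs should include 'detection_out'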

They load and run fine there, but TensorRT returns the following error:

AssertionError: Caffe parsing failed on line 284 in statement assert(blob_name_to_tensor)

Why is my Caffe model not being parsed correctly?
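If it helps anyone reproduce this, the engine build can be repeated with a more verbose logger so the parser's own messages show up alongside the Python assertion. This is just a sketch: I am assuming the legacy trt.infer API also exposes an INFO severity level, and the other variables are the ones defined in my script above.

import tensorrt as trt

# Rebuild with a chattier logger to surface the parser's own error output.
# NOTE: assumes trt.infer.LogSeverity.INFO exists in this legacy API.
VERBOSE_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)

engine = trt.utils.caffe_to_trt_engine(VERBOSE_LOGGER,
                                       MODEL_PROTOTXT,
                                       CAFFE_MODEL,
                                       1,            # max batch size
                                       1 << 20,      # max workspace size in bytes
                                       OUTPUT_LAYERS,
                                       trt.infer.DataType.FLOAT)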

P.S.: Sorry for my bad English.

UPDATE: Here is more information on the error. I still haven't resolved it.

[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format ditcaffe.NetParameter: 817:14: Message type "ditcaffe.LayerParameter" has no field named "norm_param".
Could not parse deploy file
[TensorRT] ERROR: Failed to parse caffe model
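The protobuf message says the bundled ditcaffe.LayerParameter has no norm_param field, which is an SSD-specific parameter (it comes from the Normalize layer in the SSD fork of Caffe). A quick way to see which layers in the deploy file use such fields is to scan the prototxt directly; the sketch below does only that, and the names in SSD_SPECIFIC_FIELDS are my guess at likely SSD-only parameters, not an official list.

# Scan deploy.prototxt for parameter blocks the stock parser may not know.
# (Field names below are a guess at SSD-only parameters, not an official list.)
SSD_SPECIFIC_FIELDS = ('norm_param', 'prior_box_param', 'detection_output_param')

with open('./deploy.prototxt') as f:
    for line_no, line in enumerate(f, start=1):
        if any(field in line for field in SSD_SPECIFIC_FIELDS):
            print('line %d: %s' % (line_no, line.strip()))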

Hi. I also encountered this problem. Have you solved it?

Same here, can anyone provide support?

@blackvitriol
Hello, I got the same result and don't know how to fix it. Did you manage to resolve it?

I am encountering the same issue. Did anyone find a solution to this yet?

Has anyone solved this issue? If so, can you please help me with it?