Hi,
I converted my frozen .pb PoseNet model to UFF format. Then, when I try to create the builder, network, and parser in Python with the following instructions:
with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:
    parser.register_input("Placeholder", (1, 28, 28))
    parser.register_output("fc2/Relu")
    parser.parse(model_file, network)
I get the following error: Segmentation fault (core dumped)
Is my model supported by TensorRT, or did I make another mistake?
Thanks
Hi,
Usually, a segmentation fault is caused by a memory-related error.
Would you mind sharing a complete log with us so we can check it for you?
Thanks.
Thanks for the reply,
The only error message I get is "Segmentation fault (core dumped)", and a core.1808 file is generated each time I run my Python script.
I also tried running my model with the Python sample example and got the following error:
[TensorRT] ERROR: UffParser: Could not open models/posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.uff
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
Then I tried
"python convert_to_uff.py posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.pb -O NMS", but the conversion failed:
raise UffException(str(name) + " was not found in the graph. Please use the -l option to list nodes in the graph.")
uff.model.exceptions.UffException: NMS was not found in the graph. Please use the -l option to list nodes in the graph.
I have no idea what kind of model the converter expects for running inference afterwards.
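For what it's worth, the NMS output node appears to come from the object-detection samples (e.g. SSD), not from PoseNet, so the converter rightly cannot find it in this graph. The -l option that the error message itself suggests lists the actual node names, something like:

```shell
# List all node names in the frozen graph to find the real input/output nodes
python convert_to_uff.py posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.pb -l
```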
Hi,
The error indicates that no .uff file was found.
Would you mind checking that you passed the file path correctly?
ERROR: UffParser: Could not open models/posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.uff
Thanks.
It finds my model now, and I'm getting the following error:
Order size is not matching the number dimensions of TensorRT
Building TensorRT engine, this may take a few minutes…
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
During conversion I get the following message:
Loading posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.pb
NOTE: UFF has been tested with TensorFlow 1.15.0.
WARNING: The version of TensorFlow installed on this system is not guaranteed to work with UFF.
UFF Version 0.6.9
=== Automatically deduced input nodes ===
[name: "sub_2"
op: "Placeholder"
attr {
  key: "dtype"
  value {
    type: DT_FLOAT
  }
}
attr {
  key: "shape"
  value {
    shape {
      dim {
        size: 1
      }
      dim {
        size: 257
      }
      dim {
        size: 257
      }
      dim {
        size: 3
      }
    }
  }
}
]
=== Automatically deduced output nodes ===
[name: "MobilenetV1/heatmap_2/BiasAdd"
op: "Add"
input: "MobilenetV1/heatmap_2/BiasAdd/conv2d"
input: "MobilenetV1/heatmap_2/BiasAdd/conv_bias/read"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}
, name: "MobilenetV1/offset_2/BiasAdd"
op: "Add"
input: "MobilenetV1/offset_2/BiasAdd/conv2d"
input: "MobilenetV1/offset_2/BiasAdd/conv_bias/read"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}
, name: "MobilenetV1/displacement_bwd_2/BiasAdd"
op: "Add"
input: "MobilenetV1/displacement_bwd_2/BiasAdd/conv2d"
input: "MobilenetV1/displacement_bwd_2/BiasAdd/conv_bias/read"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}
, name: "MobilenetV1/displacement_fwd_2/BiasAdd"
op: "Add"
input: "MobilenetV1/displacement_fwd_2/BiasAdd/conv2d"
input: "MobilenetV1/displacement_fwd_2/BiasAdd/conv_bias/read"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}
]
Using output node MobilenetV1/heatmap_2/BiasAdd
Using output node MobilenetV1/offset_2/BiasAdd
Using output node MobilenetV1/displacement_bwd_2/BiasAdd
Using output node MobilenetV1/displacement_fwd_2/BiasAdd
Converting to UFF graph
DEBUG […/…/uff/converters/tensorflow/converter.py:143] Marking ['MobilenetV1/heatmap_2/BiasAdd', 'MobilenetV1/offset_2/BiasAdd', 'MobilenetV1/displacement_bwd_2/BiasAdd', 'MobilenetV1/displacement_fwd_2/BiasAdd'] as outputs
No. nodes: 156
UFF Output written to posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.uff
I have no idea where the problem comes from, because the outputs are detected correctly (the heatmap and offset outputs of the PoseNet model).
Before building the TensorRT engine, the call parser.parse(uff_model_path, network) returns False.
Script:
with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:
    parser.register_input("Placeholder", (3, 257, 257))
    parser.register_output("fc2/Relu")
    parser.parse(uff_model_path, network)
Hi,
It looks like the .pb -> .uff conversion succeeds, but the .uff -> .engine step does not.
Is our understanding correct?
According to the code you shared, the output layer names do not match the ones reported by the converter log.
Would you mind updating them first?
parser.register_output("MobilenetV1/heatmap_2/BiasAdd")
parser.register_output("MobilenetV1/offset_2/BiasAdd")
parser.register_output("MobilenetV1/displacement_bwd_2/BiasAdd")
parser.register_output("MobilenetV1/displacement_fwd_2/BiasAdd")
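For reference, a minimal sketch of the full parsing step, assuming the input node name sub_2 and the NHWC shape (1, 257, 257, 3) from the converter log above; trt.UffInputOrder.NHWC tells the parser the input layout (this requires TensorRT to be installed and is a sketch, not a verified fix):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
uff_model_path = "posenet_mobilenet_v1_100_257x257_multi_kpt_stripped.uff"

with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:
    # The converter log reports the input node "sub_2" with shape (1, 257, 257, 3),
    # i.e. NHWC, so register it in NHWC order instead of the default CHW.
    parser.register_input("sub_2", (257, 257, 3), trt.UffInputOrder.NHWC)
    parser.register_output("MobilenetV1/heatmap_2/BiasAdd")
    parser.register_output("MobilenetV1/offset_2/BiasAdd")
    parser.register_output("MobilenetV1/displacement_bwd_2/BiasAdd")
    parser.register_output("MobilenetV1/displacement_fwd_2/BiasAdd")
    if not parser.parse(uff_model_path, network):
        print("UFF parsing failed")
```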
Thanks.
Hi,
I tried with the correct output names, but I still get exactly the same error:
[TensorRT] ERROR: UffParser: Parser error: MobilenetV1/Conv2d_0/Relu6/conv2d: Order size is not matching the number dimensions of TensorRT
Building TensorRT engine, this may take a few minutes…
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
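For context, this "Order size is not matching the number dimensions" error usually points at a layout mismatch: TensorFlow frozen graphs are NHWC, while the UFF parser registers inputs as CHW by default. A runnable illustration (plain NumPy, shapes taken from the converter log) of the transpose that reconciles the two layouts on the preprocessing side:

```python
import numpy as np

# TensorFlow-style NHWC tensor: (batch, height, width, channels) = (1, 257, 257, 3),
# as reported for the "sub_2" input node in the converter log.
nhwc = np.zeros((1, 257, 257, 3), dtype=np.float32)

# Dropping the batch dimension and transposing HWC -> CHW gives the layout
# that matches register_input("...", (3, 257, 257)) on the TensorRT side.
chw = np.transpose(nhwc[0], (2, 0, 1))

print(chw.shape)  # (3, 257, 257)
```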
I saw that some operations are not supported by TensorRT, so I may need to use TF-TRT instead.