LPR tlt-converter segmentation fault

./tlt-converter -k nvidia_tlt -p image_input,1x3x48x96,4x3x48x96,16x3x48x96 models/LP/LPR/us_lprnet_baseline18_deployable.etlt -t fp16 -e models/LP/LPR/lpr_us_onnx_b16.engine
[WARNING] onnx2trt_utils.cpp:217: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[WARNING] Tensor DataType is determined at build time for tensors not marked as input or output.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[WARNING] Calling isShapeTensor before the entire network is constructed may result in an inaccurate result.
[INFO] Detected input dimensions from the model: (-1, 3, 48, 96)
[INFO] Model has dynamic shape. Setting up optimization profiles.
[INFO] Using optimization profile min shape: (1, 3, 48, 96) for input: image_input
[INFO] Using optimization profile opt shape: (4, 3, 48, 96) for input: image_input
[INFO] Using optimization profile max shape: (16, 3, 48, 96) for input: image_input
Segmentation fault (core dumped)

Hardware: Jetson
TensorRT: 7.1
JetPack: 4.4
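
For reference, a minimal way to double-check that the board really is on JetPack 4.4 / TensorRT 7.1 (the tlt-converter binary must match the installed TensorRT). These are standard JetPack commands; the exact package names in the output will differ per release:

# Shows the L4T release, which maps to the JetPack version
cat /etc/nv_tegra_release

# Lists the installed TensorRT packages and their versions
dpkg -l | grep -i tensorrt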

I followed the steps in GitHub - NVIDIA-AI-IOT/deepstream_lpr_app: Sample app code for LPR deployment on DeepStream, and I am using the official model.
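
As a sanity check (my own suggestion, not a step from the repo), it may be worth confirming that the .etlt file was downloaded completely before conversion; a truncated download or a saved HTML error page instead of the model can make the converter crash:

# The model should be several MB of binary data, not a small text/HTML file
ls -lh models/LP/LPR/us_lprnet_baseline18_deployable.etlt
file models/LP/LPR/us_lprnet_baseline18_deployable.etlt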

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

Where did you download the tlt-converter from?
Which Jetson device are you running on?
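
If it helps answer those questions, a couple of commands for gathering that information on the device (the device-tree path is standard on Jetson boards; I am assuming the converter prints its usage/build info with -h):

# Reports the exact Jetson module/devkit model
cat /proc/device-tree/model

# Confirms the binary is an aarch64 (Jetson) build and shows its usage text
file ./tlt-converter
./tlt-converter -h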

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.