Issue running objectDetector_SSD in DeepStream 5 on Jetson NX

I followed the README file in /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD to run it on a Jetson NX.
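
For reference, these are roughly the steps I followed from that README (reproduced from memory, so the exact convert_to_uff.py path, Python version, and CUDA_VER may differ on your JetPack install):

cd /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD
# frozen_inference_graph.pb copied here from the ssd_inception_v2_coco model
# Convert the frozen graph to UFF; the sampleUffSSD config.py does the graph
# surgery that maps the TensorFlow post-processing onto the NMS plugin node
python3 /usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py \
    frozen_inference_graph.pb -O NMS \
    -p /usr/src/tensorrt/samples/sampleUffSSD/config.py \
    -o sample_ssd_relu6.uff
# Build the custom bbox parser library referenced by config_infer_primary_ssd.txt
export CUDA_VER=10.2
make -C nvdsinfer_custom_impl_ssd
# Run the sample
deepstream-app -c deepstream_app_config_ssd.txt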

The application hangs with the following messages. How can I fix this?

Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
**PERF: 0.00 (0.00)
** INFO: <bus_callback:167>: Pipeline running

Could not find NMS layer buffer while parsing
0:01:44.702495757 8993 0xd2d0b20 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:564> [UID = 1]: Failed to parse bboxes using custom parse function

I also tried running it with the following command from the README; it still hangs.

gst-launch-1.0 filesrc location=../../samples/streams/sample_1080p_h264.mp4 ! decodebin ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path= config_infer_primary_ssd.txt ! nvvideoconvert ! nvdsosd ! nvegltransform ! nveglglessink
Warn: ‘threshold’ parameter has been deprecated. Use ‘pre-cluster-threshold’ instead.
Setting pipeline to PAUSED …

Using winsys: x11
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/sample_ssd_relu6.uff_b1_gpu0_fp32.engine open error
0:00:01.803197041 10228 0x5576e8eea0 WARN nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1566> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/sample_ssd_relu6.uff_b1_gpu0_fp32.engine failed
0:00:01.803342292 10228 0x5576e8eea0 WARN nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1673> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/sample_ssd_relu6.uff_b1_gpu0_fp32.engine failed, try rebuild
0:00:01.803376341 10228 0x5576e8eea0 INFO nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files
INFO: [TRT]:
INFO: [TRT]: --------------- Layers running on DLA:
INFO: [TRT]:
INFO: [TRT]: --------------- Layers running on GPU:
INFO: [TRT]: GridAnchor, FeatureExtractor/InceptionV2/InceptionV2/Conv2d_1a_7x7/separable_conv2d/depthwise, FeatureExtractor/InceptionV2/InceptionV2/Conv2d_1a_7x7/separable_conv2d + FeatureExtractor/InceptionV2/InceptionV2/Conv2d_1a_7x7/Relu6, FeatureExtractor/InceptionV2/InceptionV2/MaxPool_2a_3x3/MaxPool, FeatureExtractor/InceptionV2/InceptionV2/Conv2d_2b_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Conv2d_2b_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Conv2d_2c_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Conv2d_2c_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/MaxPool_3a_3x3/MaxPool, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_3/AvgPool_0a_3x3/AvgPool, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_3/Conv2d_0b_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_3/Conv2d_0b_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0a_1x1/Relu6 || FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_1/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_1/Conv2d_0a_1x1/Relu6 || FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_0/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_0/Conv2d_0a_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_1/Conv2d_0b_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_1/Conv2d_0b_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0b_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0b_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0c_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_2/Conv2d_0c_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3b/Branch_0/Conv2d_0a_1x1/Relu6 copy, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_3/AvgPool_0a_3x3/AvgPool, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_3/Conv2d_0b_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_3/Conv2d_0b_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0a_1x1/Relu6 || FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_1/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_1/Conv2d_0a_1x1/Relu6 || FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_0/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_0/Conv2d_0a_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_1/Conv2d_0b_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_1/Conv2d_0b_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0b_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0b_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0c_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_2/Conv2d_0c_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_3c/Branch_0/Conv2d_0a_1x1/Relu6 copy, FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_2/MaxPool_1a_3x3/MaxPool, FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_1/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_1/Conv2d_0a_1x1/Relu6 || 
FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_0/Conv2d_0a_1x1/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_0/Conv2d_0a_1x1/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_0/Conv2d_1a_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_0/Conv2d_1a_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_1/Conv2d_0b_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_1/Conv2d_0b_3x3/Relu6, FeatureExtractor/InceptionV2/InceptionV2/Mixed_4a/Branch_1/Conv2d_1a_3x3/Conv2D + FeatureExtractor/InceptionV2/InceptionV2/Mixed_
INFO: [TRT]: Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
INFO: [TRT]: Detected 1 inputs and 1 output network tensors.
0:01:31.925956049 10228 0x5576e8eea0 INFO nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1624> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/sample_ssd_relu6.uff_b1_gpu0_fp16.engine successfully
INFO: [Implicit Engine Info]: layers num: 2
0 INPUT kFLOAT Input 3x300x300
1 OUTPUT kFLOAT add_6 1x100x7

ERROR: [TRT]: INVALID_ARGUMENT: Cannot find binding of given name: MarkOutput_0
0:01:31.936409438 10228 0x5576e8eea0 WARN nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::checkBackendParams() <nvdsinfer_context_impl.cpp:1545> [UID = 1]: Could not find output layer ‘MarkOutput_0’ in engine
0:01:32.026955638 10228 0x5576e8eea0 INFO nvinfer gstnvinfer_impl.cpp:311:notifyLoadModelStatus: [UID 1]: Load new model:config_infer_primary_ssd.txt sucessfully
Pipeline is PREROLLING …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;

(gst-launch-1.0:10228): GStreamer-WARNING **: 16:43:57.079: Failed to load plugin ‘/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstlibav.so’: /usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block

(gst-launch-1.0:10228): GStreamer-WARNING **: 16:43:57.100: Failed to load plugin ‘/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstlibav.so’: /usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Could not find NMS layer buffer while parsing
0:01:33.935522844 10228 0x5576b474f0 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:564> [UID = 1]: Failed to parse bboxes using custom parse function
Caught SIGSEGV
#0 0x0000007f9c2b6048 in __GI___poll (fds=0x558dc50220, nfds=548082037360, timeout=) at …/sysdeps/unix/sysv/linux/poll.c:41
#1 0x0000007f9c3c2e40 in () at /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0
#2 0x00000055769febd0 in ()
Spinning. Please run ‘gdb gst-launch-1.0 10228’ to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

To summarize, the run fails on two related lookups: nvinfer cannot find the ‘MarkOutput_0’ output binding in the rebuilt engine, and the custom parser cannot find the ‘NMS’ layer buffer.
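
For reference, the relevant keys in the shipped config_infer_primary_ssd.txt (quoted from memory, so treat the exact values as approximate) tie the pipeline to the MarkOutput_0 output blob and the custom SSD parser, which expects the NMS layers that the rebuilt engine apparently does not expose:

[property]
uff-file=sample_ssd_relu6.uff
uff-input-blob-name=Input
uff-input-dims=3;300;300;0
output-blob-names=MarkOutput_0
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so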

Can you run it successfully with the default SSD model?
By the way, did you follow the README under the sample directory, sources/objectDetector_SSD?