back-to-back-detectors on Jetson TX2

I am trying to run back-to-back-detectors from https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps on a Jetson TX2 (no HDMI display connected; DeepStream 4.0.2), but an error occurs after frame 0, like this:

n@Jetson-TX2:/opt/nvidia/deepstream/deepstream-4.0/sources/apps/sample_apps/deepstream_reference_apps/back-to-back-detectors$ ./back-to-back-detectors ./…/…/…/…/…/samples/streams/sample_720p.h264
Now playing: ./…/…/…/…/…/samples/streams/sample_720p.h264

Using winsys: x11
Opening in BLOCKING MODE
Creating LL OSD context new
Running…
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Creating LL OSD context new
Frame Number = 0 Vehicle Count = 2 Person Count = 2 Face Count = 0 License Plate Count = 0
0:00:09.801626888 22468 0x558ea2a140 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop: error: Internal data stream error.
0:00:09.801685320 22468 0x558ea2a140 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop: error: streaming stopped, reason error (-5)
ERROR from element primary-nvinference-engine2: Internal data stream error.
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1830): gst_nvinfer_output_loop (): /GstPipeline:pipeline/GstNvInfer:primary-nvinference-engine2:
streaming stopped, reason error (-5)
Returned, stopping playback
0:00:09.806464617 22468 0x558ea2a320 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop: error: Internal data stream error.
0:00:09.806590185 22468 0x558ea2a320 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop: error: streaming stopped, reason error (-5)
Deleting pipeline

How do I solve this problem?
Thanks

Since back-to-back-detectors uses an EGL display sink, you need to have an HDMI display connected and either

  1. run the program from a terminal on that display,
    or
  2. “export DISPLAY=:1” and then run the program from a remote terminal, e.g. over ssh.

Without a display, I can reproduce the same failure you see.
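For what it's worth, here is a minimal sketch of a headless fallback, assuming the sample builds its renderer with "nveglglessink" the way the other DeepStream sample apps do (the element names and the fakesink fallback are my assumptions, not the shipped code): pick fakesink when no DISPLAY is available, so the detectors still run and print the per-frame counts, just without on-screen rendering.

/* Minimal sketch, not the actual back-to-back-detectors code: it assumes the
 * sample creates its renderer with "nveglglessink" (as other DeepStream
 * sample apps do). Falling back to fakesink when DISPLAY is unset lets the
 * pipeline run headless; the EGL sink needs an X display to render into. */
#include <gst/gst.h>
#include <stdlib.h>

static GstElement *
make_sink (void)
{
  const gchar *display = g_getenv ("DISPLAY");

  if (display == NULL || *display == '\0') {
    /* No X display (e.g. plain ssh session): use fakesink so the detectors
     * still run and print per-frame counts, just without on-screen output. */
    g_print ("No DISPLAY set, using fakesink (no rendering)\n");
    return gst_element_factory_make ("fakesink", "fake-renderer");
  }

  /* DISPLAY is set: use the EGL/GLES renderer as the sample normally does. */
  return gst_element_factory_make ("nveglglessink", "nvvideo-renderer");
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *sink = make_sink ();
  if (!sink) {
    g_printerr ("Failed to create sink element\n");
    return EXIT_FAILURE;
  }
  g_print ("Created sink: %s\n", GST_OBJECT_NAME (sink));

  gst_object_unref (sink);
  return EXIT_SUCCESS;
}

Note that on Jetson the EGL sink is normally preceded by an nvegltransform element, so any real modification of the sample would also have to adjust the pipeline linking accordingly.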

@mchi Thank you for your answer.