Unable to use camera and jetson-inference without HDMI monitor connected

Hello everybody,
I am trying to use jetson-inference to perform pose detection with a camera connected via USB. Unfortunately, I cannot always have an HDMI monitor connected to my Jetson, so I thought about using TeamViewer, and I set up a dummy screen so that it could work properly without HDMI connected.
As the docs say, I tried to run posenet /dev/video0. First I tried with the HDMI connected and it worked. Then I configured the Jetson to use the dummy screen, tried again, and it did not work: a black window, where the camera stream should be displayed, is opened and then closed with the error “Exception: jetson.utils – videoSource failed to capture image”. I thought it might be a problem with using a camera without a monitor connected, but if I view the camera stream through VLC it works perfectly even when no HDMI is plugged in.

Hi,
A dummy screen is not validated and may not work properly. You may try enabling RTSP and streaming the frames out instead.
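For example, with jetson-inference the output URI can be pointed at an RTSP endpoint instead of the display (a rough sketch; the exact syntax depends on your jetson-inference version):

posenet /dev/video0 rtsp://@:8554/stream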

Hi,
After some attempts, here is the best I have obtained:

  1. If I start an RTSP stream on another laptop, I am able to view it on the Jetson using ffplay rtsp://<laptop-ip>:8554/stream; however, if I run posenet rtsp://<laptop-ip>:8554/stream I get
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame
  2. If instead I connect a webcam to the Jetson and run posenet /dev/video0, it starts working and also opens the TensorRT window where I should see the image, but the window is completely black and I get this message repeatedly:
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-NV12.cu:154
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:42
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer]                     supported formats are:
[gstreamer]                        * rgb8
[gstreamer]                        * rgba8
[gstreamer]                        * rgb32f
[gstreamer]                        * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame
posenet: detected 1 person(s)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer message qos ==> v4l2src0

[TRT]    ------------------------------------------------
[TRT]    Timing Report /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    ------------------------------------------------
[TRT]    Pre-Process   CPU   0.03181ms  CUDA   0.08525ms
[TRT]    Network       CPU  17.65248ms  CUDA  16.42086ms
[TRT]    Post-Process  CPU   0.97236ms  CUDA   0.96922ms
[TRT]    Visualize     CPU   0.21245ms  CUDA   0.22294ms
[TRT]    Total         CPU  18.86910ms  CUDA  17.69827ms
[TRT]    ------------------------------------------------

So it looks like it is working, but not showing the image.

  3. I tried to forward the output to an RTSP server, using posenet /dev/video0 rtsp://@:8554/stream, but when I try to open it with VLC or ffplay I get “Connection refused”. I added an iptables rule to allow connections on port 8554 from the local network (roughly the rule sketched below), but that did not solve it.
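The rule I added looked roughly like this (a sketch; 192.168.1.0/24 is just a placeholder for my local subnet):

sudo iptables -A INPUT -p tcp --dport 8554 -s 192.168.1.0/24 -j ACCEPT
sudo iptables -A INPUT -p udp --dport 8554 -s 192.168.1.0/24 -j ACCEPT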

Any other suggestions?

Hi @defalco , can you post the part higher up in the log where the initial error occurred? Usually the general CUDA ‘999’ error comes after another error. Or, if this is OpenGL related (since it only occurs with a window), can you check the glxinfo utility and make sure the NVIDIA OpenGL driver is still being used properly?
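For example, something along these lines should show which renderer is active (if it reports llvmpipe or another software renderer instead of NVIDIA, that would explain the CUDA/OpenGL errors):

glxinfo | grep "OpenGL renderer"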

IIRC it is normal for this to happen a few times at startup, until the RTSP stream negotiates/syncs and GStreamer actually starts receiving the frames. I’m guessing that in your case this never happens? Are you able to test it by using GStreamer on the other end to send the stream?
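For example, assuming the gst-rtsp-server test-launch example is built on the laptop, a test stream could be served with something like this (by default it would be exposed at rtsp://<laptop-ip>:8554/test):

./test-launch "( videotestsrc ! video/x-raw,width=1280,height=720,framerate=30/1 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"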

I have some notes from testing this during development here:

Hi,
here’s the complete log:

[gstreamer] initialized gstreamer, version 1.16.3.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video2, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.xhci-3, v4l2.device.version=(uint)330344, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.xhci-3, v4l2.device.version=(uint)330344, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 43 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [6] video/x-raw, format=(string)YUY2, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [7] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [8] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [9] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [10] video/x-raw, format=(string)YUY2, width=(int)440, height=(int)440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [11] video/x-raw, format=(string)YUY2, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [12] video/x-raw, format=(string)YUY2, width=(int)340, height=(int)340, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [13] video/x-raw, format=(string)YUY2, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [14] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [15] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [16] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [17] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [18] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [19] image/jpeg, width=(int)4096, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [20] image/jpeg, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [21] image/jpeg, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [22] image/jpeg, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [23] image/jpeg, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [24] image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [25] image/jpeg, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [26] image/jpeg, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [27] image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [28] image/jpeg, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [29] image/jpeg, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [30] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 120/1, 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [31] image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [32] image/jpeg, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [33] image/jpeg, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [34] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [35] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [36] image/jpeg, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [37] image/jpeg, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [38] image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [39] video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [40] video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [41] video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [42] video/x-raw, format=(string)NV12, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] gstCamera -- selected device profile:  codec=raw format=nv12 width=1280 height=720
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 do-timestamp=true ! video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720 ! appsink name=mysink
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video]  created gstCamera from v4l2:///dev/video0
------------------------------------------------
gstCamera video options:
------------------------------------------------
  -- URI: v4l2:///dev/video0
     - protocol:  v4l2
     - location:  /dev/video0
  -- deviceType: v4l2
  -- ioType:     input
  -- codec:      raw
  -- width:      1280
  -- height:     720
  -- frameRate:  30.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- X window resolution:    1920x1080
[OpenGL] glDisplay -- display device initialized (1920x1080)
[video]  created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
  -- URI: display://0
     - protocol:  display
     - location:  0
  -- deviceType: display
  -- ioType:     output
  -- codec:      raw
  -- width:      1920
  -- height:     1080
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------

poseNet -- loading pose estimation model from:
        -- model        networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
        -- topology     networks/Pose-ResNet18-Body/human_pose.json
        -- colors       networks/Pose-ResNet18-Body/colors.txt
        -- input_blob   'input'
        -- output_cmap  'cmap'
        -- output_paf   'paf'
        -- threshold    0.150000
        -- batch_size   1

[TRT]    topology -- keypoint 0  nose
[TRT]    topology -- keypoint 1  left_eye
[TRT]    topology -- keypoint 2  right_eye
[TRT]    topology -- keypoint 3  left_ear
[TRT]    topology -- keypoint 4  right_ear
[TRT]    topology -- keypoint 5  left_shoulder
[TRT]    topology -- keypoint 6  right_shoulder
[TRT]    topology -- keypoint 7  left_elbow
[TRT]    topology -- keypoint 8  right_elbow
[TRT]    topology -- keypoint 9  left_wrist
[TRT]    topology -- keypoint 10  right_wrist
[TRT]    topology -- keypoint 11  left_hip
[TRT]    topology -- keypoint 12  right_hip
[TRT]    topology -- keypoint 13  left_knee
[TRT]    topology -- keypoint 14  right_knee
[TRT]    topology -- keypoint 15  left_ankle
[TRT]    topology -- keypoint 16  right_ankle
[TRT]    topology -- keypoint 17  neck
[TRT]    topology -- skeleton link 0  16 14
[TRT]    topology -- skeleton link 1  14 12
[TRT]    topology -- skeleton link 2  17 15
[TRT]    topology -- skeleton link 3  15 13
[TRT]    topology -- skeleton link 4  12 13
[TRT]    topology -- skeleton link 5  6 8
[TRT]    topology -- skeleton link 6  7 9
[TRT]    topology -- skeleton link 7  8 10
[TRT]    topology -- skeleton link 8  9 11
[TRT]    topology -- skeleton link 9  2 3
[TRT]    topology -- skeleton link 10  1 2
[TRT]    topology -- skeleton link 11  1 3
[TRT]    topology -- skeleton link 12  2 4
[TRT]    topology -- skeleton link 13  3 5
[TRT]    topology -- skeleton link 14  4 6
[TRT]    topology -- skeleton link 15  5 7
[TRT]    topology -- skeleton link 16  18 1
[TRT]    topology -- skeleton link 17  18 6
[TRT]    topology -- skeleton link 18  18 7
[TRT]    topology -- skeleton link 19  18 12
[TRT]    topology -- skeleton link 20  18 13
[TRT]    poseNet -- keypoint 00 'nose'  color 255 0 85 255
[TRT]    poseNet -- keypoint 01 'left_eye'  color 255 0 0 255
[TRT]    poseNet -- keypoint 02 'right_eye'  color 255 85 0 255
[TRT]    poseNet -- keypoint 03 'left_ear'  color 255 170 0 255
[TRT]    poseNet -- keypoint 04 'right_ear'  color 255 255 0 255
[TRT]    poseNet -- keypoint 05 'left_shoulder'  color 170 255 0 255
[TRT]    poseNet -- keypoint 06 'right_shoulder'  color 85 255 0 255
[TRT]    poseNet -- keypoint 07 'left_elbow'  color 0 255 0 255
[TRT]    poseNet -- keypoint 08 'right_elbow'  color 0 255 85 255
[TRT]    poseNet -- keypoint 09 'left_wrist'  color 0 255 170 255
[TRT]    poseNet -- keypoint 10 'right_wrist'  color 0 255 255 255
[TRT]    poseNet -- keypoint 11 'left_hip'  color 0 170 255 255
[TRT]    poseNet -- keypoint 12 'right_hip'  color 0 85 255 255
[TRT]    poseNet -- keypoint 13 'left_knee'  color 0 0 255 255
[TRT]    poseNet -- keypoint 14 'right_knee'  color 255 0 170 255
[TRT]    poseNet -- keypoint 15 'left_ankle'  color 170 0 255 255
[TRT]    poseNet -- keypoint 16 'right_ankle'  color 255 0 255 255
[TRT]    poseNet -- keypoint 17 'neck'  color 85 0 255 255
[TRT]    poseNet -- loaded 18 class colors
[TRT]    TensorRT version 8.4.1
[TRT]    loading NVIDIA plugins...
[TRT]    Registered plugin creator - ::GridAnchor_TRT version 1
[TRT]    Registered plugin creator - ::GridAnchorRect_TRT version 1
[TRT]    Registered plugin creator - ::NMS_TRT version 1
[TRT]    Registered plugin creator - ::Reorg_TRT version 1
[TRT]    Registered plugin creator - ::Region_TRT version 1
[TRT]    Registered plugin creator - ::Clip_TRT version 1
[TRT]    Registered plugin creator - ::LReLU_TRT version 1
[TRT]    Registered plugin creator - ::PriorBox_TRT version 1
[TRT]    Registered plugin creator - ::Normalize_TRT version 1
[TRT]    Registered plugin creator - ::ScatterND version 1
[TRT]    Registered plugin creator - ::RPROI_TRT version 1
[TRT]    Registered plugin creator - ::BatchedNMS_TRT version 1
[TRT]    Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[TRT]    Registered plugin creator - ::BatchTilePlugin_TRT version 1
[TRT]    Could not register plugin creator -  ::FlattenConcat_TRT version 1
[TRT]    Registered plugin creator - ::CropAndResize version 1
[TRT]    Registered plugin creator - ::CropAndResizeDynamic version 1
[TRT]    Registered plugin creator - ::DetectionLayer_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_Explicit_TF_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_Implicit_TF_TRT version 1
[TRT]    Registered plugin creator - ::ProposalDynamic version 1
[TRT]    Registered plugin creator - ::Proposal version 1
[TRT]    Registered plugin creator - ::ProposalLayer_TRT version 1
[TRT]    Registered plugin creator - ::PyramidROIAlign_TRT version 1
[TRT]    Registered plugin creator - ::ResizeNearest_TRT version 1
[TRT]    Registered plugin creator - ::Split version 1
[TRT]    Registered plugin creator - ::SpecialSlice_TRT version 1
[TRT]    Registered plugin creator - ::InstanceNormalization_TRT version 1
[TRT]    Registered plugin creator - ::InstanceNormalization_TRT version 2
[TRT]    Registered plugin creator - ::CoordConvAC version 1
[TRT]    Registered plugin creator - ::DecodeBbox3DPlugin version 1
[TRT]    Registered plugin creator - ::GenerateDetection_TRT version 1
[TRT]    Registered plugin creator - ::MultilevelCropAndResize_TRT version 1
[TRT]    Registered plugin creator - ::MultilevelProposeROI_TRT version 1
[TRT]    Registered plugin creator - ::NMSDynamic_TRT version 1
[TRT]    Registered plugin creator - ::PillarScatterPlugin version 1
[TRT]    Registered plugin creator - ::VoxelGeneratorPlugin version 1
[TRT]    Registered plugin creator - ::MultiscaleDeformableAttnPlugin_TRT version 1
[TRT]    detected model format - ONNX  (extension '.onnx')
[TRT]    desired precision specified for GPU: FASTEST
[TRT]    requested fasted precision for device GPU without providing valid calibrator, disabling INT8
[TRT]    [MemUsageChange] Init CUDA: CPU +181, GPU +0, now: CPU 225, GPU 4182 (MiB)
[TRT]    [MemUsageChange] Init builder kernel library: CPU +131, GPU +118, now: CPU 375, GPU 4319 (MiB)
[TRT]    native precisions detected for GPU:  FP32, FP16, INT8
[TRT]    selecting fastest native precision for GPU:  FP16
[TRT]    found engine cache file /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.1.1.8401.GPU.FP16.engine
[TRT]    found model checksum /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum
[TRT]    echo "$(cat /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum) /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx" | sha256sum --check --status
[TRT]    model matched checksum /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum
[TRT]    loading network plan from engine cache... /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.1.1.8401.GPU.FP16.engine
[TRT]    device GPU, loaded /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 286, GPU 4365 (MiB)
[TRT]    Loaded engine size: 40 MiB
[TRT]    Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
[TRT]    Using cublasLt as a tactic source
[TRT]    [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +260, GPU +149, now: CPU 548, GPU 4514 (MiB)
[TRT]    Deserialization required 1391110 microseconds.
[TRT]    [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +39, now: CPU 0, GPU 39 (MiB)
[TRT]    Using cublasLt as a tactic source
[TRT]    [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 548, GPU 4514 (MiB)
[TRT]    Total per-runner device persistent memory is 65536
[TRT]    Total per-runner host persistent memory is 71072
[TRT]    Allocated activation device memory of size 8028160
[TRT]    [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +8, now: CPU 0, GPU 47 (MiB)
[TRT]    The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
[TRT]    
[TRT]    CUDA engine context initialized on device GPU:
[TRT]       -- layers       36
[TRT]       -- maxBatchSize 1
[TRT]       -- deviceMemory 8028160
[TRT]       -- bindings     3
[TRT]       binding 0
                -- index   0
                -- name    'input'
                -- type    FP32
                -- in/out  INPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  3
                -- dim #2  224
                -- dim #3  224
[TRT]       binding 1
                -- index   1
                -- name    'cmap'
                -- type    FP32
                -- in/out  OUTPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  18
                -- dim #2  56
                -- dim #3  56
[TRT]       binding 2
                -- index   2
                -- name    'paf'
                -- type    FP32
                -- in/out  OUTPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  42
                -- dim #2  56
                -- dim #3  56
[TRT]    
[TRT]    binding to input 0 input  binding index:  0
[TRT]    binding to input 0 input  dims (b=1 c=3 h=224 w=224) size=602112
[TRT]    binding to output 0 cmap  binding index:  1
[TRT]    binding to output 0 cmap  dims (b=1 c=18 h=56 w=56) size=225792
[TRT]    binding to output 1 paf  binding index:  2
[TRT]    binding to output 1 paf  dims (b=1 c=42 h=56 w=56) size=526848
[TRT]    
[TRT]    device GPU, /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx initialized.
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame
[gstreamer] gstBufferManager recieve caps:  video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
[gstreamer] gstBufferManager -- recieved first frame, codec=raw format=nv12 width=1280 height=720 size=1382400
RingBuffer -- allocated 4 buffers (1382400 bytes each, 5529600 bytes total)
RingBuffer -- allocated 4 buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2764800 bytes each, 11059200 bytes total)
posenet: detected 0 person(s)
[OpenGL] creating 1280x720 texture (GL_RGB8 format, 2764800 bytes)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer message qos ==> v4l2src0
[gstreamer] gstreamer message qos ==> v4l2src0
[gstreamer] gstreamer message qos ==> v4l2src0

[TRT]    ------------------------------------------------
[TRT]    Timing Report /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    ------------------------------------------------
[TRT]    Pre-Process   CPU   0.05059ms  CUDA   0.22499ms
[TRT]    Network       CPU  42.32537ms  CUDA  40.79744ms
[TRT]    Post-Process  CPU   1.05347ms  CUDA   0.63866ms
[TRT]    Visualize     CPU   0.01104ms  CUDA   0.01133ms
[TRT]    Total         CPU  43.44047ms  CUDA  41.67242ms
[TRT]    ------------------------------------------------

[TRT]    note -- when processing a single image, run 'sudo jetson_clocks' before
                to disable DVFS for more accurate profiling/timing measurements

[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-NV12.cu:154
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:42
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer]                     supported formats are:
[gstreamer]                        * rgb8
[gstreamer]                        * rgba8
[gstreamer]                        * rgb32f
[gstreamer]                        * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame
[gstreamer] gstreamer message qos ==> v4l2src0
posenet: detected 0 person(s)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer message qos ==> v4l2src0
[gstreamer] gstreamer message qos ==> v4l2src0

[TRT]    ------------------------------------------------
[TRT]    Timing Report /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    ------------------------------------------------
[TRT]    Pre-Process   CPU   0.03142ms  CUDA   0.12650ms
[TRT]    Network       CPU  26.00183ms  CUDA  24.32755ms
[TRT]    Post-Process  CPU   0.53050ms  CUDA   0.52458ms
[TRT]    Visualize     CPU   0.00662ms  CUDA   0.00688ms
[TRT]    Total         CPU  26.57038ms  CUDA  24.98550ms
[TRT]    ------------------------------------------------

If I run glxinfo | grep 'OpenGL renderer string', I get OpenGL renderer string: llvmpipe (LLVM 12.0.0, 128 bits)

Instead, if I run video-viewer to stream from the Jetson, I get this:

[gstreamer] initialized gstreamer, version 1.16.3.0
[gstreamer] gstDecoder -- creating decoder for ../Downloads/SampleVideo_1280x720_1mb.mp4
libEGL warning: DRI2: failed to authenticate
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
[gstreamer] gstDecoder -- discovered video resolution: 1280x720  (framerate 25.000000 Hz)
[gstreamer] gstDecoder -- discovered video caps:  video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3.1, profile=(string)main, width=(int)1280, height=(int)720, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
[gstreamer] gstDecoder -- pipeline string:
[gstreamer] filesrc location=../Downloads/SampleVideo_1280x720_1mb.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw ! appsink name=mysink
[video]  created gstDecoder from file:///home/mister/jetson-inference/../Downloads/SampleVideo_1280x720_1mb.mp4
------------------------------------------------
gstDecoder video options:
------------------------------------------------
  -- URI: file:///home/mister/jetson-inference/../Downloads/SampleVideo_1280x720_1mb.mp4
     - protocol:  file
     - location:  ../Downloads/SampleVideo_1280x720_1mb.mp4
     - extension: mp4
  -- deviceType: file
  -- ioType:     input
  -- codec:      h264
  -- width:      1280
  -- height:     720
  -- frameRate:  25.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc bitrate=4000000 ! video/x-h264 !  rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=5000 auto-multicast=true
[video]  created gstEncoder from rtp://@:5000
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: rtp://@:5000
     - protocol:  rtp
     - location:  127.0.0.1
     - port:      5000
  -- deviceType: ip
  -- ioType:     output
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  30.000000
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- X window resolution:    1920x1080
[OpenGL] glDisplay -- display device initialized (1920x1080)
[video]  created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
  -- URI: display://0
     - protocol:  display
     - location:  0
  -- deviceType: display
  -- ioType:     output
  -- codec:      raw
  -- width:      1920
  -- height:     1080
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[gstreamer] opening gstDecoder for streaming, transitioning pipeline to GST_STATE_PLAYING
Opening in BLOCKING MODE 
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter2
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> nvv4l2decoder1
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse1
[gstreamer] gstreamer changed state from NULL to READY ==> queue0
[gstreamer] gstreamer changed state from NULL to READY ==> qtdemux1
[gstreamer] gstreamer changed state from NULL to READY ==> filesrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter2
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvv4l2decoder1
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse1
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> queue0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer stream status CREATE ==> sink
[gstreamer] gstreamer changed state from READY to PAUSED ==> qtdemux1
[gstreamer] gstreamer changed state from READY to PAUSED ==> filesrc0
[gstreamer] gstreamer stream status ENTER ==> sink
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstreamer message duration-changed ==> h264parse1
[gstreamer] gstDecoder -- onPreroll()
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ /\ AVC", bitrate=(uint)1198729;
[gstreamer] gstreamer mysink taglist, datetime=(datetime)1970-01-01T00:00:00Z, encoder=(string)Lavf53.24.2, container-format=(string)"ISO\ MP4/M4A";
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729;
[gstreamer] gstBufferManager -- map buffer size was less than max size (1382400 vs 1382407)
[gstreamer] gstBufferManager recieve caps:  video/x-raw, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=1280 height=720 size=1382407
RingBuffer -- allocated 4 buffers (1382407 bytes each, 5529628 bytes total)
RingBuffer -- allocated 4 buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvv4l2decoder1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> queue0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> qtdemux1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> filesrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2764800 bytes each, 11059200 bytes total)
video-viewer:  captured 1 frames (1280 x 720)
[OpenGL] creating 1280x720 texture (GL_RGB8 format, 2764800 bytes)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)793200;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)816200;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)926200;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)999200;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)1181800;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)1199000;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)1283800;
RingBuffer -- allocated 2 buffers (1382400 bytes each, 2764800 bytes total)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-YV12.cu:257
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:128
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstEncoder.cpp:562
[gstreamer] gstEncoder::Render() -- unsupported image format (rgb8)
[gstreamer]                         supported formats are:
[gstreamer]                             * rgb8
[gstreamer]                             * rgba8
[gstreamer]                             * rgb32f
[gstreamer]                             * rgba32f
video-viewer:  captured 2 frames (1280 x 720)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", bitrate=(uint)1198729, minimum-bitrate=(uint)793200, maximum-bitrate=(uint)1352400;
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-YV12.cu:257
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:128
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstEncoder.cpp:562
[gstreamer] gstEncoder::Render() -- unsupported image format (rgb8)
[gstreamer]                         supported formats are:
[gstreamer]                             * rgb8
[gstreamer]                             * rgba8
[gstreamer]                             * rgb32f
[gstreamer]                             * rgba32f

OK yes, the errors you are getting are related to CUDA<->OpenGL interoperability, and it seems like the NVIDIA OpenGL driver has been replaced with another vendor’s. If you are able to restore it, it should start working again. Or you can continue running headlessly and watch the network stream remotely.
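If you go the headless route, a sketch like this should avoid creating the OpenGL window entirely and just push the stream out over the network (RTP to a specific client here as an example; substitute your own output URI):

posenet --headless /dev/video0 rtp://<client-ip>:5000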

When I have the HDMI monitor connected, it uses the NVIDIA driver. When I unplug the HDMI and use TeamViewer, it switches to llvmpipe.
Also, I cannot watch the stream remotely: when I run posenet /dev/video0 rtsp://@:8554/stream and try to open the stream from another device, I always get a “Connection refused” error.

OK yes, the CUDA<->OpenGL interoperability won’t work with X11 forwarding or llvmpipe. You can also try the WebRTC output from jetson-inference/jetson-utils and see if you can connect that way. For RTP/RTSP, I have better luck viewing with GStreamer on the client side than with VLC or ffplay.

Hi, even with WebRTC I only get this error:

[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer]                     supported formats are:
[gstreamer]                        * rgb8
[gstreamer]                        * rgba8
[gstreamer]                        * rgb32f
[gstreamer]                        * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame

I tried both GStreamer and ffmpeg, but neither works.

@defalco sorry I meant WebRTC output, not input - are you able to run posenet /dev/video0 webrtc://@:8554/stream ?

Then navigate your browser (Chrome) to http://YOUR_JETSON-IP:8554 (you may need to disable chrome://flags#enable-webrtc-hide-local-ips-with-mdns first).

If you want to output RTP/RTSP from the Jetson, start with RTP and use GStreamer on the other end first to check that you can view it (like here).
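On the receiving machine, a viewer pipeline along these lines is what I normally use (a sketch assuming H.264 over RTP on port 5000; adjust the port to match your output URI):

gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink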

If you keep getting errors, please post the whole log so I can check it again - good luck!

Yeah, I understood you meant output… sorry, I explained myself badly. If I run posenet /dev/video0 webrtc://@:8554/stream I get that error on the Jetson (the TensorRT window does get opened, but it is completely black). On the other PC, when I try to navigate to http://JETSON-IP:8554/stream I get a page with a “Connection refused” error.

If I try with RTP, I get the same error on the Jetson, and on the GStreamer end nothing happens… this message is printed:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96

and it stays there indefinitely.

The complete log of the error on the Jetson is the following:

[gstreamer] initialized gstreamer, version 1.16.3.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video2, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.xhci-3, v4l2.device.version=(uint)330344, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.xhci-3, v4l2.device.version=(uint)330344, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 43 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [6] video/x-raw, format=(string)YUY2, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [7] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [8] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [9] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [10] video/x-raw, format=(string)YUY2, width=(int)440, height=(int)440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [11] video/x-raw, format=(string)YUY2, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [12] video/x-raw, format=(string)YUY2, width=(int)340, height=(int)340, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [13] video/x-raw, format=(string)YUY2, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [14] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [15] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [16] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [17] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [18] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [19] image/jpeg, width=(int)4096, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [20] image/jpeg, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [21] image/jpeg, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [22] image/jpeg, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [23] image/jpeg, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [24] image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [25] image/jpeg, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [26] image/jpeg, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [27] image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [28] image/jpeg, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [29] image/jpeg, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [30] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 120/1, 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [31] image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [32] image/jpeg, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [33] image/jpeg, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [34] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [35] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [36] image/jpeg, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [37] image/jpeg, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [38] image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [39] video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [40] video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [41] video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [42] video/x-raw, format=(string)NV12, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] gstCamera -- selected device profile:  codec=raw format=nv12 width=1280 height=720
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 do-timestamp=true ! video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720 ! appsink name=mysink
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video]  created gstCamera from v4l2:///dev/video0
------------------------------------------------
gstCamera video options:
------------------------------------------------
  -- URI: v4l2:///dev/video0
     - protocol:  v4l2
     - location:  /dev/video0
  -- deviceType: v4l2
  -- ioType:     input
  -- codec:      raw
  -- width:      1280
  -- height:     720
  -- frameRate:  30.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc bitrate=4000000 ! video/x-h264 !  rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=8554 auto-multicast=true
libEGL warning: DRI2: failed to authenticate
[video]  created gstEncoder from rtp://@:8554/stream
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: rtp://@:8554/stream
     - protocol:  rtp
     - location:  127.0.0.1
     - port:      8554
  -- deviceType: ip
  -- ioType:     output
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  30.000000
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- X window resolution:    1920x1080
[OpenGL] glDisplay -- display device initialized (1920x1080)
[video]  created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
  -- URI: display://0
     - protocol:  display
     - location:  0
  -- deviceType: display
  -- ioType:     output
  -- codec:      raw
  -- width:      1920
  -- height:     1080
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------

poseNet -- loading pose estimation model from:
        -- model        networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
        -- topology     networks/Pose-ResNet18-Body/human_pose.json
        -- colors       networks/Pose-ResNet18-Body/colors.txt
        -- input_blob   'input'
        -- output_cmap  'cmap'
        -- output_paf   'paf'
        -- threshold    0.150000
        -- batch_size   1

[TRT]    topology -- keypoint 0  nose
[TRT]    topology -- keypoint 1  left_eye
[TRT]    topology -- keypoint 2  right_eye
[TRT]    topology -- keypoint 3  left_ear
[TRT]    topology -- keypoint 4  right_ear
[TRT]    topology -- keypoint 5  left_shoulder
[TRT]    topology -- keypoint 6  right_shoulder
[TRT]    topology -- keypoint 7  left_elbow
[TRT]    topology -- keypoint 8  right_elbow
[TRT]    topology -- keypoint 9  left_wrist
[TRT]    topology -- keypoint 10  right_wrist
[TRT]    topology -- keypoint 11  left_hip
[TRT]    topology -- keypoint 12  right_hip
[TRT]    topology -- keypoint 13  left_knee
[TRT]    topology -- keypoint 14  right_knee
[TRT]    topology -- keypoint 15  left_ankle
[TRT]    topology -- keypoint 16  right_ankle
[TRT]    topology -- keypoint 17  neck
[TRT]    topology -- skeleton link 0  16 14
[TRT]    topology -- skeleton link 1  14 12
[TRT]    topology -- skeleton link 2  17 15
[TRT]    topology -- skeleton link 3  15 13
[TRT]    topology -- skeleton link 4  12 13
[TRT]    topology -- skeleton link 5  6 8
[TRT]    topology -- skeleton link 6  7 9
[TRT]    topology -- skeleton link 7  8 10
[TRT]    topology -- skeleton link 8  9 11
[TRT]    topology -- skeleton link 9  2 3
[TRT]    topology -- skeleton link 10  1 2
[TRT]    topology -- skeleton link 11  1 3
[TRT]    topology -- skeleton link 12  2 4
[TRT]    topology -- skeleton link 13  3 5
[TRT]    topology -- skeleton link 14  4 6
[TRT]    topology -- skeleton link 15  5 7
[TRT]    topology -- skeleton link 16  18 1
[TRT]    topology -- skeleton link 17  18 6
[TRT]    topology -- skeleton link 18  18 7
[TRT]    topology -- skeleton link 19  18 12
[TRT]    topology -- skeleton link 20  18 13
[TRT]    poseNet -- keypoint 00 'nose'  color 255 0 85 255
[TRT]    poseNet -- keypoint 01 'left_eye'  color 255 0 0 255
[TRT]    poseNet -- keypoint 02 'right_eye'  color 255 85 0 255
[TRT]    poseNet -- keypoint 03 'left_ear'  color 255 170 0 255
[TRT]    poseNet -- keypoint 04 'right_ear'  color 255 255 0 255
[TRT]    poseNet -- keypoint 05 'left_shoulder'  color 170 255 0 255
[TRT]    poseNet -- keypoint 06 'right_shoulder'  color 85 255 0 255
[TRT]    poseNet -- keypoint 07 'left_elbow'  color 0 255 0 255
[TRT]    poseNet -- keypoint 08 'right_elbow'  color 0 255 85 255
[TRT]    poseNet -- keypoint 09 'left_wrist'  color 0 255 170 255
[TRT]    poseNet -- keypoint 10 'right_wrist'  color 0 255 255 255
[TRT]    poseNet -- keypoint 11 'left_hip'  color 0 170 255 255
[TRT]    poseNet -- keypoint 12 'right_hip'  color 0 85 255 255
[TRT]    poseNet -- keypoint 13 'left_knee'  color 0 0 255 255
[TRT]    poseNet -- keypoint 14 'right_knee'  color 255 0 170 255
[TRT]    poseNet -- keypoint 15 'left_ankle'  color 170 0 255 255
[TRT]    poseNet -- keypoint 16 'right_ankle'  color 255 0 255 255
[TRT]    poseNet -- keypoint 17 'neck'  color 85 0 255 255
[TRT]    poseNet -- loaded 18 class colors
[TRT]    TensorRT version 8.4.1
[TRT]    loading NVIDIA plugins...
[TRT]    Registered plugin creator - ::GridAnchor_TRT version 1
[TRT]    Registered plugin creator - ::GridAnchorRect_TRT version 1
[TRT]    Registered plugin creator - ::NMS_TRT version 1
[TRT]    Registered plugin creator - ::Reorg_TRT version 1
[TRT]    Registered plugin creator - ::Region_TRT version 1
[TRT]    Registered plugin creator - ::Clip_TRT version 1
[TRT]    Registered plugin creator - ::LReLU_TRT version 1
[TRT]    Registered plugin creator - ::PriorBox_TRT version 1
[TRT]    Registered plugin creator - ::Normalize_TRT version 1
[TRT]    Registered plugin creator - ::ScatterND version 1
[TRT]    Registered plugin creator - ::RPROI_TRT version 1
[TRT]    Registered plugin creator - ::BatchedNMS_TRT version 1
[TRT]    Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[TRT]    Registered plugin creator - ::BatchTilePlugin_TRT version 1
[TRT]    Could not register plugin creator -  ::FlattenConcat_TRT version 1
[TRT]    Registered plugin creator - ::CropAndResize version 1
[TRT]    Registered plugin creator - ::CropAndResizeDynamic version 1
[TRT]    Registered plugin creator - ::DetectionLayer_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_Explicit_TF_TRT version 1
[TRT]    Registered plugin creator - ::EfficientNMS_Implicit_TF_TRT version 1
[TRT]    Registered plugin creator - ::ProposalDynamic version 1
[TRT]    Registered plugin creator - ::Proposal version 1
[TRT]    Registered plugin creator - ::ProposalLayer_TRT version 1
[TRT]    Registered plugin creator - ::PyramidROIAlign_TRT version 1
[TRT]    Registered plugin creator - ::ResizeNearest_TRT version 1
[TRT]    Registered plugin creator - ::Split version 1
[TRT]    Registered plugin creator - ::SpecialSlice_TRT version 1
[TRT]    Registered plugin creator - ::InstanceNormalization_TRT version 1
[TRT]    Registered plugin creator - ::InstanceNormalization_TRT version 2
[TRT]    Registered plugin creator - ::CoordConvAC version 1
[TRT]    Registered plugin creator - ::DecodeBbox3DPlugin version 1
[TRT]    Registered plugin creator - ::GenerateDetection_TRT version 1
[TRT]    Registered plugin creator - ::MultilevelCropAndResize_TRT version 1
[TRT]    Registered plugin creator - ::MultilevelProposeROI_TRT version 1
[TRT]    Registered plugin creator - ::NMSDynamic_TRT version 1
[TRT]    Registered plugin creator - ::PillarScatterPlugin version 1
[TRT]    Registered plugin creator - ::VoxelGeneratorPlugin version 1
[TRT]    Registered plugin creator - ::MultiscaleDeformableAttnPlugin_TRT version 1
[TRT]    detected model format - ONNX  (extension '.onnx')
[TRT]    desired precision specified for GPU: FASTEST
[TRT]    requested fasted precision for device GPU without providing valid calibrator, disabling INT8
[TRT]    [MemUsageChange] Init CUDA: CPU +182, GPU +0, now: CPU 227, GPU 4636 (MiB)
[TRT]    [MemUsageChange] Init builder kernel library: CPU +131, GPU +121, now: CPU 376, GPU 4777 (MiB)
[TRT]    native precisions detected for GPU:  FP32, FP16, INT8
[TRT]    selecting fastest native precision for GPU:  FP16
[TRT]    found engine cache file /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.1.1.8401.GPU.FP16.engine
[TRT]    found model checksum /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum
[TRT]    echo "$(cat /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum) /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx" | sha256sum --check --status
[TRT]    model matched checksum /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.sha256sum
[TRT]    loading network plan from engine cache... /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx.1.1.8401.GPU.FP16.engine
[TRT]    device GPU, loaded /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 287, GPU 4817 (MiB)
[TRT]    Loaded engine size: 40 MiB
[TRT]    Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
[TRT]    Using cublasLt as a tactic source
[TRT]    [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +260, GPU +145, now: CPU 549, GPU 4962 (MiB)
[TRT]    Deserialization required 1264179 microseconds.
[TRT]    [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +39, now: CPU 0, GPU 39 (MiB)
[TRT]    Using cublasLt as a tactic source
[TRT]    [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 549, GPU 4962 (MiB)
[TRT]    Total per-runner device persistent memory is 65536
[TRT]    Total per-runner host persistent memory is 71072
[TRT]    Allocated activation device memory of size 8028160
[TRT]    [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +8, now: CPU 0, GPU 47 (MiB)
[TRT]    The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
[TRT]    
[TRT]    CUDA engine context initialized on device GPU:
[TRT]       -- layers       36
[TRT]       -- maxBatchSize 1
[TRT]       -- deviceMemory 8028160
[TRT]       -- bindings     3
[TRT]       binding 0
                -- index   0
                -- name    'input'
                -- type    FP32
                -- in/out  INPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  3
                -- dim #2  224
                -- dim #3  224
[TRT]       binding 1
                -- index   1
                -- name    'cmap'
                -- type    FP32
                -- in/out  OUTPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  18
                -- dim #2  56
                -- dim #3  56
[TRT]       binding 2
                -- index   2
                -- name    'paf'
                -- type    FP32
                -- in/out  OUTPUT
                -- # dims  4
                -- dim #0  1
                -- dim #1  42
                -- dim #2  56
                -- dim #3  56
[TRT]    
[TRT]    binding to input 0 input  binding index:  0
[TRT]    binding to input 0 input  dims (b=1 c=3 h=224 w=224) size=602112
[TRT]    binding to output 0 cmap  binding index:  1
[TRT]    binding to output 0 cmap  dims (b=1 c=18 h=56 w=56) size=225792
[TRT]    binding to output 1 paf  binding index:  2
[TRT]    binding to output 1 paf  dims (b=1 c=42 h=56 w=56) size=526848
[TRT]    
[TRT]    device GPU, /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx initialized.
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstDecoder -- failed to retrieve next image buffer
posenet: failed to capture next frame
[gstreamer] gstBufferManager recieve caps:  video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
[gstreamer] gstBufferManager -- recieved first frame, codec=raw format=nv12 width=1280 height=720 size=1382400
RingBuffer -- allocated 4 buffers (1382400 bytes each, 5529600 bytes total)
RingBuffer -- allocated 4 buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2764800 bytes each, 11059200 bytes total)
posenet: detected 0 person(s)
[OpenGL] creating 1280x720 texture (GL_RGB8 format, 2764800 bytes)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer message qos ==> v4l2src0
RingBuffer -- allocated 2 buffers (1382400 bytes each, 2764800 bytes total)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-YV12.cu:257
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:128
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstEncoder.cpp:562
[gstreamer] gstEncoder::Render() -- unsupported image format (rgb8)
[gstreamer]                         supported formats are:
[gstreamer]                             * rgb8
[gstreamer]                             * rgba8
[gstreamer]                             * rgb32f
[gstreamer]                             * rgba32f

[TRT]    ------------------------------------------------
[TRT]    Timing Report /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    ------------------------------------------------
[TRT]    Pre-Process   CPU   0.04797ms  CUDA   0.22211ms
[TRT]    Network       CPU  37.83006ms  CUDA  34.99123ms
[TRT]    Post-Process  CPU   0.34833ms  CUDA   0.34765ms
[TRT]    Visualize     CPU   0.09383ms  CUDA   0.01021ms
[TRT]    Total         CPU  38.32018ms  CUDA  35.57120ms
[TRT]    ------------------------------------------------

[TRT]    note -- when processing a single image, run 'sudo jetson_clocks' before
                to disable DVFS for more accurate profiling/timing measurements

[gstreamer] gstreamer message qos ==> v4l2src0
posenet: detected 0 person(s)
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/display/glTexture.cpp:360
[gstreamer] gstreamer message qos ==> v4l2src0
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaYUV-YV12.cu:257
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/cuda/cudaColorspace.cpp:128
[cuda]      unknown error (error 999) (hex 0x3E7)
[cuda]      /home/mister/jetson-inference/utils/codec/gstEncoder.cpp:562
[gstreamer] gstEncoder::Render() -- unsupported image format (rgb8)
[gstreamer]                         supported formats are:
[gstreamer]                             * rgb8
[gstreamer]                             * rgba8
[gstreamer]                             * rgb32f
[gstreamer]                             * rgba32f

[TRT]    ------------------------------------------------
[TRT]    Timing Report /usr/local/bin/networks/Pose-ResNet18-Body/pose_resnet18_body.onnx
[TRT]    ------------------------------------------------
[TRT]    Pre-Process   CPU   0.03098ms  CUDA   0.12259ms
[TRT]    Network       CPU  25.67364ms  CUDA  24.27651ms
[TRT]    Post-Process  CPU   0.28417ms  CUDA   0.28349ms
[TRT]    Visualize     CPU   0.00589ms  CUDA   0.00602ms
[TRT]    Total         CPU  25.99467ms  CUDA  24.68861ms
[TRT]    ------------------------------------------------

Sorry for the delay @defalco - it looks like what’s happening is that it is still trying to create the OpenGL display (because X11 forwarding is enabled over SSH, or TeamViewer is being used, etc.), and a CUDA error occurs when it attempts OpenGL interop (which fails when the active OpenGL driver is not the NVIDIA driver). That CUDA error then propagates and prevents the encoder from properly outputting the RTP stream.
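
A quick hedged way to confirm this (the package name and exact vendor string below are assumptions for a standard JetPack/Ubuntu setup) is to run glxinfo from within the same remote session and check which OpenGL driver is active:

sudo apt-get install mesa-utils   # provides glxinfo if it isn't already installed
glxinfo | grep "OpenGL vendor"

If the NVIDIA driver is in use, this should report something like "OpenGL vendor string: NVIDIA Corporation"; a Mesa/llvmpipe software renderer here would explain the failing CUDA/OpenGL interop.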

Can you try running posenet with the --headless flag? It should then not attempt to create an OpenGL window, and you should no longer see messages like the ones below about creating the glDisplay, nor the CUDA errors (an example command follows the quoted messages):

[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- X window resolution:    1920x1080
[OpenGL] glDisplay -- display device initialized (1920x1080)
[video]  created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
  -- URI: display://0
     - protocol:  display
     - location:  0
  -- deviceType: display
  -- ioType:     output
  -- codec:      raw
  -- width:      1920
  -- height:     1080
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
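
For reference, a hedged example of such a headless invocation (the RTSP output path here is just illustrative - substitute whatever output sink you are actually using):

posenet --headless /dev/video0 rtsp://@:8554/my_output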

Hi,
Yes, I am not getting those errors anymore!
I still get the “Connection refused” errors…but I don’t think they are related to jetson-inference, right?

OK gotcha - no, they would be related to the network connection, or to the video client being unable to connect to the stream. It is normal for the stream to take a few seconds to negotiate/sync when connecting, but it should eventually start playback. My experience using VLC from Windows to receive RTP/RTSP is hit or miss (and by extension ffmpeg, I suppose). It tends to work reliably with GStreamer on the receiving end though (or with the various NVR recording tools that I’ve tried).
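
Two hedged checks you could try in the meantime (the port, stream path, and H.264 codec below are assumptions - adjust them to match your actual posenet command):

# on the Jetson - confirm something is listening on the RTSP port
ss -tlnp | grep 8554

# on the laptop - receive the stream with a GStreamer client instead of VLC/ffplay
gst-launch-1.0 rtspsrc location=rtsp://JETSON_IP:8554/my_output latency=200 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink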

If all you want to do is remotely view the stream, my go-to is WebRTC - that seems pretty solid using Chrome with chrome://flags#enable-webrtc-hide-local-ips-with-mdns set to disabled. You would just start posenet like posenet /dev/video0 webrtc://@:8554/output and then navigate your browser to http://JETSON_IP:8554 (or https if you generated SSL certs).
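
(And if the Jetson is still running without a display attached, this combines with the earlier suggestion, e.g. posenet --headless /dev/video0 webrtc://@:8554/output)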

Thank you for all your suggestions and your patience!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.