Help with connecting Jetson to CARLA simulator with GStreamer

Hello friends,

I’m not very experienced with GStreamer, so I may be doing something really basic wrong.
I’m trying to send the camera images from a CARLA simulator to my Jetson TX2 to do some image segmentation.

On the PC side, I’m connecting to the CARLA simulator using a Python script that basically launches a pipeline with:

appsrc name=source is-live=true format=GST_FORMAT_TIME caps=video/x-raw,format=BGR,clock-rate=(int)90000,width=640,height=480,framerate=30/1 ! videoconvert ! videorate ! video/x-raw ! x264enc ! rtph264pay ! udpsink host=[Jetson's IP] port=18000 sync=false

Then I get the appsrc element from the pipeline by its name, and I use this method to write frames to it:

def new_frame(self, frame) -> None:
    data = frame.tobytes()  # tostring() is deprecated in recent numpy
    buffer = Gst.Buffer.new_allocate(None, len(data), None)
    buffer.fill(0, data)
    buffer.duration = self.__duration
    ret = self.__appsrc.emit('push-buffer', buffer)
    if ret != Gst.FlowReturn.OK:
        print(ret)

This method is called by CARLA’s API callback whenever a frame is available. The frame is a [640, 480, 3] numpy array.
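For reference, here is a minimal sketch of the two things worth double-checking in that callback: tostring() is deprecated in recent numpy (tobytes() is the replacement), and since the appsrc is in GST_FORMAT_TIME, each buffer should carry an explicit pts as well as a duration. The prepare_frame helper and the self.__count frame counter are hypothetical names, not part of the CARLA or GStreamer APIs:

```python
def prepare_frame(frame, frame_index, fps=30):
    """Flatten a frame to raw bytes and compute the pts/duration
    (in nanoseconds) that the Gst.Buffer should carry."""
    data = frame.tobytes()           # numpy's tostring() is deprecated
    duration = 1_000_000_000 // fps  # i.e. Gst.SECOND // fps
    pts = frame_index * duration
    return data, pts, duration

# Inside the callback, the buffer would then be stamped before pushing
# (assumes self.__appsrc and a running frame index self.__count are
# set up elsewhere, as in the snippet above):
#
#   data, pts, duration = prepare_frame(frame, self.__count)
#   buf = Gst.Buffer.new_wrapped(data)
#   buf.pts = pts
#   buf.duration = duration
#   self.__appsrc.emit('push-buffer', buf)
```

Untimestamped buffers are one common reason a stream that a desktop decoder tolerates gets rejected by a stricter pipeline.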

Well, on the PC, simulating a reader for this code, I was able to see the video output successfully, using the pipeline suggested in the videoSource source code:

gst-launch-1.0 -v udpsrc port=18000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink

On the Jetson:

  1. the same pipeline above reports an unsupported codec
  2. video-viewer reports received frames…
  3. the C++ code also won’t work; it reports invalid frames
For the C++ code, I’m instantiating a videoSource object through a pointer, and I’m using videoOptions to configure it:

options = new videoOptions();
options->ioType = videoOptions::INPUT;
options->zeroCopy = true;
options->resource = "rtp://[IP]:18000";
options->width = 640;
options->height = 480;
options->codec = videoOptions::CODEC_H264;
options->frameRate = 30.0;

Then I do a

input = videoSource::Create(options);

followed by an input->Open()

And I basically follow the example code to get frames

uchar3 *imgptr = NULL;
input->Capture(&imgptr, timeout);

I’ve also checked the stream for IsStreaming()

But when I try to grab them in the C++ code, I get: [gstreamer] gstDecoder -- end of stream (EOS) has been reached, stream has been closed

The same code runs smoothly for camera input and for video input from my dataset (for the camera I’m using gstCamera::Create(options); for video input I’m using the same videoSource::Create(options)).

What am I doing wrong? Any tips?

**edited to correct indentation/spelling

Hi @cristianoo, to confirm/debug that your Jetson is able to receive the RTP stream, can you first try running video-viewer --input-codec=h264 rtp://@:18000 ?

If that’s unable to view the stream, then I would try running this test pipeline on your PC to generate the stream:

gst-launch-1.0 -v videotestsrc ! video/x-raw,width=300,height=300,framerate=30/1 ! x264enc ! rtph264pay ! udpsink host=[JETSON-IP] port=18000

And if that doesn’t work, it would indicate some networking issue between your PC and Jetson, so check that the ifconfig tx/rx packet counts are increasing on both sides and that there isn’t a firewall etc. blocking the traffic.
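As a lighter-weight complement to watching the ifconfig counters, a plain-socket probe on the Jetson can confirm whether any UDP datagrams reach the port at all, with no GStreamer involved (stop video-viewer first so the port is free). This probe_udp helper is just an illustrative sketch, not part of any library:

```python
import socket

def probe_udp(port=18000, timeout=5.0):
    """Bind the RTP port and report whether any UDP datagram arrives
    within `timeout` seconds (run it while the PC side is streaming)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(timeout)
    s.bind(("", port))
    try:
        data, addr = s.recvfrom(2048)
        return "got a %d-byte datagram from %s" % (len(data), addr[0])
    except socket.timeout:
        return "no packets received; check firewall/routing"
    finally:
        s.close()
```

If this reports datagrams arriving but the decoder still fails, the problem is in the stream contents rather than the network.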

Hi !

Ok, so I’ve done the following tests:

  1. running gst-launch-1.0 -v videotestsrc ! video/x-raw,width=300,height=300,framerate=30/1 ! x264enc ! rtph264pay ! udpsink host=[JETSON-IP] port=18000 on the PC and video-viewer --input-codec=h264 on Jetson
    SUCCESS

  2. running gst-launch-1.0 -v videotestsrc ! video/x-raw,width=300,height=300,framerate=30/1 ! x264enc ! rtph264pay ! udpsink host=[JETSON-IP] port=18000 on the PC and my C++ jetson-inference code on the Jetson
    SUCCESS

  3. running my Python code on the PC and video-viewer --input-codec=h264 on the Jetson
    FAIL

[gstreamer] gstDecoder -- end of stream (EOS)
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec
Event_BlockError from 0BlockH264Dec : Error code - e3040
Blocking error event from 0BlockH264DecNVMEDIA: NvMMLiteNVMEDIADecDoWork: 1982: NVMEDIA Video Dec Unsupported Stream

The same thing that happened with my C++ code.

Well, I’m already happy with this, because now I know it’s my Python code that has the issue. Thank you for your help so far!

Do you guys have any example / experience on sending video frames as np.arrays through Gstreamer?

Hi @cristianoo, sorry for the delay - I personally don’t; however, you may find this post useful:

Thank you for your support.

While I was struggling with my code (which, for some reason, can stream to a PC but not to the Jetson), I’ve found a possible problem.

Please check:

On a PC (Pop!_OS 22.04, Ubuntu-based, with GStreamer installed), if I do a
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! queue ! video/x-raw, width=1920, height=1080 ! openh264enc ! rtph264pay ! udpsink host=[jetson's IP] port=5000

Then read it on Jetson Tx2 with:
video-viewer --input-codec=h264 rtp://[jetson's IP]:5000

works like a charm.

But if I change openh264enc to x264enc, video-viewer crashes with:

NVMEDIA Video Dec Unsupported Stream
[gstreamer] gstDecoder -- end of stream (EOS)

In my workstation I’ve tested the following:

nvcodec: nvh264enc: NVENC H.264 Video Encoder <— crashes video-viewer
openh264: openh264enc: OpenH264 video encoder <— works
x264: x264enc: x264 H.264 Encoder <— crashes video-viewer

Is there any difference? Shouldn’t all of them work, since it’s the same protocol?

Also, I’ve found that it works well with x265enc, but not with x264enc.

edit: I’ve managed to use VP8 to connect a CARLA simulator camera output to the Jetson using the example you provided, @dusty_nv. So cool!

Nevertheless, using H.264 encoding is really not possible, since the Jetson doesn’t recognize the stream.
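For anyone landing here later: a frequently suggested explanation for exactly this x264enc-vs-openh264enc asymmetry is that the Jetson’s hardware decoder joins the stream mid-flight and never receives the H.264 SPS/PPS headers, so it reports an unsupported stream. A commonly proposed variant of the sender pipeline (a sketch, not verified on a TX2) adds h264parse and asks rtph264pay to re-send the headers periodically via its config-interval property:

```python
# Candidate sender pipeline, assembled as a string for readability.
# config-interval=1 makes rtph264pay re-send SPS/PPS every second, so
# a receiver that joins mid-stream can still sync; key-int-max bounds
# the keyframe interval. Untested here on a TX2.
sender = " ! ".join([
    "videotestsrc is-live=true",
    "videoconvert",
    "video/x-raw, width=1920, height=1080",
    "x264enc tune=zerolatency key-int-max=30",
    "h264parse",
    "rtph264pay config-interval=1",
    "udpsink host=[jetson's IP] port=5000",
])
print("gst-launch-1.0 " + sender)
```

The same two elements (h264parse and rtph264pay config-interval=1) can be dropped into the appsrc pipeline from the first post in place of the bare rtph264pay.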

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.