Hello friends,
I’m not very experienced with GStreamer and I may be doing something really newbie.
I’m trying to send camera images from a CARLA simulator to my Jetson TX2 to do some image segmentation.
On the PC side, I’m connecting to the CARLA simulator with a Python script that basically launches this pipeline:
appsrc name=source is-live=true format=GST_FORMAT_TIME
caps=video/x-raw,format=BGR,clock-rate=(int)90000,width=640,height=480,framerate=30/1
! videoconvert ! videorate ! video/x-raw ! x264enc ! rtph264pay ! udpsink host=[Jetson's IP] port=18000 sync=false
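For clarity, this is roughly how I assemble that launch string in Python before handing it to GStreamer (a minimal sketch; the IP and port values here are placeholders):

```python
# Sketch of how the sender pipeline description is assembled.
# In the real script this string is passed to Gst.parse_launch().
JETSON_IP = "192.168.1.42"  # placeholder for the Jetson's actual IP
PORT = 18000

desc = (
    "appsrc name=source is-live=true format=GST_FORMAT_TIME "
    "caps=video/x-raw,format=BGR,clock-rate=(int)90000,"
    "width=640,height=480,framerate=30/1 "
    "! videoconvert ! videorate ! video/x-raw ! x264enc "
    f"! rtph264pay ! udpsink host={JETSON_IP} port={PORT} sync=false"
)

# With PyGObject this would then be:
#   pipeline = Gst.parse_launch(desc)
#   appsrc = pipeline.get_by_name("source")
print(desc)
```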
Then, I get the appsrc pipeline element from its name and I use this to write frames to it:
def new_frame(self, frame) -> None:
    data = frame.tostring()
    buffer = Gst.Buffer.new_allocate(None, len(data), None)
    buffer.fill(0, data)
    buffer.duration = self.__duration
    ret = self.__appsrc.emit('push-buffer', buffer)
    if ret != Gst.FlowReturn.OK:
        print(ret)
This method is called by the CARLA API callback whenever a frame is available. The frame is a numpy [640, 480, 3] array.
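A standalone sketch of that conversion step, with the sizes implied by the caps above (note that numpy’s `tostring()` is deprecated in favor of `tobytes()`; for 640x480 BGR the buffer must be exactly width × height × 3 bytes either way):

```python
import numpy as np

WIDTH, HEIGHT, FPS = 640, 480, 30

# A dummy 640x480 BGR frame (rows x cols x channels).
frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)

# tobytes() is the non-deprecated spelling of tostring().
data = frame.tobytes()

# The raw caps (BGR, 640x480) imply width*height*3 bytes per buffer.
assert len(data) == WIDTH * HEIGHT * 3  # 921600 bytes

# One-frame duration in nanoseconds, matching framerate=30/1;
# this is the value assigned to buffer.duration in the callback.
duration_ns = 10**9 // FPS
print(len(data), duration_ns)
```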
Well, on the PC, simulating a receiver for this stream, I was able to see the video output successfully using the pipeline suggested in the videoSource source code:
gst-launch-1.0 -v udpsrc port=18000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
On the Jetson:
the same pipeline above reports an unsupported codec
video-viewer reports received frames…
the C++ code also won't work; it reports invalid frames
For the C++ code, I’m creating a videoSource object through a pointer and using videoOptions to configure it:
options = new videoOptions();
options->ioType = videoOptions::INPUT;
options->zeroCopy = true;
options->resource = "rtp://[IP]:18000";
options->width = 640;
options->height = 480;
options->codec = videoOptions::CODEC_H264;
options->frameRate = 30.0;
Then I do
input = videoSource::Create(options);
followed by input->Open().
And I basically follow the example code to get frames
uchar3 *imgptr = NULL;
input->Capture(&imgptr, timeout);
I’ve also checked the stream with IsStreaming().
But when I try to capture frames in the C++ code, I get: [gstreamer] gstDecoder -- end of stream (EOS) has been reached, stream has been closed
The same code runs smoothly for camera input and for video input from my dataset (for the camera I’m using gstCamera::Create(options); for video input I’m using the same videoSource::Create(options)).
What am I doing wrong? Any tips?
**edited to correct indentation/spelling