I’m attempting to marry jetson_multimedia_api/samples/13_multi_camera with GStreamer in order to provide a UDP video stream of the composite video from the multiple cameras connected to my Xavier NX. I am on JetPack 5.0.2 because Seeed does not provide drivers for the A205 carrier board for anything later than that. I am attempting to follow the example at this link: Creating a GStreamer source that publishes to NVMM.
- I have built and run both 13_multi_camera and appsrc_nvbufsurface successfully.
- I have modified 13_multi_camera such that it runs perpetually.
- I have added a “StreamThread” class to 13_multi_camera that successfully builds and runs a GStreamer pipeline, supplied as a command-line string, on a secondary thread while the original consumer and producer continue to composite and display captured frames from my cameras (a trimmed sketch of the class interface appears after the snippet below). I tested this by using the following pipeline string on the command line:
"videotestsrc ! video/x-raw,width=1920,height=1080 ! videoconvert ! autovideosink"
- The above displayed the video test source in a window alongside the rendered output from 13_multi_camera.
- I have passed a reference to the instantiated StreamThread object to the ConsumerThread object at instantiation.
- In ConsumerThread::threadExecute() I then pass the composited frame’s file descriptor, m_compositedFrame, to the given StreamThread object’s pushFrame() function in place of the render call:
CONSUMER_PRINT("Render frame %d\n", g_frame_count - m_framesRemaining);
if (m_streams.size() > 1)
{
    /* Composite multiple input to one frame */
    NvBufSurfTransformMultiInputBufCompositeBlend(batch_surf, m_pdstSurf, &m_compositeParam);
    //g_renderer->render(m_compositedFrame);
    m_streamer->pushFrame(m_compositedFrame);
}
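For reference, here is a trimmed sketch of the StreamThread interface. The member names are inferred from the snippets in this post rather than copied verbatim from my header, so treat it as approximate (Thread is the base class from the samples’ common classes):

    class StreamThread : public Thread
    {
    public:
        explicit StreamThread(const char *pipelineStr);

        // Called from ConsumerThread::threadExecute() with the composited
        // frame's dmabuf fd, in place of g_renderer->render().
        void pushFrame(int surfaceID);

    private:
        virtual bool threadExecute(); // builds and runs the pipeline on the secondary thread

        GstElement  *mPipeline;  // built with gst_parse_launch() from the command-line string
        GstElement  *mAppSrc;    // looked up in the pipeline by name ("appsrc")
        GstClockTime mTimestamp; // advanced ~33 ms per pushed frame
    };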
- StreamThread::pushFrame() is intended to take the image data and push it onto a GStreamer appsrc. I have tested the components/modifications that come before this point, and they all at least appear to work. However, the program fails to stream the surfaces via GStreamer the way I’d hoped it would:
inline void StreamThread::pushFrame(int surfaceID)
{
    printf("Pushing frame to gstreamer pipeline at time stamp: %ld\n", mTimestamp);

    NvBufSurface *surface;
    GstBuffer *buffer = nullptr;
    GstFlowReturn ret;
    GstMapInfo map = {0};
    gpointer data = NULL, user_data = NULL;
    GstMemoryFlags flags = (GstMemoryFlags)0;

    if (-1 == NvBufSurfaceFromFd(surfaceID, (void **)(&surface)))
    {
        g_printerr("Cannot get NvBufSurface from id.");
    }
    else
    {
        surface->numFilled = 1;
        NvBufSurfaceSyncForDevice(surface, 0, 0);
        NvBufSurfaceUnMap(surface, 0, 0);

        /* Stash the hardware buffer descriptor so notifyToDestroy can
           reference it when the GstBuffer is finalized. */
        user_data = g_malloc(sizeof(uint64_t));
        GST_INFO("NvBufSurfaceAllocate %lu", surface->surfaceList[0].bufferDesc);
        *(uint64_t *)user_data = surface->surfaceList[0].bufferDesc;

        /* Wrap a freshly allocated block the size of an NvBufSurface and
           copy the surface struct into it, so the appsrc buffer carries
           the NvBufSurface header rather than the pixel data itself. */
        data = g_malloc(sizeof(NvBufSurface));
        buffer = gst_buffer_new_wrapped_full(flags,
                                             data,
                                             sizeof(NvBufSurface),
                                             0,
                                             sizeof(NvBufSurface),
                                             user_data,
                                             notifyToDestroy);
        buffer->pts = mTimestamp;

        gst_buffer_map(buffer, &map, GST_MAP_WRITE);
        memcpy(map.data, surface, sizeof(NvBufSurface));
        gst_buffer_unmap(buffer, &map);

        g_signal_emit_by_name(mAppSrc, "push-buffer", buffer, &ret);
        //gst_buffer_unref(buffer);
    }

    mTimestamp += 33333333; /* ~33 ms per frame, i.e. 30 fps */
}
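notifyToDestroy is the GDestroyNotify handed to gst_buffer_new_wrapped_full() above. I haven’t shown it in full here; a minimal version consistent with the code above, which just logs and frees the g_malloc’d user_data block, would look like this:

    static void notifyToDestroy(gpointer user_data)
    {
        /* GStreamer calls this when the wrapped buffer is finalized; only
           user_data is handed back, so the g_malloc'd `data` block is not
           freed here. */
        GST_INFO("Destroying buffer, bufferDesc %lu", *(uint64_t *)user_data);
        g_free(user_data);
    }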
Additionally, you’ll notice I have commented out gst_buffer_unref(buffer): it was causing a segfault, presumably because the buffer was never created or written to properly in the first place.
I’ve been attempting to use the following pipeline string to build the GStreamer pipeline bin used to push the NvBufSurface into an appsrc:
"appsrc name=appsrc ! nvvideoconvert ! autovideosink"
Console output is as follows:
./multi_camera "appsrc name=appsrc ! nvvideoconvert ! autovideosink"
Set pipeline string to appsrc name=appsrc ! nvvideoconvert ! autovideosink
Gstreamer appsrc element found.
Starting stream...[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 1280 height 720
Argus Version: 0.98.3 (multi-process)
CONSUMER: Waiting until producer is connected...
CONSUMER: Producer has connected; continuing.
CONSUMER: Waiting until producer is connected...
CONSUMER: Producer has connected; continuing.
CONSUMER: Waiting until producer is connected...
CONSUMER: Producer has connected; continuing.
CONSUMER: Waiting until producer is connected...
CONSUMER: Producer has connected; continuing.
CONSUMER: Render frame 1
Pushing frame to gstreamer pipeline at time stamp: 0
NvMapMemCacheMaint Bad parameter
nvbusurface: NvBufSurfaceSyncForCpu: Error(4) in sync
CONSUMER: Render frame 2
Pushing frame to gstreamer pipeline at time stamp: 33333333
NvMapMemCacheMaint Bad parameter
nvbusurface: NvBufSurfaceSyncForCpu: Error(4) in sync
CONSUMER: Render frame 3
Pushing frame to gstreamer pipeline at time stamp: 66666666
NvMapMemCacheMaint Bad parameter
nvbusurface: NvBufSurfaceSyncForCpu: Error(4) in sync
CONSUMER: Render frame 4
Pushing frame to gstreamer pipeline at time stamp: 99999999
NvMapMemCacheMaint Bad parameter
nvbusurface: NvBufSurfaceSyncForCpu: Error(4) in sync
CONSUMER: Render frame 5
Pushing frame to gstreamer pipeline at time stamp: 133333332...
I am SURE I’m doing something wrong here, but I am admittedly a bit lost in the sauce with the NvBufSurface API. Any help is appreciated!
Thanks!