DeepStream 7.0 Service Maker RTSP Out

Please provide complete information as applicable to your setup.

• dGPU: NVIDIA A40
• DeepStream Version: 7.0
• NVIDIA GPU Driver Version: 555.42.02
• Issue Type: Question
• All of the Service Maker samples use nveglglessink for the sink. Is there an RTSP streaming sink output, or is the developer expected to implement one? In previous DeepStream versions, the deepstream-app made it easy to output via RTSP by simply changing the config file (see the snippet below).
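
For reference, RTSP output in the classic deepstream-app is just a sink group in the config file, along these lines (values are illustrative; type=4 selects RTSP streaming):

[sink0]
enable=1
#Type - 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
rtsp-port=8554
udp-port=5400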

I have the following test code, as we are evaluating a potential switch to Service Maker.

#include <iostream>
#include <string>

#include "deepstream_config.h"
#include "pipeline.hpp"
#include "RTSPServer.hpp"

#define MUXER_WIDTH 1920
#define MUXER_HEIGHT 1080
#define TILER_WIDTH 1280
#define TILER_HEIGHT 720
#define CONFIG_FILE_PATH                                                       \
  "/opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/"            \
  "xod-optical-deepstream-2/configs/config_infer_primary_yoloV8.yml"

using namespace deepstream;

int main(int argc, char *argv[]) {
  std::string sink = "udpsink";

#if defined(__aarch64__)
  sink = "nv3dsink";
#endif
  try {
    Pipeline pipeline("XOD");
    pipeline.add("nvurisrcbin", "src", "uri", "file:///opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/xod-optical-deepstream-2/20240610_154255_F1D6_B8A44F8E2E18.mkv")
        .add("nvstreammux", "mux", "batch-size", 1, "width",
             MUXER_WIDTH, "height", MUXER_HEIGHT)
        .add("nvinfer", "infer", "config-file-path", CONFIG_FILE_PATH,
             "batch-size", 1)
        .add("nvvideoconvert", "converter")
        .add("nvdsosd", "osd", "display-text", 0)
        .add("nvv4l2h264enc", "encoder", "copy-timestamp", 1, "idrinterval", 5, "iframeinterval", 8)
        .add("rtph264pay", "rtppay")
        .add(sink, "sink", "port", 5004, "host", "localhost")
        .link("mux", "infer", "converter", "osd", "encoder", "rtppay", "sink")
        .link({"src", "mux"}, {"", "sink_%u"});

        pipeline.attach("infer", "measure_fps_probe", "my-probe");

        RTSPServer rtsp_server;
        rtsp_server.addStream("/test1", 5004, "H264");
        rtsp_server.start();

        pipeline.start().wait();

  } catch (const std::exception &e) {
    std::cerr << e.what() << std::endl;
    return -1;
  }

  return 0;
}

I created a separate RTSPServer class that starts the GStreamer RTSP Server.
RTSPServer.hpp

#ifndef RTSP_SERVER_HPP
#define RTSP_SERVER_HPP

#include <gst/rtsp-server/rtsp-server.h>
#include <gst/gst.h>
#include <thread>
#include <atomic>
#include <vector>
#include <string>

class RTSPServer {
public:
    RTSPServer();
    ~RTSPServer();

    void start();
    void stop();
    void addStream(const std::string &mountPoint, int port, const std::string &encodingName);

private:
    std::thread server_thread;
    std::atomic<bool> running;
    GstRTSPServer *server;
    std::vector<GstRTSPMediaFactory*> factories;

    static void server_thread_func(RTSPServer *server);
};

#endif // RTSP_SERVER_HPP

RTSPServer.cpp

#include "RTSPServer.hpp"
#include <iostream>

RTSPServer::RTSPServer() : running(false), server(nullptr) {}

RTSPServer::~RTSPServer() {
    stop();
}

void RTSPServer::start() {
    running = true;
    server_thread = std::thread(&RTSPServer::server_thread_func, this);
}

void RTSPServer::stop() {
    running = false;
    if (server_thread.joinable()) {
        server_thread.join();
    }

    if (server) {
        g_object_unref(server);
        server = nullptr;
    }

    for (auto factory : factories) {
        if (factory) {
            g_object_unref(factory);
        }
    }
    factories.clear();
}

void RTSPServer::addStream(const std::string &mountPoint, int port, const std::string &encodingName) {
    // Lazily create the server on first use; it listens on TCP port 8554
    // and serves media to clients from the default GMainContext.
    if (!server) {
        server = gst_rtsp_server_new();
        gst_rtsp_server_set_service(server, "8554");
        gst_rtsp_server_attach(server, nullptr);
    }

    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();
    guint64 udp_buffer_size = 512 * 1024;
    std::string launchPipeline = "( udpsrc name=pay0 port=" + std::to_string(port) + " buffer-size=" + std::to_string(udp_buffer_size) +
                                 " caps=\"application/x-rtp, media=video, clock-rate=90000, encoding-name=" + encodingName + ", payload=96\" )";
    gst_rtsp_media_factory_set_launch(factory, launchPipeline.c_str());
    gst_rtsp_media_factory_set_shared(factory, TRUE);
    factories.push_back(factory);

    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
    gst_rtsp_mount_points_add_factory(mounts, mountPoint.c_str(), factory);
    g_object_unref(mounts);

    std::cout << "Added stream at rtsp://127.0.0.1:8554" << mountPoint << std::endl;
}

void RTSPServer::server_thread_func(RTSPServer *server) {
    // gst_init is a no-op after the first call, so this is safe even though
    // the Service Maker pipeline has already initialized GStreamer.
    gst_init(nullptr, nullptr);

    // Iterate the default main context so the attached RTSP server can
    // service client requests.
    while (server->running) {
        g_main_context_iteration(nullptr, false);
    }
}

However, when I attempt to play the video in VLC, I get the following GStreamer warnings/errors:

(Sample:18818): GLib-GObject-CRITICAL **: 16:16:06.832: g_object_get_is_valid_property: object class 'GstUDPSrc' has no property named 'pt'
**FPS:  20.00 (20.15)
**FPS:  20.00 (20.13)
**FPS:  20.00 (20.12)
**FPS:  20.00 (20.11)
0:00:59.127894677 18818 0x7fef70002440 WARN               rtspmedia rtsp-media.c:3594:wait_preroll: failed to preroll pipeline
0:00:59.127915499 18818 0x7fef70002440 WARN               rtspmedia rtsp-media.c:3964:gst_rtsp_media_prepare: failed to preroll pipeline
0:00:59.128350449 18818 0x7fef70002440 ERROR             rtspclient rtsp-client.c:1087:find_media: client 0x7fef1c6dca50: can't prepare media
0:00:59.128430991 18818 0x7fef70002440 ERROR             rtspclient rtsp-client.c:3376:handle_describe_request: client 0x7fef1c6dca50: no media

Any help would be greatly appreciated. Thanks

Thank you for your suggestion. We will confirm this requirement.
Also, could you try changing the port number from 5004 to 8554?

I have attempted several ports (including switching from 5004 to 8554), all with the same issue; it's as if the udpsink output isn't actually working as expected, so the RTSP server's internal udpsrc pipeline never receives any RTP data to preroll on.
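
As a sanity check (a sketch, assuming H264 RTP is actually being sent to port 5004), the stream should be receivable directly with GStreamer, bypassing the RTSP server entirely:

gst-launch-1.0 -v udpsrc port=5004 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

If nothing renders there either, the problem is upstream of the RTSP server.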

I ran the program with GST_DEBUG=3 and I am seeing the following output:

root@docker-desktop:/opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/xod-optical-deepstream-2/build# GST_DEBUG=3 ./Sample 
Initializing GStreamer Backend...!
Add Element ... src
Add Element ... mux
Add Element ... infer
Add Element ... converter
Add Element ... osd
Property copy-timestamp is not supported by object encoder
Add Element ... encoder
Add Element ... rtppay
Add Element ... sink
LINKING: mux -> infer
LINKING: infer -> converter
LINKING: converter -> osd
LINKING: osd -> encoder
LINKING: encoder -> rtppay
LINKING: rtppay -> sink
LINKING: Source: src Target: mux
0:00:00.514097173  5691 0x5570289ed000 ERROR            nvstreammux gstnvstreammux.cpp:1611:gst_nvstreammux_request_new_pad:<mux> Pad should be named 'sink_%u' when requesting a pad
Plugin measure_fps_probe initialized

The ERROR "Pad should be named ‘sink_%u’ seems like it could be the culprit. Any ideas on a fix for this?

You can try to use the pad name to link the “mux”, like .link({"src_0", "mux"}, {"", "sink_%u"});
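
That is, the final link call in the test code above would change from

.link({"src", "mux"}, {"", "sink_%u"});

to (untested, per the suggestion above):

.link({"src_0", "mux"}, {"", "sink_%u"});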

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.