Jetson TX2 RTSP Streaming, FFmpeg or Gstreamer?

You may use a cv::VideoWriter with a GStreamer pipeline that produces the H.264 stream and sends it to shmsink, so that test-launch can read it back through shmsrc:

#include <iostream>

#include <opencv2/opencv.hpp>
#include <opencv2/videoio.hpp>

int main ()
{
  //setenv ("GST_DEBUG", "*:3", 0);

  /* Setup capture with gstreamer pipeline from onboard camera converting into BGR frames for app */
  const char *gst_cap =
    "nvarguscamerasrc  ! video/x-raw(memory:NVMM), format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)30/1 ! "
    "nvvidconv    ! video/x-raw,              format=(string)BGRx ! "
    "videoconvert ! video/x-raw,              format=(string)BGR  ! "
    "appsink";

  cv::VideoCapture cap (gst_cap, cv::CAP_GSTREAMER);
  if (!cap.isOpened ()) {
    std::cout << "Failed to open camera." << std::endl;
    return (-1);
  }
  unsigned int width = cap.get (cv::CAP_PROP_FRAME_WIDTH);
  unsigned int height = cap.get (cv::CAP_PROP_FRAME_HEIGHT);
  unsigned int fps = cap.get (cv::CAP_PROP_FPS);
  unsigned int pixels = width * height;
  std::cout << " Frame size : " << width << " x " << height << ", " << pixels <<
    " Pixels " << fps << " FPS" << std::endl;

  /* Setup writer: the GStreamer pipeline does the H264 encoding, so fourcc is 0 (raw frames in) */
  cv::VideoWriter h264_shmsink
     ("appsrc is-live=true ! queue ! videoconvert ! video/x-raw, format=RGBA ! nvvidconv ! "
      "omxh264enc insert-vui=true ! video/x-h264, stream-format=byte-stream ! h264parse ! shmsink socket-path=/tmp/my_h264_sock ",
     cv::CAP_GSTREAMER, 0, fps, cv::Size (width, height));
  if (!h264_shmsink.isOpened ()) {
    std::cout << "Failed to open h264_shmsink writer." << std::endl;
    return (-2);
  }

  /* Loop for 3000 frames (100s at 30 fps) */
  cv::Mat frame_in;
  int frameCount = 0;
  while (frameCount++ < 3000) {
    if (!cap.read (frame_in)) {
      std::cout << "Capture read error" << std::endl;
      break;
    }
    else {
      h264_shmsink.write (frame_in);
    }
  }

  h264_shmsink.release();
  cap.release ();

  return 0;
}

Build and launch it. It will write the H.264 byte stream into the socket /tmp/my_h264_sock.
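If your OpenCV install provides a pkg-config file, building may look like this (the source file name cam2shm.cpp and the package name opencv4 are assumptions; adjust to your setup):

```shell
# Hypothetical build and run; adjust file and package names to your install
g++ -o cam2shm cam2shm.cpp $(pkg-config --cflags --libs opencv4)
./cam2shm
```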

Now launch your RTSP server with:

./test-launch "shmsrc socket-path=/tmp/my_h264_sock ! video/x-h264, stream-format=byte-stream, width=640, height=480, framerate=30/1 ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay pt=96 name=pay0 "

Just checked from localhost with:

gst-launch-1.0 -ev rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, media=video, encoding-name=H264 ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! videoconvert ! xvimagesink

…seems to work fine so far, at least in this case.

Got surprising results with the nvv4l2 encoder/decoder, so I propose using the OMX plugins for this case for now.
For using VLC on the receiver side, some extra work may be required (EDIT: mainly, set the property do-timestamp=1 on shmsrc in the test-launch pipeline).
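Applied to the pipeline above, that would give (a sketch only; I have not verified this exact variant):

```shell
# Same test-launch pipeline as above, with do-timestamp=1 added on shmsrc for VLC clients
./test-launch "shmsrc socket-path=/tmp/my_h264_sock do-timestamp=1 ! video/x-h264, stream-format=byte-stream, width=640, height=480, framerate=30/1 ! h264parse ! video/x-h264, stream-format=byte-stream ! rtph264pay pt=96 name=pay0 "
```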

Note that the sockets are created by shmsink, but they won't be deleted if a shmsrc is still connected when it closes.
So after each trial, once your app and test-launch have been closed, remove any remaining socket with:

rm /tmp/my_h264_sock*

before trying again.
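A small helper, just a sketch wrapping the rm above with a guard for the no-socket case, can make this cleanup less error-prone:

```shell
#!/bin/sh
# Remove any stale shmsink sockets left over from a previous run.
# The /tmp/my_h264_sock prefix matches the socket-path used above.
for s in /tmp/my_h264_sock*; do
  # The glob stays literal when nothing matches, so check existence first
  if [ -e "$s" ]; then
    rm -f "$s"
  fi
done
```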