GStreamer command for multi-camera capture with common clock synchronization

Could you please provide a command to capture video from two cameras simultaneously, along with the timestamps from both cameras, so that their images can be stitched and synchronized? Here are the commands currently used to capture each feed:

gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 timeout=10 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=29/1' ! nvvidconv ! video/x-raw,format=I420 ! x264enc bitrate=2000 speed-preset=ultrafast ! h264parse ! qtmux ! filesink location=t0.mp4

gst-launch-1.0 -e nvarguscamerasrc sensor-id=1 timeout=10 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=29/1' ! nvvidconv ! video/x-raw,format=I420 ! x264enc bitrate=2000 speed-preset=ultrafast ! h264parse ! qtmux ! filesink location=t1.mp4
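
For reference, the same two pipelines can also run in one gst-launch process, as sketched below (untested); both branches then share a single pipeline clock, which puts the two recordings on a common time base but does not by itself synchronize the sensors:

gst-launch-1.0 -e \
  nvarguscamerasrc sensor-id=0 timeout=10 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=29/1' ! nvvidconv ! video/x-raw,format=I420 ! x264enc bitrate=2000 speed-preset=ultrafast ! h264parse ! qtmux ! filesink location=t0.mp4 \
  nvarguscamerasrc sensor-id=1 timeout=10 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=29/1' ! nvvidconv ! video/x-raw,format=I420 ! x264enc bitrate=2000 speed-preset=ultrafast ! h264parse ! qtmux ! filesink location=t1.mp4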

I suppose you need a sensor-sync hardware design for this.

How to do that?

You can swap the cable to confirm whether it is a cable issue.

Sorry, I couldn't understand that.

Oops, I replied to the wrong topic.

I mean you need a hardware sync design for this case.

How can I get the timestamp from the camera?

Please check the topic below.

Please review the method I’m using to convert the monotonic clock to real time, as I am not seeing any offset present under the sys node in JetPack 6.1.

    IArgusCaptureMetadata *iArgusCaptureMetadata = interface_cast<IArgusCaptureMetadata>(frame);
    if (!iArgusCaptureMetadata)
        ORIGINATE_ERROR("Failed to get IArgusCaptureMetadata interface.");
    CaptureMetadata *metadata = iArgusCaptureMetadata->getMetadata();
    ICaptureMetadata *iMetadata = interface_cast<ICaptureMetadata>(metadata);
    if (!iMetadata)
        ORIGINATE_ERROR("Failed to get ICaptureMetadata interface.");
    // CONSUMER_PRINT("\tSensor Timestamp: %llu\n",
    //                static_cast<unsigned long long>(iMetadata->getSensorTimestamp()));

uint64_t monotonicRawStart = getMonotonicRawTime();

// Get the current CLOCK_REALTIME timestamp (sampled immediately after the
// monotonic sample above, to keep the skew between the two reads small)
uint64_t realTimeStartSec, realTimeStartNsec;
getRealTime(realTimeStartSec, realTimeStartNsec);

// Sensor timestamp from the Argus capture metadata (monotonic time base, ns)
uint64_t sensorTimestamp = iMetadata->getSensorTimestamp();

// Convert the sensor timestamp to real time
uint64_t realTime = convertToRealTime(sensorTimestamp, monotonicRawStart,
                                      realTimeStartSec, realTimeStartNsec);

// Convert real time to a human-readable format
time_t realTimeSec = static_cast<time_t>(realTime / 1000000000ULL); // ns -> s
struct tm *tm = localtime(&realTimeSec);
char buffer[80];
strftime(buffer, sizeof(buffer), "%Y-%m-%d %H:%M:%S", tm);

// Output the result
std::cout << "Capture id: " << iMetadata->getCaptureId()
          << " Sensor Timestamp (ns): " << sensorTimestamp << std::endl;
std::cout << "Converted Real-Time: " << buffer << std::endl;

#include <cstdint>
#include <ctime>
#include <iostream>

// Function to get the current CLOCK_MONOTONIC_RAW time in nanoseconds
uint64_t getMonotonicRawTime() {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC_RAW, &ts);
    return static_cast<uint64_t>(ts.tv_sec) * 1000000000ULL + ts.tv_nsec; // convert to nanoseconds
}

// Function to get the current CLOCK_REALTIME in seconds and nanoseconds
void getRealTime(uint64_t& sec, uint64_t& nsec) {
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    sec = ts.tv_sec;   // seconds
    nsec = ts.tv_nsec; // nanoseconds
}

// Convert a sensor timestamp (monotonic time base) to real time by adding
// the (CLOCK_REALTIME - CLOCK_MONOTONIC_RAW) offset sampled at startup
uint64_t convertToRealTime(uint64_t sensorTimestamp, uint64_t monotonicStart,
                           uint64_t realTimeStartSec, uint64_t realTimeStartNsec) {
    uint64_t realTimeStartNs = realTimeStartSec * 1000000000ULL + realTimeStartNsec;
    return sensorTimestamp + (realTimeStartNs - monotonicStart);
}
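
One caveat with this sampling: the monotonic and realtime clocks are read in two separate calls, so the computed offset includes whatever time elapses between them, and it also goes stale because CLOCK_MONOTONIC_RAW is not NTP-disciplined. A common mitigation, sketched below (not from the original post; it reuses getMonotonicRawTime() and the includes above), is to bracket the realtime read between two monotonic reads and take the midpoint:

// Estimate the (CLOCK_REALTIME - CLOCK_MONOTONIC_RAW) offset in nanoseconds.
// Bracketing the realtime read between two monotonic reads bounds the
// sampling error to (m1 - m0) / 2.
int64_t estimateRealtimeOffsetNs() {
    uint64_t m0 = getMonotonicRawTime();
    struct timespec rt;
    clock_gettime(CLOCK_REALTIME, &rt);
    uint64_t m1 = getMonotonicRawTime();
    uint64_t rtNs = static_cast<uint64_t>(rt.tv_sec) * 1000000000ULL + rt.tv_nsec;
    uint64_t mid  = m0 + (m1 - m0) / 2; // midpoint of the monotonic bracket
    return static_cast<int64_t>(rtNs - mid);
}

With this helper, the conversion is simply sensorTimestamp + estimateRealtimeOffsetNs(); re-sampling the offset periodically keeps long recordings from drifting.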

Check the topic below for the offset_ns on JetPack 6:

https://forums.developer.nvidia.com/t/how-to-get-the-clock-source-offset-ns-on-jetpack-6/300994/3
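
If the node from that topic is exposed on your release, a minimal sketch for reading it could look like the code below. The sysfs path is an assumption taken from the linked discussion and may be absent on JetPack 6.1, as observed above; when the read fails, fall back to sampling the two clocks as in the earlier snippet.

#include <cstdint>
#include <fstream>

// Hypothetical helper: read the clocksource offset (ns) from sysfs.
// The path below is an assumption based on the linked topic; the function
// returns false if the node is missing or unreadable.
bool readClocksourceOffsetNs(int64_t& offsetNs) {
    std::ifstream f("/sys/devices/system/clocksource/clocksource0/offset_ns");
    if (!(f >> offsetNs))
        return false;
    return true;
}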