Get frame system time

Hi,
My setup: JetPack 4.6.1 on an AGX, imx264 at 2448x2048.

How do I convert a frame timestamp to system time? The AGX has a GPS receiver with PPS.

I have read all the related topics, but it is still not clear to me.

This is where I am now:

// NOTE: each interface_cast can return nullptr; checks omitted here for brevity
EGLStream::IArgusCaptureMetadata * iArgusCaptureMetadata{ Argus::interface_cast< EGLStream::IArgusCaptureMetadata >( frame ) };
Argus::CaptureMetadata * metadata{ iArgusCaptureMetadata->getMetadata() };
Argus::Ext::ISensorTimestampTsc * iSensorTimestampTsc{ Argus::interface_cast< Argus::Ext::ISensorTimestampTsc >( metadata ) };

const auto ts{ iSensorTimestampTsc->getSensorSofTimestampTsc() - m_offsetNs };
const auto monotonicRawNow{ hc::utils::getMonotonicRawNow() }; // CLOCK_MONOTONIC_RAW via clock_gettime
const auto nanoNow{ hc::utils::nano() }; // wall-clock ( epoch ) ns via std::chrono
const auto tsNow{ nanoNow - ( monotonicRawNow - ts ) };

Is tsNow real wall-clock time?

Best regards, Viktor.

I suppose monotonicRawNow and nanoNow could have a time gap between them, since they are obtained by different calls.

Is this the right formula?

( monotonicRawNow - ts ) is always about 1 ms.

nanoNow is RTC time.

How do I obtain the frame's RTC time?
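
One way to get there ( a minimal sketch; readClockNs and estimateXNs are my names, not code from this thread ) is to bracket a single CLOCK_REALTIME read between two CLOCK_MONOTONIC_RAW reads, so the sampling gap between the two clocks is bounded by the bracket:

#include <ctime>
#include <cstdint>

static uint64_t readClockNs( clockid_t id )
{
    struct timespec ts;
    clock_gettime( id, &ts );
    return uint64_t( ts.tv_sec ) * 1000000000ULL + uint64_t( ts.tv_nsec );
}

// Estimate ( CLOCK_REALTIME - CLOCK_MONOTONIC_RAW ) in ns; the error is
// bounded by ( raw1 - raw0 ), typically a few microseconds.
uint64_t estimateXNs()
{
    const uint64_t raw0 = readClockNs( CLOCK_MONOTONIC_RAW );
    const uint64_t real = readClockNs( CLOCK_REALTIME );
    const uint64_t raw1 = readClockNs( CLOCK_MONOTONIC_RAW );
    return real - ( raw0 + ( raw1 - raw0 ) / 2 );
}

Following the thread's assumption that ts ( SoF TSC minus offset_ns ) is in the MONOTONIC_RAW domain, the frame's wall-clock time would then be ts + estimateXNs().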

Here are my functions:

#include <cstdint>
#include <ctime>
#include <fstream>
#include <sstream>
#include <string>

// Reads the TSC offset ( in ns ) exported by the clocksource driver;
// this hits sysfs, so it is read once and cached as m_offsetNs.
uint64_t getOffsetNs()
{
    std::ifstream buf{ "/sys/devices/system/clocksource/clocksource0/offset_ns" };
    std::stringstream strStream;
    strStream << buf.rdbuf();
    return std::stoull( strStream.str() );
}

uint64_t getMonotonicRawNow()
{
    // Read CLOCK_MONOTONIC_RAW via clock_gettime ( not std::steady_clock )
    struct timespec tss;
    clock_gettime( CLOCK_MONOTONIC_RAW, &tss );
    return uint64_t( tss.tv_sec ) * 1000000000ULL + uint64_t( tss.tv_nsec );
}
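
For completeness, hc::utils::nano() is not shown in the thread; a plausible chrono-based equivalent ( an assumption on my part, based on the std::chrono comment in the snippet above ) would be:

#include <chrono>
#include <cstdint>

uint64_t nano()
{
    // Wall-clock ( epoch ) time in nanoseconds from std::chrono::system_clock
    return std::chrono::duration_cast< std::chrono::nanoseconds >(
        std::chrono::system_clock::now().time_since_epoch() ).count();
}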

Please help.

Using timespec64 and ns_to_timespec64 would be good.

Yes, of course; on a 64-bit arch, timespec already uses 64-bit fields.

So the question is about:

epoch_ns = X + StartOfFrame_ns

How do I find X?
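
If X is the ( CLOCK_REALTIME - CLOCK_MONOTONIC_RAW ) offset, the estimateXNs() sketch above answers this directly ( again my naming, not thread code ):

const uint64_t X{ estimateXNs() };             // latch once, or refresh periodically
const uint64_t epoch_ns{ X + startOfFrameNs }; // startOfFrameNs in the MONOTONIC_RAW domain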

What’s the epoch_ns time?

System time since 1970, in nanoseconds.

I don’t think there’s a way to get it precisely.

What is the best precision ( gap ) I can achieve, and which way?

Please advise me; I will try.
Maybe I should store some time when capture starts, or boot time, or some other starting point?

Here is my approach:

  1. Store monotonic_raw ( startTs ) and wall-clock now ( startTime ) when the first frame is acquired, as in the code below.
  2. time = startTime + ( ts - startTs )

Am I right?
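
One caveat, my note rather than anything from the thread: CLOCK_REALTIME is disciplined ( PPS/NTP ) while CLOCK_MONOTONIC_RAW is not, so a startTime/startTs pair latched only at the first frame slowly drifts. A sketch of periodic re-latching, where relatchPeriodNs and lastLatchNs are hypothetical names:

if( monotonicRawNow - lastLatchNs > relatchPeriodNs )
{
    startTs.at( cameraId ) = monotonicRawNow; // raw-clock anchor
    startTime.at( cameraId ) = nanoNow;       // wall-clock anchor
    lastLatchNs = monotonicRawNow;
}

The trade-off is that re-latching can introduce small jumps between consecutive frame timestamps.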

#pragma omp parallel for schedule(static) num_threads(6)
for( int cameraId = 0; cameraId < m_camerasNum; ++cameraId )
{
    Argus::UniqueObj< EGLStream::Frame > frame{ m_iFrameConsumers.at( cameraId )->acquireFrame() };

    EGLStream::IArgusCaptureMetadata * iArgusCaptureMetadata{ Argus::interface_cast< EGLStream::IArgusCaptureMetadata >( frame ) };
    Argus::CaptureMetadata * metadata{ iArgusCaptureMetadata->getMetadata() };
    Argus::Ext::ISensorTimestampTsc * iSensorTimestampTsc{ Argus::interface_cast< Argus::Ext::ISensorTimestampTsc >( metadata ) };

    EGLStream::IFrame * iFrame{ Argus::interface_cast< EGLStream::IFrame >( frame ) };
    if( not iFrame )
    {
        std::cerr << "Failed to acquire frame! " << cameraId << std::endl;
    } else
    {

        const auto ts{ iSensorTimestampTsc->getSensorSofTimestampTsc() - m_offsetNs };
        const auto monotonicRawNow{ hc::utils::getMonotonicRawNow() };
        const auto nanoNow{ hc::utils::nano() };
        const auto diff{ monotonicRawNow - ts };
        if( startTs.at( cameraId ) == 0 and startTime.at( cameraId ) == 0 )
        {
            startTs.at( cameraId ) = monotonicRawNow;
            startTime.at( cameraId ) = nanoNow;
        }

        const auto time{ startTime.at( cameraId ) + ( ts - startTs.at( cameraId ) ) };

        const auto timeStr{
            "Timestamp: " +
            std::to_string( cameraId ) + " " +
            std::to_string( iFrame->getNumber() ) + " " +
            std::to_string( ts / 1000000 ) + " " +
            std::to_string( time / 1000000 ) + " " +
            std::to_string( diff / 1000000 ) + " " +
            std::to_string( nanoNow / 1000000 )
        };

        // record
        if( m_doCapture and iFrame->getNumber() > m_skipFrames )
        {
            std::cout << timeStr << std::endl;

            EGLStream::NV::IImageNativeBuffer * iNativeBuffer{ Argus::interface_cast< EGLStream::NV::IImageNativeBuffer >( iFrame->getImage()) };
            if( not iNativeBuffer )
                std::cerr << "IImageNativeBuffer not supported by Image!" << std::endl;
            else
            {
                if( m_dmaBuffers.at( cameraId ) == - 1 )
                     m_dmaBuffers.at( cameraId ) = iNativeBuffer->createNvBuffer( m_streamSize, NvBufferColorFormat_YUV420, NvBufferLayout_Pitch );

                if( m_dmaBuffers.at( cameraId ) != - 1 )
                {

                    if( iNativeBuffer->copyToNvBuffer( m_dmaBuffers.at( cameraId ) ) != Argus::STATUS_OK )
                    {
                        std::cerr << "Failed to copy frame to NvBuffer!" << std::endl;
                    } else
                    {

                        const auto filename{ m_recordPath + "/" + std::to_string( cameraId ) + "/" + std::to_string( iFrame->getNumber() ) + ".jpg" };
                        std::ofstream outputFile{ filename };
                        if( outputFile.is_open() )
                        {
                            uint8_t * buffer{ m_OutputBuffers.at( cameraId ) };
                            unsigned long size{ bufferSize };
                            if( m_JpegEncoders.at( cameraId )->encodeFromFd( m_dmaBuffers.at( cameraId ), JCS_YCbCr, & buffer, size ) == 0 )
                            {
                                outputFile.write( ( char * ) buffer, size );
                                dataFile << timeStr << std::endl;
                            } else
                                std::cout << "Failed to encode jpeg!" << std::endl;
                        }
                    }
                } else
                    std::cerr << "Failed to create NvBuffer!" << std::endl;
            }
        }
     }
}

This is six hardware-synchronized imx264 sensors at 24 fps, with frames saved to SSD, where every frame gets a GPS time ( a GPS receiver with PPS is connected to the AGX ).

Here are the results in milliseconds ( columns: cameraId, frame number, ts, time, diff, nanoNow ):

Timestamp: 4 224 21475137 1697055720376 47 1697055720424
Timestamp: 5 203 21475137 1697055720376 51 1697055720428
Timestamp: 0 224 21475137 1697055720376 55 1697055720432
Timestamp: 1 223 21475097 1697055720335 80 1697055720416
Timestamp: 2 224 21475137 1697055720376 68 1697055720445
Timestamp: 1 224 21475137 1697055720376 86 1697055720463
Timestamp: 3 224 21475137 1697055720376 87 1697055720464
Timestamp: 0 225 21475178 1697055720417 54 1697055720471
Timestamp: 5 204 21475178 1697055720417 58 1697055720475
Timestamp: 2 225 21475178 1697055720417 66 1697055720483
Timestamp: 4 225 21475178 1697055720417 70 1697055720487
Timestamp: 3 225 21475178 1697055720417 84 1697055720501
Timestamp: 1 225 21475178 1697055720417 84 1697055720502
Timestamp: 0 226 21475219 1697055720457 47 1697055720506
Timestamp: 5 205 21475219 1697055720457 51 1697055720510
Timestamp: 2 226 21475219 1697055720457 59 1697055720517
Timestamp: 4 226 21475219 1697055720457 70 1697055720528
Timestamp: 1 226 21475219 1697055720457 83 1697055720541
Timestamp: 3 226 21475219 1697055720457 83 1697055720542
Timestamp: 0 227 21475260 1697055720498 51 1697055720550
Timestamp: 5 206 21475260 1697055720498 59 1697055720558
Timestamp: 4 227 21475260 1697055720498 54 1697055720553
Timestamp: 2 227 21475260 1697055720498 70 1697055720569
Timestamp: 3 227 21475260 1697055720498 86 1697055720585
Timestamp: 1 227 21475260 1697055720498 86 1697055720585
Timestamp: 4 228 21475300 1697055720539 52 1697055720592
Timestamp: 2 228 21475300 1697055720539 63 1697055720603

Do I need to subtract offsetNs from the sensor timestamp in JP4.6.1?
Or can I use iMetadata->getSensorTimestamp() instead of ( SoF - offsetNs )?

Both of them need offsetNs subtracted to convert them to host system time,
because those timestamps are in the RTCPU system's time.
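
In code form, that reply implies both stamps convert the same way ( a sketch of the claim, which the measurements further down call into question for getSensorTimestamp() ):

const auto hostSofNs{ iSensorTimestampTsc->getSensorSofTimestampTsc() - m_offsetNs };
const auto hostExposureNs{ iMetadata->getSensorTimestamp() - m_offsetNs };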

Thank you.

Here I made more calculations.

  1. iMetadata->getSensorTimestamp() does not need the offset subtracted; it already includes it.
Timestamp: 0 147 ts:1088121 ts2:1078565 ts3:1088169 of:9603 
Timestamp: 5 126 ts:1088121 ts2:1078565 ts3:1088169 of:9603 
Timestamp: 1 147 ts:1088121 ts2:1078565 ts3:1088169 of:9603 
Timestamp: 3 147 ts:1088121 ts2:1078565 ts3:1088169 of:9603 
Timestamp: 2 148 ts:1088161 ts2:1078606 ts3:1088209 of:9603 
Timestamp: 4 148 ts:1088161 ts2:1078606 ts3:1088209 of:9603 
Timestamp: 0 148 ts:1088161 ts2:1078606 ts3:1088209 of:9603 

ts = getSensorSofTimestampTsc() - m_offsetNs
ts2 = getSensorTimestamp() - m_offsetNs
ts3 = getSensorTimestamp()
of = offset_ns ( m_offsetNs )
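
For reference, a sketch of the logging that would produce those columns, assuming the values are printed in milliseconds as in the earlier log:

const auto tsc{ iSensorTimestampTsc->getSensorSofTimestampTsc() };
const auto sensorTs{ iMetadata->getSensorTimestamp() };
std::cout << "ts:"   << ( tsc - m_offsetNs ) / 1000000
          << " ts2:" << ( sensorTs - m_offsetNs ) / 1000000
          << " ts3:" << sensorTs / 1000000
          << " of:"  << m_offsetNs / 1000000 << std::endl;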

Please comment on this.

  1. So getSensorSofTimestampTsc() - getSensorTimestamp() has an offset?
    Sometimes the values are exactly the same as in the picture below.

They are both start-of-frame times, are they not?

Please help.

I think getSensorSofTimestampTsc() is the same as getSensorTimestamp().

But they are never equal. Why? Did you read my calculations?

EGLStream::IArgusCaptureMetadata * iArgusCaptureMetadata{ Argus::interface_cast< EGLStream::IArgusCaptureMetadata >( frame ) };
Argus::CaptureMetadata * metadata{ iArgusCaptureMetadata->getMetadata() };
Argus::Ext::ISensorTimestampTsc * iSensorTimestampTsc{ Argus::interface_cast< Argus::Ext::ISensorTimestampTsc >( metadata ) };
const Argus::ICaptureMetadata * iMetadata{ Argus::interface_cast< const Argus::ICaptureMetadata >( metadata ) };
const auto ts{ iSensorTimestampTsc->getSensorSofTimestampTsc() - offsetNs };
const auto ts3{ iMetadata->getSensorTimestamp() };

|ts - ts3| ~ 50-500 ms

Is this code right?

Can you provide data which shows what you mean?

As I see it in JP4.6.1:

  1. getSensorTimestamp() does not need offset_ns subtracted
  2. getSensorSofTimestampTsc() and getSensorTimestamp() differ a lot

Maybe it depends on the hardware?
Maybe it depends on the mode, Argus::EGL_STREAM_MODE_MAILBOX vs Argus::EGL_STREAM_MODE_FIFO?

Can you point me to any post or topic where this issue was investigated and confirmed?

Maybe NVIDIA has a real developer to help customers?

ShaneCCC, are you a developer?