NvMedia IEP not encoding after prefence signaled

Please provide the following info (tick the boxes after creating this topic):
Software Version
DRIVE OS 6.0.10.0
DRIVE OS 6.0.8.1
DRIVE OS 6.0.6
DRIVE OS 6.0.5
DRIVE OS 6.0.4 (rev. 1)
DRIVE OS 6.0.4 SDK
other

Target Operating System
Linux
QNX
other

Hardware Platform
DRIVE AGX Orin Developer Kit (940-63710-0010-300)
DRIVE AGX Orin Developer Kit (940-63710-0010-200)
DRIVE AGX Orin Developer Kit (940-63710-0010-100)
DRIVE AGX Orin Developer Kit (940-63710-0010-D00)
DRIVE AGX Orin Developer Kit (940-63710-0010-C00)
DRIVE AGX Orin Developer Kit (not sure its number)
other

SDK Manager Version
2.1.0
other

Host Machine Version
native Ubuntu Linux 20.04 Host installed with SDK Manager
native Ubuntu Linux 20.04 Host installed with DRIVE OS Docker Containers
native Ubuntu Linux 18.04 Host installed with DRIVE OS Docker Containers
other

Issue Description
I am writing unit tests around my usage of the IEP engine and getting strange errors when using sync fences. I'm not sure whether my understanding or my usage is wrong.

Error String

I see the following errors printed on Orin:
tegraH265EncodeGetStatus: SyncPoint wait Error
tegraH265EncodeGetStatus: SyncPoint wait Error

Logs

Here is a sample code snippet, slightly modified to remove company utilities. Essentially, I'm trying to test that the encoder does not process the image until the pre-fence is signaled, and that NvMediaIEPBitsAvailable times out until that happens.

// Register both sync objects with the encoder before feeding frames.
NvMediaIEPRegisterNvSciSyncObj(encoder_context_, NVMEDIA_PRESYNCOBJ, *pre_sync_obj_);
NvMediaIEPRegisterNvSciSyncObj(encoder_context_, NVMEDIA_EOFSYNCOBJ, *eof_sync_obj_);
// The pre-fence is used to signal the encoder to start.
NvSciSyncFence pre_fence = NvSciSyncFenceInitializer;
// The EOF fence is used to signal the consumer that the encoder has finished.
NvSciSyncFence eof_fence = NvSciSyncFenceInitializer;
EXPECT_EQ(NvSciSyncObjGenerateFence(pre_sync_obj_, &pre_fence),
          NvSciError_Success);
NvMediaIEPInsertPreNvSciSyncFence(encoder_context_, &pre_fence);
NvMediaIEPFeedFrame(
    encoder_context_, buf_obj, &encoder_pic_params_, encoder_instance_id_);
NvMediaIEPGetEOFNvSciSyncFence(encoder_context_, *eof_sync_obj_, fence);

// Line 1: the pre-fence has not been signaled yet, so this should time out.
EXPECT_EQ(NvMediaIEPBitsAvailable(
              encoder_context_, &num_bytes_available,
              NVMEDIA_ENCODE_BLOCKING_TYPE_IF_PENDING, 1000),
          NVMEDIA_STATUS_TIMED_OUT);
// Line 2: this signals encoding to start.
EXPECT_EQ(NvSciSyncObjSignal(pre_sync_obj_), NvSciError_Success);
// Line 3: wait for the encoder to finish.
EXPECT_EQ(NvMediaIEPBitsAvailable(
              encoder_context_, &num_bytes_available,
              NVMEDIA_ENCODE_BLOCKING_TYPE_IF_PENDING, 1000),
          NVMEDIA_STATUS_OK);
std::vector<uint8_t> buffer(num_bytes_available);
EXPECT_OK(encoder_->GetEncodedData(buffer));

The previous code snippet fails, and if I set the NvMediaIEPBitsAvailable timeout on Line 3 to infinite, it waits forever, as if the pre-fence was never signaled.

If I just comment out Line 1, then it works perfectly (see the variant below), as if I'm not allowed to query whether the data is ready before waiting.
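
For clarity, the variant that works is identical except that the Line 1 query is removed; everything up to and including NvMediaIEPGetEOFNvSciSyncFence is unchanged:

// ... same registration, fence setup, NvMediaIEPFeedFrame, and
// NvMediaIEPGetEOFNvSciSyncFence calls as in the snippet above ...

// Line 2: signal the pre-fence immediately, without querying NvMediaIEPBitsAvailable first.
EXPECT_EQ(NvSciSyncObjSignal(pre_sync_obj_), NvSciError_Success);
// Line 3: this now returns the encoded size as expected.
EXPECT_EQ(NvMediaIEPBitsAvailable(
              encoder_context_, &num_bytes_available,
              NVMEDIA_ENCODE_BLOCKING_TYPE_IF_PENDING, 1000),
          NVMEDIA_STATUS_OK);
std::vector<uint8_t> buffer(num_bytes_available);
EXPECT_OK(encoder_->GetEncodedData(buffer));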

I've also considered that maybe I'm not allowed to query before the data is signaled as ready, so I added some waits on the EOF fence, but that doesn't work either.

// The EOF fence should not be signaled yet, so this wait should time out.
EXPECT_EQ(NvSciSyncFenceWait(&eof_fence, cpu_wait_context_, 1000 * 1000),
          NvSciError_Timeout);
// Line 1: the pre-fence has not been signaled yet, so this should time out.
EXPECT_EQ(NvMediaIEPBitsAvailable(
              encoder_context_, &num_bytes_available,
              NVMEDIA_ENCODE_BLOCKING_TYPE_IF_PENDING, 1000),
          NVMEDIA_STATUS_TIMED_OUT);
// Line 2: this signals encoding to start.
EXPECT_EQ(NvSciSyncObjSignal(pre_sync_obj_), NvSciError_Success);
// Wait for the encoder to finish.
EXPECT_EQ(NvSciSyncFenceWait(&eof_fence, cpu_wait_context_, 5000 * 1000),
          NvSciError_Success);
// Line 3: now the encoded bits should be available.
EXPECT_EQ(NvMediaIEPBitsAvailable(
              encoder_context_, &num_bytes_available,
              NVMEDIA_ENCODE_BLOCKING_TYPE_IF_PENDING, 1000),
          NVMEDIA_STATUS_OK);
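
In case the setup matters, this is roughly how the sync objects and the CPU wait context are created before the snippets above run. It is simplified from the real code, so sci_sync_module_ and the attribute handling here are illustrative rather than exact:

// Pre-fence object: the CPU signals, the encoder (IEP) waits.
NvSciSyncAttrList signaler_attrs = nullptr;
NvSciSyncAttrList waiter_attrs = nullptr;
NvSciSyncAttrListCreate(sci_sync_module_, &signaler_attrs);
NvSciSyncAttrListCreate(sci_sync_module_, &waiter_attrs);

bool cpu_access = true;
NvSciSyncAccessPerm perm = NvSciSyncAccessPerm_SignalOnly;
NvSciSyncAttrKeyValuePair cpu_signaler_pairs[] = {
    {NvSciSyncAttrKey_NeedCpuAccess, &cpu_access, sizeof(cpu_access)},
    {NvSciSyncAttrKey_RequiredPerm, &perm, sizeof(perm)},
};
NvSciSyncAttrListSetAttrs(signaler_attrs, cpu_signaler_pairs, 2);
// Let the encoder fill in its waiter requirements.
NvMediaIEPFillNvSciSyncAttrList(encoder_context_, waiter_attrs, NVMEDIA_WAITER);

// Reconcile both lists and allocate the pre-sync object.
NvSciSyncAttrList unreconciled[] = {signaler_attrs, waiter_attrs};
NvSciSyncAttrList reconciled = nullptr;
NvSciSyncAttrList conflicts = nullptr;
NvSciSyncAttrListReconcile(unreconciled, 2, &reconciled, &conflicts);
NvSciSyncObjAlloc(reconciled, pre_sync_obj_);

// The EOF object is built the same way, but with the encoder as the signaler
// (NVMEDIA_SIGNALER) and the CPU as a waiter (NvSciSyncAccessPerm_WaitOnly).
// The CPU wait context used by NvSciSyncFenceWait is allocated once:
NvSciSyncCpuWaitContextAlloc(sci_sync_module_, &cpu_wait_context_);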

Can you help me understand if I'm misusing the encoder?

Is it fence or eof_fence in the above snippet?

It's eof_fence. There is no such issue in the actual code; I had sanitized some of it for this post.
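So that call should read:

NvMediaIEPGetEOFNvSciSyncFence(encoder_context_, *eof_sync_obj_, &eof_fence);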