Vulkan interop for Video Codec SDK as seen in the AppMotionEstimationVkCuda


I am rendering an offscreen image with Vulkan, which I would like to encode using the Video Codec SDK. I followed the AppMotionEstimationVkCuda example as a reference for how to supply the images to the Video Codec SDK. Are there any other publicly available examples of this kind of interop, other than the one mentioned?

My problem is with external semaphores in Vulkan. I also present the offscreen image on screen so I can watch the scene while it is being encoded, and this is where the semaphore problems begin. The external semaphore has to be used as a signal semaphore when rendering finishes and as a wait semaphore in the presentation queue submit. It seems that the Vulkan SDK validation layers are incomplete when it comes to external semaphores, or my usage of them was not expected. Validation reports that my external semaphore cannot be signaled. This is due to the way the validation layer tracks semaphore state: its conditions only properly track internal semaphores, yet it does not hesitate to report errors for external semaphores as well, even though no signal tracking is done for them. My Vulkan app ran without any validation errors until I started calling vkGetSemaphoreWin32HandleKHR. Has anyone else run into these issues?
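For context, this is roughly the setup I mean: the semaphore is created with an export-capable handle type and the Win32 handle is then retrieved for the CUDA side. A minimal sketch (assuming a Windows target with `VK_KHR_external_semaphore_win32` enabled on the device; `device` and error handling are left out for brevity):

```cpp
#define VK_USE_PLATFORM_WIN32_KHR
#include <windows.h>
#include <vulkan/vulkan.h>

// Create a binary semaphore that can be exported as an opaque Win32 handle.
VkSemaphore CreateExportableSemaphore(VkDevice device) {
    VkExportSemaphoreCreateInfo exportInfo{};
    exportInfo.sType = VK_STRUCTURE_TYPE_EXPORT_SEMAPHORE_CREATE_INFO;
    exportInfo.handleTypes = VK_EXTERNAL_SEMAPHORE_HANDLE_TYPE_OPAQUE_WIN32_BIT;

    VkSemaphoreCreateInfo semInfo{};
    semInfo.sType = VK_STRUCTURE_TYPE_SEMAPHORE_CREATE_INFO;
    semInfo.pNext = &exportInfo;  // mark the semaphore as exportable at creation time

    VkSemaphore sem = VK_NULL_HANDLE;
    vkCreateSemaphore(device, &semInfo, nullptr, &sem);
    return sem;
}

// Export the Win32 handle that will be imported on the CUDA side.
HANDLE ExportSemaphoreHandle(VkDevice device, VkSemaphore sem) {
    VkSemaphoreGetWin32HandleInfoKHR getInfo{};
    getInfo.sType = VK_STRUCTURE_TYPE_SEMAPHORE_GET_WIN32_HANDLE_INFO_KHR;
    getInfo.semaphore = sem;
    getInfo.handleType = VK_EXTERNAL_SEMAPHORE_HANDLE_TYPE_OPAQUE_WIN32_BIT;

    // Extension entry point; must be loaded dynamically.
    auto pfnGetHandle = reinterpret_cast<PFN_vkGetSemaphoreWin32HandleKHR>(
        vkGetDeviceProcAddr(device, "vkGetSemaphoreWin32HandleKHR"));

    HANDLE handle = nullptr;
    pfnGetHandle(device, &getInfo, &handle);
    return handle;
}
```

The same semaphore is then signaled by the rendering submit and waited on by the presentation submit, which is the usage that triggers the validation complaints.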

You can look at how mpv handles this via libplacebo. You’ll need to do a fair amount of reading, as libplacebo wraps the semaphore usage, but it’s the only complete real-world usage of interop that I’m aware of.

In addition to what ‘langdalepl’ suggested, you can also take a look at the CUDA samples. There are a few that demonstrate how to use Vulkan–CUDA interop, e.g. simpleVulkan, simpleVulkanMMAP, and vulkanImageCUDA.
Those samples, along with the sample from the NVIDIA Video Codec SDK, should help you.
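On the CUDA side, those samples import the exported Win32 handle with the external-resource interop API and then wait on or signal the semaphore from a CUDA stream. A minimal sketch (assuming `vkHandle` is the handle obtained from vkGetSemaphoreWin32HandleKHR; error checking omitted):

```cpp
#include <windows.h>
#include <cuda_runtime.h>

// Import the Vulkan semaphore into CUDA via its opaque Win32 handle.
cudaExternalSemaphore_t ImportVulkanSemaphore(HANDLE vkHandle) {
    cudaExternalSemaphoreHandleDesc desc{};
    desc.type = cudaExternalSemaphoreHandleTypeOpaqueWin32;
    desc.handle.win32.handle = vkHandle;

    cudaExternalSemaphore_t extSem = nullptr;
    cudaImportExternalSemaphore(&extSem, &desc);
    return extSem;
}

// Make a CUDA stream wait until Vulkan signals the semaphore
// (i.e., until rendering of the offscreen image has finished),
// then signal it back so Vulkan can reuse the image.
void SyncWithVulkan(cudaExternalSemaphore_t extSem, cudaStream_t stream) {
    cudaExternalSemaphoreWaitParams waitParams{};
    cudaWaitExternalSemaphoresAsync(&extSem, &waitParams, 1, stream);

    // ... enqueue the encode/copy work on `stream` here ...

    cudaExternalSemaphoreSignalParams signalParams{};
    cudaSignalExternalSemaphoresAsync(&extSem, &signalParams, 1, stream);
}
```

This mirrors the structure used in simpleVulkan and AppMotionEstimationVkCuda; the exact ordering of waits and signals depends on how your render/present/encode loop is organized.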

If you still run into any issues, please provide a minimal source-code reproducer so we can help.