I'm interested in how the NVIDIA Optical Flow (OF) API could be used for frame interpolation with FFmpeg

FFmpeg lacks a real-time filter for motion interpolation. There is minterpolate, but it is too slow in anything other than its simplistic frame-blending mode (mi_mode=blend); its motion-compensated mode (mi_mode=mci) is not usable in real time.

From what I can tell, the OF API just deals with generating the motion vectors. So what would be the broad approach to taking the output and producing intermediate frames? Would it use CUDA or some other API?
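
To make the question more concrete, below is the sort of post-processing step I imagine would be needed: a plain CUDA kernel that takes the two source frames plus per-pixel flow (assuming the OF API's coarse block-grid output has already been upsampled to per-pixel vectors) and backward-warps and blends them into an intermediate frame at time t. This is purely my own rough sketch under a linear-motion assumption, not anything from the SDK; it ignores occlusions entirely, and the buffer names and main() skeleton are just placeholders.

```
#include <cuda_runtime.h>

// Clamp-to-border bilinear sample from a single-channel float image.
__device__ float sampleBilinear(const float* img, int w, int h, float x, float y)
{
    x = fminf(fmaxf(x, 0.0f), (float)w - 1.001f);
    y = fminf(fmaxf(y, 0.0f), (float)h - 1.001f);
    int x0 = (int)x, y0 = (int)y;
    float fx = x - x0, fy = y - y0;
    float p00 = img[y0 * w + x0];
    float p01 = img[y0 * w + x0 + 1];
    float p10 = img[(y0 + 1) * w + x0];
    float p11 = img[(y0 + 1) * w + x0 + 1];
    return (1.0f - fy) * ((1.0f - fx) * p00 + fx * p01) +
           fy          * ((1.0f - fx) * p10 + fx * p11);
}

// flowX/flowY: per-pixel forward flow (frame0 -> frame1) in pixels.
// Linear-motion approximation: the pixel at p in the intermediate frame is
// fetched from frame0 at p - t*flow and from frame1 at p + (1-t)*flow,
// then blended. No occlusion handling at all -- a real interpolator needs it.
__global__ void interpolateFrame(const float* frame0, const float* frame1,
                                 const float* flowX, const float* flowY,
                                 float* out, int w, int h, float t)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int idx = y * w + x;
    float fx = flowX[idx];
    float fy = flowY[idx];

    float fromPrev = sampleBilinear(frame0, w, h, x - t * fx,          y - t * fy);
    float fromNext = sampleBilinear(frame1, w, h, x + (1.0f - t) * fx, y + (1.0f - t) * fy);

    out[idx] = (1.0f - t) * fromPrev + t * fromNext;
}

int main()
{
    const int w = 1920, h = 1080;
    const size_t bytes = (size_t)w * h * sizeof(float);

    // In a real pipeline frame0/frame1 would be decoded luma planes and
    // flowX/flowY the upsampled output of the optical-flow stage.
    float *dFrame0, *dFrame1, *dFlowX, *dFlowY, *dOut;
    cudaMalloc(&dFrame0, bytes);
    cudaMalloc(&dFrame1, bytes);
    cudaMalloc(&dFlowX, bytes);
    cudaMalloc(&dFlowY, bytes);
    cudaMalloc(&dOut, bytes);

    dim3 block(16, 16);
    dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
    // t = 0.5 gives the midpoint frame, i.e. a simple 2x up-conversion.
    interpolateFrame<<<grid, block>>>(dFrame0, dFrame1, dFlowX, dFlowY, dOut, w, h, 0.5f);
    cudaDeviceSynchronize();

    cudaFree(dFrame0); cudaFree(dFrame1); cudaFree(dFlowX);
    cudaFree(dFlowY); cudaFree(dOut);
    return 0;
}
```

Presumably a production-quality interpolator also needs occlusion detection and flow-consistency checks on top of something like this, which is really what I'm asking about.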

Or are there any examples out there of how the NVIDIA OF API is being used to perform framerate up conversion?

The only thing I’ve found so far is SVP (SmoothVideo Project), which now uses OF, but I think that is a closed-source project.

Hi.

are there any examples out there of how the NVIDIA OF API is being used to perform framerate up conversion?

Supporting such a use case is on our roadmap, and we may add something for it in a future SDK release.

Thanks.

That would be great.

Is there any crossover with the NGX Super SloMo feature? At the moment the SloMo model is too slow for real-time motion interpolation, but it would be great to have a real-time, AI-based framerate up-conversion solution. Maybe some sort of simplified version of the NGX SloMo code could be used?