FFmpeg lacks a real-time filter for motion interpolation. There is minterpolate, but it's too slow for real time in anything other than its simplistic frame-blending mode.
From what I can tell, the NVIDIA Optical Flow (OF) API only handles generating the motion vectors. So what would be the broad approach to taking that output and producing intermediate frames? Would it use CUDA or some other API?
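To make the question concrete, here is the kind of thing I imagine the post-processing step would do, as a rough pure-NumPy sketch (this is my own illustration, not the NVIDIA API; the function name and the nearest-neighbour warp are my simplifications — a real implementation would presumably do this on the GPU with bilinear sampling and occlusion handling):

```python
import numpy as np

def interpolate_midframe(frame0, frame1, flow):
    """Synthesize a frame at t=0.5 from two source frames and a
    per-pixel flow field, where flow[y, x] = (dx, dy) maps a pixel
    in frame0 to its position in frame1.

    For each output pixel we sample frame0 half a step *against*
    the flow and frame1 half a step *along* it, then blend.
    Nearest-neighbour warp only; hypothetical sketch, not the
    NVIDIA OF API."""
    h, w = frame0.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Sample frame0 half a step backward along the flow.
    x0 = np.clip(np.rint(xs - 0.5 * flow[..., 0]), 0, w - 1).astype(int)
    y0 = np.clip(np.rint(ys - 0.5 * flow[..., 1]), 0, h - 1).astype(int)
    # Sample frame1 half a step forward along the flow.
    x1 = np.clip(np.rint(xs + 0.5 * flow[..., 0]), 0, w - 1).astype(int)
    y1 = np.clip(np.rint(ys + 0.5 * flow[..., 1]), 0, h - 1).astype(int)
    return 0.5 * frame0[y0, x0] + 0.5 * frame1[y1, x1]
```

For example, a bright pixel at x=2 in frame0 that moves to x=6 in frame1 (flow dx=4) should land at x=4 in the interpolated frame. Is this roughly the right mental model, and is the warp step where CUDA would come in?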
Or are there any examples out there of the NVIDIA OF API being used to perform frame rate upconversion?
The only thing I've found so far is SVP (SmoothVideo Project), which now uses the OF API, but I believe that is a closed-source project.