UE4 Issue: Omni USD Sequence for Live In-Editor Take Recording

Hi all,

We are currently testing a possible integration of Omniverse into our VP pipeline, both for live in-editor take recording and for LED wall content. At the moment we have only one issue left to solve, which is related to how Movie Render Queue and Take Recorder work with animation in UE4.

Our issue is that the sequence created from the USD animation is not accessible in any way other than opening it from the Level Sequence actor. It does not exist anywhere in the engine as a discoverable asset that could be referenced, for example, as a subscene track or played by other sequencers. This prevents us from playing the animation during the live recording of our tracked camera. Our workflow is fully live in-editor, not Play In Editor / game mode.
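For context, a minimal editor-Python sketch that just lists the Level Sequence actors in the level and the paths of the sequences they play (assuming the connector exposes the animation through a regular LevelSequenceActor) could look like the following; the USD-generated sequence shows up here, but not under a normal /Game/ content path, which is what makes it unreferenceable:

```python
import unreal

# List every Level Sequence actor in the current level and the path of the
# sequence it plays. (EditorLevelLibrary comes from the Editor Scripting
# Utilities plugin.)
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.LevelSequenceActor):
        seq = actor.get_sequence()
        if seq:
            print("{}: {}".format(actor.get_actor_label(), seq.get_path_name()))
```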

So far we have even tried copying all the keys from the USD sequence into another sequencer track, but for some reason that causes UE4 to lose ~100 ms of performance when playing those same transforms. The original animation sequence from USD plays smoothly at only a 2-6 ms cost. We are still trying to find out what is causing this odd behaviour of the pasted keys.

There is also a bug where using the sequence export option in the Sequencer menu always crashes the engine; my rough guess would be that it doesn't have any standard sequence asset to export from.

Another issue: when we rely only on the autoplay feature, with the USD sequence as a standalone sequence in the level while rendering through Movie Render Queue, it always renders out of sync. Most of the time it loops several times faster than everything else in the master sequence, even when the frame rate of all sequencers is the same.

So my main question is: could there be any way to make this sequence accessible, similar to how imported MDL materials work? Or is there any other workaround that would allow us to use it as a subsequence of other sequencers?

If we were able to expose it to the engine as a normal sequence asset, most of the issues mentioned above would become irrelevant. Thank you in advance for any comments or suggestions.

Tom / VP Technical Artist

Tom, you have discovered some of the challenges we encounter due to the transient nature of some of the Unreal-specific data we maintain. I’ve summarized the issues here and would love to try and squash these as we try to make your workflow operate correctly. Thank you for your specific feedback.

Issues:

  • USD animation not accessible as a subscene track or playable by other sequencers
  • Crash when using sequence export from the Sequencer menu
  • Out-of-sync rendering when the autoplay feature is used as a standalone sequence in the level during Movie Render Queue rendering; it also loops faster than anything else in the master sequence

-Lou

Thank you very much for your response and for looking into it.

If we manage to overcome this, we would love to start pushing Omniverse into our regular production pipeline, as USD is just so much cleaner and faster for all our departments than any other option, even more so with those Omniverse connectors and live updates.

For example, yesterday our programmer managed to get hold of a reference to the Omni USD sequence via a custom script and create an exact copy of it as our own sequence, but sadly it still suffers from that extreme 80-100 ms engine slowdown when we play the exact same animation tracks this way, outside the sequence created by Omniverse.
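For anyone curious, a minimal editor-Python sketch of that kind of copy might look roughly like the following; the destination path and asset name are just placeholders, and this is an approximation of the general approach rather than our exact script:

```python
import unreal

def duplicate_usd_sequence(dest_path="/Game/Cinematics", dest_name="USD_Sequence_Copy"):
    """Find the first LevelSequenceActor pointing at a sequence and duplicate
    that sequence into a normal, saveable asset in the content browser."""
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if isinstance(actor, unreal.LevelSequenceActor):
            src = actor.get_sequence()
            if src:
                asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
                # duplicate_asset returns the new asset, or None if it failed
                return asset_tools.duplicate_asset(dest_name, dest_path, src)
    return None
```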

The slowdown also seems to be directly related to the number of tracks played: it drops to ~50 ms when playing 4,000 tracks instead of 8,000, while in the original sequence we can smoothly play over 16,000 transform tracks at once with no performance loss.

In a similar way we also managed to insert the Omniverse track as a subtrack into another sequence, but unfortunately the result is the same as with the previous attempt to copy it as our own sequence asset. This method also doesn't allow us to permanently apply quaternion interpolation to the transform tracks of the USD sequence in UE4, because the setting reverts after the session ends. Quaternion interpolation very often helps us solve some of the animations we need to process from Maya.
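A minimal sketch of that kind of subtrack insertion in editor Python might look like this (here master_seq would be our own LevelSequence asset and usd_seq the transient sequence obtained as above; both names are placeholders):

```python
import unreal

def add_usd_as_subsequence(master_seq, usd_seq):
    """Add the (transient) USD sequence as a subsequence of our own sequence."""
    sub_track = master_seq.add_master_track(unreal.MovieSceneSubTrack)
    section = sub_track.add_section()
    section.set_sequence(usd_seq)
    # Match the subsection range to the source sequence's playback range (in frames).
    section.set_range(usd_seq.get_playback_start(), usd_seq.get_playback_end())
    return section
```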

So it seems that even though we are able to access it through some programming hacks, which we could somewhat automate on level load and before renders, the huge millisecond drop on more complex tracks prevents us from using it this hacked way for now.