Is there a way to obtain Audio2Gesture output the same way as with the Audio2Face headless server? I am trying to livestream Audio2Gesture to a UE5 character. OmniLive only seems to work for adjusting assets rather than syncing character animation.
Hi, same question here, let's look for the answer together. I can share how I built the graphs in A2F, but I don't know how to connect that approach to streaming. From the looks of it, this is the only option. I'm also considering writing an OSC server plugin and streaming the motion data through it.
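If I go the OSC route, this is roughly what I have in mind: a minimal sketch using python-osc on the sender side and UE5's built-in OSC plugin on the receiving side. The joint names, OSC address pattern, host, and port below are placeholders I made up, and this is not an official Audio2Gesture API.

```python
# Rough sketch (not the A2G API): push per-frame joint rotations to UE5 over OSC.
# UE5's OSC plugin can receive these messages on a Blueprint-bound OSC server.
import time
from pythonosc import udp_client

UE_HOST = "127.0.0.1"   # machine running UE5 (assumption)
UE_PORT = 8000          # port the UE5 OSC server listens on (assumption)

client = udp_client.SimpleUDPClient(UE_HOST, UE_PORT)

# hypothetical joints driven by the gesture output
JOINTS = ["spine_01", "upperarm_l", "upperarm_r", "head"]

def send_frame(frame_rotations):
    """frame_rotations: dict of joint name -> (pitch, yaw, roll) in degrees."""
    for joint, (pitch, yaw, roll) in frame_rotations.items():
        # one OSC message per joint; the UE5 Blueprint dispatches on the address
        client.send_message(f"/a2g/{joint}", [pitch, yaw, roll])

if __name__ == "__main__":
    # example: stream a dummy 30 fps loop until the real A2G values are wired in
    while True:
        dummy = {j: (0.0, 0.0, 0.0) for j in JOINTS}
        send_frame(dummy)
        time.sleep(1 / 30)
```

On the UE5 side the idea would be to bind an OSC server to the same port, dispatch by address, and feed the rotations into the character's Anim Blueprint, but I haven't tested any of this yet.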