In this release, we’ve added the Omniverse Live Link plugin, enabling a workflow that live-streams animation from Audio2Face to Unreal Engine, where it can be connected to a MetaHuman character.
The Audio2Face Live Link plugin lets creators stream animated facial blendshape weights and audio into Unreal Engine for playback on a character. Either Audio2Face or Avatar Cloud Engine (ACE) can stream the facial animation and audio. With very little configuration, a MetaHuman character can be set up to receive streamed facial animation, and the plugin can also be used with many other character types given the correct mapping pose assets.
To install the plugin:
1. Navigate to the directory shown below in your Audio2Face installation folder:
\audio2face-version\ue-plugins\
2. Open the subfolder matching your UE version.
3. Find the “OmniverseLiveLink” directory.
4. Copy the “OmniverseLiveLink” directory into the Plugins directory of your UE project.
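For reference, the copy step can be sketched in a POSIX shell. All paths below are made-up stand-ins (substitute your actual Audio2Face install folder and UE project); the sketch even stages a dummy plugin folder first so the commands run anywhere:

```shell
# Stand-in paths for demonstration only -- replace with your real locations.
SRC_PLUGINS="/tmp/a2f-demo/ue-plugins/plugin-for-your-ue-version"
UE_PROJECT="/tmp/a2f-demo/MyUEProject"

# Stage a dummy source tree so this sketch is runnable as-is.
mkdir -p "$SRC_PLUGINS/OmniverseLiveLink"
touch "$SRC_PLUGINS/OmniverseLiveLink/OmniverseLiveLink.uplugin"

# The actual install step: copy the plugin into the project's Plugins folder.
mkdir -p "$UE_PROJECT/Plugins"
cp -r "$SRC_PLUGINS/OmniverseLiveLink" "$UE_PROJECT/Plugins/"
```

On Windows the same step is a drag-and-drop copy of the OmniverseLiveLink folder into your project’s Plugins folder in Explorer.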
Great job! I downloaded and tried this version, and streaming from Audio2Face → Unreal Engine works fine. (I am using 5.0.2.)
However, I have one issue. I use a female voice track in Audio2Face, yet when it’s streamed into Unreal, the voice becomes male … Does anyone know how to resolve this?
Thanks for the answer. It was indeed the audio sampling rate (my wave file is 22.05 kHz).
I have another issue.
I am sending a long wave track (50 seconds) via the headless Audio2Face API and streaming it to Unreal.
After some intervals, the face and the audio go out of sync.
I am thinking of chunking the long wave track into smaller pieces (one per sentence) and then batching them to Audio2Face. However, after combing through the API, I cannot find a start_play call that works in batch mode. There is only the ExportBlendshapes API in batch mode (which will batch the face weights to Unreal, but does not stream the audio).
Would there be a start_play API in batch mode? If not, what could be a possible solution for this use case?
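Not an official Audio2Face API, but as a sketch of the chunking idea: the long track can be pre-split into fixed-length pieces with Python’s standard wave module, and each piece then pushed through the normal single-track set-track/start_play flow in turn, so the stream re-syncs at every chunk boundary:

```python
# Sketch only: splits a long WAV into consecutive fixed-length chunk files.
# Wiring each chunk into the headless A2F client is left to your own code.
import wave

def chunk_wav(src_path, out_prefix, chunk_seconds=10):
    """Write src_path out as consecutive chunks; return the chunk file names."""
    names = []
    with wave.open(src_path, "rb") as src:
        params = src.getparams()
        frames_per_chunk = int(params.framerate * chunk_seconds)
        index = 0
        while True:
            frames = src.readframes(frames_per_chunk)
            if not frames:
                break
            name = f"{out_prefix}_{index:03d}.wav"
            with wave.open(name, "wb") as dst:
                # Same channel count, sample width, and rate as the source;
                # the wave module fixes up the frame count on close.
                dst.setparams(params)
                dst.writeframes(frames)
            names.append(name)
            index += 1
    return names
```

Each chunk keeps the original sample rate and sample width, so whatever calls you already use for a single track can be reused per chunk. Splitting on sentence boundaries (e.g. from silence detection) would avoid cutting words in half, but that is beyond this sketch.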
@justin123 Could you kindly start a new thread in the Audio2Face Forum regarding this issue? It would greatly improve visibility and increase the chances of getting helpful answers. Thanks!
@Ehsan.HM this plugin only supports 5.1.15 and 5.2.14 (two of the three subfolders in the “audio2face-ue-plugins” folder) and is not backwards compatible, correct?
We don’t currently have plans to support older versions of Unreal. We might do this in the future though, just no promises :)
That said, the currently released versions can be used as a reference for making an Unreal 4 compatible version. The name of the extension is omni.avatar.livelink, and it can be found on disk using the extension manager.
I’m looking into how to use Audio2Face in Unreal Engine, but currently I have to run both Omniverse and UE5 at the same time. My program receives data in real time from the real world and converts it to an audio file. Can I use Audio2Face in a compiled (packaged) version of a UE project? Or can I use Audio2Face without running Omniverse?
Also, I want to use Audio2Face from code, not through the GUI. Is this possible?
Using Audio2Face in Unreal Engine (UE) with Omniverse:
Audio2Face is designed to work with NVIDIA Omniverse, which means that for optimal functionality, both Omniverse and UE5 might need to run concurrently. This is especially true if you’re using the standard setup without modifications.
Using Audio2Face in a Compiled Version of UE:
Typically, plugins or tools like Audio2Face should be usable within a compiled version of a UE project, provided they are correctly integrated and all dependencies are met. However, the specific integration details and compatibility might vary based on the versions and any updates to either Audio2Face or UE.
Using Audio2Face without Omniverse:
While Audio2Face is designed to work seamlessly with Omniverse, using it independently might require some workarounds or custom integrations. It’s essential to check the documentation or consult with NVIDIA’s support to understand the feasibility and any limitations.
Using Audio2Face Programmatically:
If you prefer to use Audio2Face within code and not through the GUI, you’d likely need to delve into the API or SDK provided by NVIDIA for Audio2Face. This would allow for more customized and automated workflows, but it might also require a deeper understanding of the tool’s internals.
In conclusion, while your goals seem achievable, they might require a combination of standard procedures and custom solutions. It’s always a good idea to consult directly with the tool’s developers (in this case, NVIDIA) to ensure a smooth integration.
How can I compile a UE project with LiveLink PlayerStreaming enabled?
Whenever I try to package a UE project with Omniverse LiveLink enabled from within UE, I get an error saying “Expecting to find a type to be declared in a module rules named ‘OmniverseAudioMixer’ in UE5Rules”.
Go to the \Plugins\NVIDIA\OmniverseLiveLink folder and open the “OmniverseLiveLink.uplugin” file for editing (in Notepad or any plain-text editor).
Locate the line “Type”: “Editor” and change “Editor” to “Runtime”, so the line reads:
“Type”: “Runtime”
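For context, a .uplugin file is plain JSON, so after the edit the relevant section would look something like the fragment below. The surrounding field values here are illustrative (your file’s actual contents will differ); only the “Type”: “Runtime” module setting is the point:

```json
{
  "FileVersion": 3,
  "FriendlyName": "Omniverse LiveLink",
  "Modules": [
    {
      "Name": "OmniverseLiveLink",
      "Type": "Runtime",
      "LoadingPhase": "Default"
    }
  ]
}
```

“Editor” modules are stripped from packaged builds, which is why switching the module type to “Runtime” lets the plugin survive packaging.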
Make sure you are using the latest connector in your project.
Let us know if that works.
The source code is available, and you can try compiling the plugin for any new version of UE yourself. But if you are not familiar with the procedure, you can wait for the next release, which is around November.