Audio2Face Live Link MetaHuman audio query

Hey guys, I just want to check something regarding the audio portion of the Live Link.

My goal is to set up a listener of sorts that triggers an action on the MetaHuman when audio arrives over the Live Link.

I’ve successfully set up the Live Link, but I am not sure where the audio is being assigned/played on the Unreal Engine side of things.

I’ve tried the On Live Link Updated event, but it doesn’t work for this, since the A2F app keeps streaming animation data for the idle face even when no audio is playing.

Appreciate any insights regarding this matter. Thx in advance. :)

I am trying the same thing. On Live Link Updated basically works like a tick inside the editor. Audio Component → IsPlaying returns false even while audio is playing.
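For reference, here is roughly what that check looks like in C++ (assuming the MetaHuman actor even owns a UAudioComponent, which may be the wrong assumption here; `MetaHumanActor` is a placeholder for however you reference the character):

```cpp
#include "Components/AudioComponent.h"

// Inside the handler bound to On Live Link Updated:
if (UAudioComponent* Audio = MetaHumanActor->FindComponentByClass<UAudioComponent>())
{
    // Stays false during A2F playback in my tests -- presumably because the
    // A2F plugin does not route its audio through an AudioComponent at all.
    const bool bPlaying = Audio->IsPlaying();
}
```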

I also tried creating an OSC server (on port 12031) from the Live Link instance object reference and binding an event to On OSC Message Received, but that did not help either.
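For completeness, the OSC attempt looked roughly like this, using UE's built-in OSC plugin (the CreateOSCServer signature below is the UE5 one, older engine versions take fewer arguments; `AMyActor` and `HandleOscMessage` are placeholders, and I'm not sure the A2F Live Link plugin emits OSC at all, which would explain why the event never fires):

```cpp
#include "OSCServer.h"
#include "OSCManager.h"

// Listen on all interfaces, port 12031.
UOSCServer* Server = UOSCManager::CreateOSCServer(
    TEXT("0.0.0.0"), 12031,
    /*bMulticastLoopback=*/false, /*bStartListening=*/true,
    TEXT("A2FOscServer"), /*Outer=*/this);

// The handler must be a UFUNCTION matching the plugin's delegate, e.g.:
// void HandleOscMessage(const FOSCMessage& Message, const FString& IPAddress, int32 Port);
Server->OnOscMessageReceived.AddDynamic(this, &AMyActor::HandleOscMessage);
```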

> “I am not sure where the audio is being assigned/played on the Unreal Engine side of things”

The Audio2Face Live Link documentation says: “When audio is streamed to the A2F Live Link plugin it is replayed using the SubmixListener.” Maybe there is a way to check whether that SubmixListener is playing inside Unreal and take action from there.
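If that’s the case, one way to act on it might be a submix envelope follower: point it at whichever submix the plugin’s SubmixListener feeds (I’m assuming the Master submix by default, which is a guess) and treat “envelope above a threshold” as “A2F audio is playing”. StartEnvelopeFollowing and AddEnvelopeFollowerDelegate are engine API on USoundSubmix; everything else below (class name, threshold, event) is my own naming. Untested sketch:

```cpp
// A2FAudioActivityDetector.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Sound/SoundSubmix.h"
#include "A2FAudioActivityDetector.generated.h"

DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnA2FAudioStarted);

UCLASS()
class AA2FAudioActivityDetector : public AActor
{
    GENERATED_BODY()

public:
    // Submix the A2F audio is routed to -- assign in the editor.
    UPROPERTY(EditAnywhere, Category = "A2F")
    USoundSubmix* MonitoredSubmix = nullptr;

    // Envelope level above which we consider audio "playing" (tune to taste).
    UPROPERTY(EditAnywhere, Category = "A2F")
    float ActivityThreshold = 0.01f;

    // Fired once each time the submix goes from silent to active.
    UPROPERTY(BlueprintAssignable, Category = "A2F")
    FOnA2FAudioStarted OnA2FAudioStarted;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (MonitoredSubmix)
        {
            // Engine-provided envelope follower on the submix output.
            MonitoredSubmix->StartEnvelopeFollowing(this);

            FOnSubmixEnvelopeBP EnvelopeDelegate;
            EnvelopeDelegate.BindDynamic(this, &AA2FAudioActivityDetector::HandleEnvelope);
            MonitoredSubmix->AddEnvelopeFollowerDelegate(this, EnvelopeDelegate);
        }
    }

    UFUNCTION()
    void HandleEnvelope(const TArray<float>& Envelope)
    {
        // One envelope value per channel; take the loudest.
        float Peak = 0.f;
        for (const float Sample : Envelope)
        {
            Peak = FMath::Max(Peak, Sample);
        }

        const bool bActive = Peak > ActivityThreshold;
        if (bActive && !bWasActive)
        {
            OnA2FAudioStarted.Broadcast();
        }
        bWasActive = bActive;
    }

private:
    bool bWasActive = false;
};
```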

I am using a Live Link Controller in UE. Is there any event that is triggered when audio or animation starts to stream? I notice that the status indicator goes green in the UI. Could some custom event be added to the A2F Live Link controller that indicates whether blendshapes are streaming or not? I am aware this is more of a UE question than an A2F one, but any hints you can share, whether on the development or testing side, would be greatly appreciated.
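In the meantime, a fallback I’m considering: poll one of the facial animation curves each tick and treat “the value keeps changing” as “blendshapes are streaming”. Untested sketch; the curve name JawOpen is just an example, use whichever curve your A2F mapping actually drives:

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"

// Returns how much the curve moved since the last call; a delta above some
// small epsilon over several frames suggests the stream is actively animating.
float GetFacialCurveDelta(USkeletalMeshComponent* Face, float& InOutPrevValue)
{
    if (UAnimInstance* Anim = Face ? Face->GetAnimInstance() : nullptr)
    {
        const float Current = Anim->GetCurveValue(TEXT("JawOpen"));
        const float Delta = FMath::Abs(Current - InOutPrevValue);
        InOutPrevValue = Current;
        return Delta;
    }
    return 0.f;
}
```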

Custom variables are available in Blueprint. I would consider using the subject name, but anything can be used.
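For example, something along these lines checks from C++ whether the A2F Live Link subject is currently present (the subject name “Audio2Face” is a placeholder; use the name your source actually shows in the Live Link window):

```cpp
#include "ILiveLinkClient.h"
#include "Features/IModularFeatures.h"

// True if a Live Link subject with the given name currently exists.
bool IsSubjectPresent(const FName SubjectName)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return false;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Skip disabled subjects, include virtual ones.
    for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(false, true))
    {
        if (Key.SubjectName.Name == SubjectName) // e.g. TEXT("Audio2Face")
        {
            return true;
        }
    }
    return false;
}
```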
