Can I use Audio2Face in headless mode with Unreal Engine MetaHumans and the LiveLink plugin?

How can I use Audio2Face in headless mode with Unreal Engine MetaHumans and the LiveLink plugin?
I know that I can use the API to send audio files to Audio2Face and start playing the audio the same way. But can I run Audio2Face in headless mode, start an Unreal Engine 5 project, send a TTS sound file, and play the animation using LiveLink and API requests?

Yes, this is possible. I tried it using Python like so:

import requests

base_url = "http://localhost:8011"

# check that the A2F headless server is reachable before sending commands
status = requests.get(base_url + "/status").json()
if status != "OK":
    raise SystemExit("ERROR: unable to reach A2F")

# load a USD scene containing the Audio2Face pipeline
data = {"file_name": "path_to_usd_file.usd"}
response = requests.post(base_url + "/A2F/USD/Load", json=data).json()
print(response)

# activate the StreamLivelink node so A2F streams animation data to the LiveLink plugin
data = {"node_path": "/World/audio2face/StreamLivelink", "value": True}
response = requests.post(base_url + "/A2F/Exporter/ActivateStreamLivelink", json=data).json()
print(response)

# start playback; the resulting animation is streamed to Unreal Engine via LiveLink
data = {"a2f_player": "/World/audio2face/Player"}
response = requests.post(base_url + "/A2F/Player/Play", json=data).json()
print(response)
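
To cover the "send a TTS sound file" part of the question, the same REST API also exposes player endpoints for pointing Audio2Face at an audio track before calling Play. Here is a minimal sketch, assuming the /A2F/Player/SetRootPath and /A2F/Player/SetTrack endpoints and the parameter names a2f_player, dir_path, and file_name; the paths and file names are placeholders, so verify everything against the interactive API docs your headless server exposes (e.g. http://localhost:8011/docs) for your Audio2Face version:

# Hypothetical continuation: point the A2F player at a TTS wav file, then play.
# Endpoint and parameter names below are assumptions -- check them against the
# API docs served by your local A2F headless instance.
import requests

base_url = "http://localhost:8011"
player = "/World/audio2face/Player"

# set the directory that the player reads audio files from (placeholder path)
data = {"a2f_player": player, "dir_path": "C:/path/to/tts_output"}
print(requests.post(base_url + "/A2F/Player/SetRootPath", json=data).json())

# select the TTS wav file inside that directory as the current track (placeholder name)
data = {"a2f_player": player, "file_name": "tts_output.wav"}
print(requests.post(base_url + "/A2F/Player/SetTrack", json=data).json())

# play it; with StreamLivelink active, the animation streams to the MetaHuman in UE5
data = {"a2f_player": player}
print(requests.post(base_url + "/A2F/Player/Play", json=data).json())

On the Unreal side nothing changes compared to the non-headless workflow: the LiveLink plugin just needs to see the A2F stream as a subject driving the MetaHuman.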