In a November 10th posting on using Audio2Face in headless mode with the Unreal Engine MetaHuman and Live Link plugins, there was a reference to a piece of Python code that, when run, seemed to have no effect. It only produced errors such as:

[Detail: Method Not Allowed]
[Status: error, Message: /world/audio2face/streamlivelink is not valid]
[Status: error, Message: /world/audio2face/player-Regular player not found]
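For reference, this is the kind of request I am attempting. It is only a sketch of my understanding: I am assuming the default headless REST port (8011) and taking the endpoint name and payload fields from the headless app's Swagger page, and the node path is my guess, which the error above suggests is wrong.

```python
import requests

# Base URL of the Audio2Face headless REST API.
# Assumption: default port 8011; adjust if the headless app is started
# on a different port.
A2F_URL = "http://localhost:8011"

# Attempt to activate the Stream Live Link export node.
# Assumption: the endpoint and the node_path value below are what the
# API expects -- the "is not valid" error suggests my path is not right,
# which is part of what I am asking about.
resp = requests.post(
    f"{A2F_URL}/A2F/Exporter/ActivateStreamLivelink",
    json={"node_path": "/world/audio2face/streamlivelink", "value": True},
)
print(resp.status_code, resp.json())
```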
Please, is it possible to have a more complete set of tutorials showing how to properly use the API to drive a packaged Unreal Engine MetaHuman model that calls Audio2Face and Live Link in headless mode?
If not, could there be more information on how to properly use the API to control Audio2Face in headless mode? Thanks!