Live Streaming through A2F Controller

To animate a Metahuman in UnrealEngine, I have successfully implemented the following flow.

[Image: current flow, with Audio2Face connecting directly to UnrealEngine]

In the above solution, Audio2Face establishes a TCP connection to UnrealEngine (default ports 12030/12031), which means A2F must be deployed on the same LAN as UnrealEngine. That is not ideal for container-based deployments such as K8s.

Hence I am trying to upgrade the flow to be friendlier to cloud-based deployment, that is, UnrealEngine runs outside the cloud, so A2F cannot initiate a TCP connection to it on its own.

For this reason, I should use A2F Controller instead. Here is the architecture diagram from the NVIDIA documentation.
[Image: A2F Controller architecture diagram from the NVIDIA documentation]

Can we have a setup like the one below?

  • The diagram draws an arrow from A2F Controller to UnrealEngine for consuming the animdata and audio, but in fact that TCP connection is established by UnrealEngine, which connects to A2F Controller to subscribe to the data.
  • The audio producer is a separate application from UnrealEngine.
  • There could be multiple UnrealEngine instances sharing the same A2F Controller and A2F, so each UnrealEngine instance is assigned its own stream. How can we identify the individual UnrealEngine instances when producing the audio stream to A2F Controller? (A rough sketch of what I have in mind follows at the end of this post.)
  • The Live Link plugin cannot be used in this case, so I should write my own plugin to consume the animdata in UnrealEngine, right?

[Image: proposed setup]
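
To make the third bullet concrete, here is a rough sketch of the identification scheme I have in mind: each UnrealEngine instance registers with the controller and gets its own stream ID, the audio producer stamps every audio chunk with that same ID, and the returned animdata is routed back by it. This is plain Python with hypothetical names (`StreamRegistry`, `AudioChunk`, `AnimDataFrame`, `stream_id`); the real field names and transport are whatever the A2F Controller gRPC API actually defines.

```python
# Illustrative sketch only (not NVIDIA's actual A2F Controller gRPC schema):
# all names here (StreamRegistry, AudioChunk, AnimDataFrame, stream_id) are
# hypothetical and just show the per-instance correlation-key idea.
from dataclasses import dataclass
from typing import Callable, Dict
from uuid import uuid4


@dataclass
class AudioChunk:
    # Every audio chunk the producer pushes carries the stream_id of the
    # UnrealEngine instance whose avatar it should drive.
    stream_id: str
    sample_rate: int
    pcm16: bytes


@dataclass
class AnimDataFrame:
    # Animdata coming back from A2F is tagged with the same stream_id,
    # so it can be routed to the matching UnrealEngine subscriber.
    stream_id: str
    blendshape_weights: Dict[str, float]


class StreamRegistry:
    """Maps each UnrealEngine instance to a unique stream ID."""

    def __init__(self) -> None:
        self._by_instance: Dict[str, str] = {}

    def register_unreal_instance(self, instance_name: str) -> str:
        # Called once when a UE instance subscribes to the controller.
        return self._by_instance.setdefault(instance_name, str(uuid4()))

    def stream_for(self, instance_name: str) -> str:
        return self._by_instance[instance_name]


def produce_audio(registry: StreamRegistry, instance_name: str,
                  pcm16: bytes, sample_rate: int = 16000) -> AudioChunk:
    # The audio producer (a separate app from UE) stamps each chunk with the
    # stream_id of the UE instance that should receive the resulting animdata.
    return AudioChunk(registry.stream_for(instance_name), sample_rate, pcm16)


def route_animdata(frame: AnimDataFrame,
                   subscribers: Dict[str, Callable[[AnimDataFrame], None]]) -> None:
    # Controller-side view: deliver the frame to whichever UE instance
    # subscribed with that stream_id.
    handler = subscribers.get(frame.stream_id)
    if handler is not None:
        handler(frame)


if __name__ == "__main__":
    registry = StreamRegistry()
    sid = registry.register_unreal_instance("ue-instance-01")

    subscribers = {sid: lambda f: print("deliver to ue-instance-01:", f.blendshape_weights)}

    chunk = produce_audio(registry, "ue-instance-01", b"\x00\x00" * 160)
    # ... chunk would be pushed to A2F Controller over gRPC here ...
    route_animdata(AnimDataFrame(chunk.stream_id, {"JawOpen": 0.42}), subscribers)
```

If this matches how A2F Controller expects streams to be identified, then the audio producer and each UnrealEngine instance only need to agree on the stream ID out of band. Please correct me if the controller uses a different mechanism for this.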