Audio2Face + Live Link + Pixel Streaming

I’m trying to set up the A2F Live Link plugin for Unreal 5.2 with Pixel Streaming, and I’m having trouble with the audio output. The blendshapes work great, and the audio output can be heard, but ONLY in the actual application; it’s not being streamed through Pixel Streaming.

My guess is that Pixel Streaming doesn’t take the SubmixListener into account, which is what the plugin uses, as specified in the docs. Is there any way to route the incoming audio output so it also gets streamed?


Have you tried checking Enable Audio Stream in the StreamLivelink node?

If you did and it’s still not working, can you share your stage?

Hi Ehsan! Thanks for the reply

Yes, the audio streaming works, because I can hear it in the Editor as well as in the Standalone Game. The problem comes from the Pixel Streaming plugin: it’s not capturing the audio from the A2F Live Link plugin correctly.

I’m facing the exact same problem. Running the application standalone streams the audio from A2F correctly, but opening the client (browser) through Pixel Streaming doesn’t let me hear the sound. While other sounds work on the client, the audio stream from A2F does not.


Here’s the answer from our engineering team:

This seems to be a very Unreal Engine specific problem. Can you share your detailed workflow? We will need all the steps and assets to reproduce the issue.


1.- I’m packaging a simple Unreal game with one level. The level contains a Metahuman character set up with Live Link as specified in the A2F docs, plus one camera and an ambient sound. I have enabled the Omniverse Audio2Face plugin and the Pixel Streaming plugin.

2.- I run A2F with the correct ports and settings (Audio Stream enabled).

3.- In the Unreal Editor, if I play an audio clip in Audio2Face, I can see and hear it correctly; both blendshapes and audio get streamed to Unreal.

4.- I try the packaged game with the same result: I can see and hear it correctly.

5.- The problem comes when I start Pixel Streaming. I have a simple setup as shown in the Unreal docs: I start the signalling server and connect to the default frontend. I see my character and hear the ambient sound from the web client. When I play an audio clip in A2F, the blendshapes get streamed correctly, but the audio doesn’t get captured and shared by the Pixel Streaming plugin. I know the audio is reaching Unreal, because the packaged game (which I need to keep open for Pixel Streaming) plays it; it just doesn’t get streamed to the web client.

That’s it, I hope it’s clear enough. My guess is that the problem comes from the way the A2F plugin plays audio in Unreal (through the SubmixListener), which isn’t compatible with the Pixel Streaming plugin at all.

Thank you!

P.S.: I’m actually not using A2F Standalone, but the A2F microservice (ACE). I don’t think this is relevant, though, since both send the audio data to Unreal the same way.

I’m using A2F Standalone and facing the exact same problem, so it indeed doesn’t matter whether you use A2F standalone or through ACE.

Same issue. Streaming via Pixel Streaming loses the audio from the Audio2Face plugin, which is very sad.

This seems to be a known bug 4221217. Please keep an eye here for updates.

Where can I track this bug?

Any updates on this? I finally got everything working in my project, but there is no audio from Audio2Face through Pixel Streaming. Other sounds work, and I hear it in my project, but when I use the browser there is no audio.

This is an internal bug reporting system. Just checked and it’s in progress. Will update here when done.

Making sure I understand this correctly: currently, with UE Pixel Streaming and Audio2Face, there will be no audio? There’s no workaround? It’s not working for anyone, right?


I have the exact same problem - any info you can share on a possible workaround or maybe even the root cause?

From the sound of it, the situation is consistent across public A2F users. And if there were a workaround, I’m sure @Ehsan.HM would let us know and keep us posted. So until then, it’s probably best for us to wait for the bugfix in a future release.

I finally got a workaround and now have my virtual ChatGPT Metahuman pixel streaming :)

Right before I send the audio file to Audio2Face, I send the file location to UE, delay 0.5 seconds in UE, and play the audio using Runtime Audio Importer. Cheesy, but it works.
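A minimal Python sketch of that ordering. Everything here is an assumption: the JSON message shape, the 0.5 s delay value, and the `ws_send` / `a2f_push` callables are placeholders for whichever web-socket client and A2F push call you actually use — they are injected so the ordering logic can be exercised without a live UE or A2F instance.

```python
import json

def build_play_message(audio_path: str, delay_s: float = 0.5) -> str:
    # Hypothetical message the UE side listens for: on receipt, a Blueprint
    # waits delay_s seconds, then plays the file with Runtime Audio Importer.
    return json.dumps({"cmd": "play_audio", "path": audio_path, "delay": delay_s})

def send_to_ue_then_a2f(audio_path: str, ws_send, a2f_push):
    """Send the file location to Unreal FIRST, then push the audio to A2F.

    The UE-side delay roughly compensates for A2F processing time, so the
    locally played audio lands in sync with the streamed blendshapes.
    """
    ws_send(build_play_message(audio_path))  # UE starts its 0.5 s countdown
    a2f_push(audio_path)                     # blendshapes arrive ~in sync
```

Since Pixel Streaming does capture normal in-game audio, the clip played by Runtime Audio Importer inside UE reaches the browser even though the A2F SubmixListener audio does not.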


Would you please share more details? How do you use Runtime Audio Importer to receive and play the audio in UE?

I have the exact same problem. Have you solved it, or do you have any suggestions for avoiding it?

I mostly do everything in Python. I have a widget in UE above the Metahuman; the widget sends the input text through a web socket, then GPT, then Whisper. I send the mp3 to Audio2Face and, at the same time, send the file name through a web socket to UE.
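That loop can be sketched as one turn of a pipeline. All four callables below are placeholders (assumptions) standing in for the poster's actual GPT call, speech-synthesis step, A2F push, and UE web-socket send; injecting them keeps the wiring testable without any of those services running.

```python
def handle_user_text(text, ask_gpt, synthesize_speech, send_to_a2f, notify_ue):
    """One chat turn: text in -> GPT reply -> mp3 -> A2F + UE.

    ask_gpt(text) returns the reply string; synthesize_speech(reply) writes
    an mp3 and returns its path; notify_ue(path) tells Unreal over the
    web-socket which file to play locally (after its ~0.5 s delay), while
    send_to_a2f(path) pushes the same audio to Audio2Face for the
    lip-sync blendshapes.
    """
    reply = ask_gpt(text)
    mp3_path = synthesize_speech(reply)
    notify_ue(mp3_path)    # UE side plays it via Runtime Audio Importer
    send_to_a2f(mp3_path)  # A2F side drives the Metahuman face
    return reply
```

The key design point is the fan-out at the end: the same mp3 goes to both consumers, with UE handling playback (so Pixel Streaming captures it) and A2F handling only animation.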


Thank you very much! But where should this blueprint go so it runs when UE receives the Live Link stream?