Package Audio2Face app or render viewport in e.g. Flutter

Hi all! I’ve built a small plugin on top of Audio2Face that uses Riva TTS to generate a talking avatar in real time. However, I’m having trouble understanding how I could use this for production or render the output in other formats. Ideally, I’d like to render the viewport fullscreen in an app built with Flutter (still figuring out the steps for that) and send API calls to it directly.
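
For context, something along these lines is what I mean by “send API calls directly”: a tiny HTTP endpoint the Flutter app could POST text to. This is only a generic Python illustration, not my actual extension code; the /speak route and the handle_text() hook are placeholders.

```python
# Minimal sketch of an HTTP entry point a Flutter app could call.
# The /speak route and handle_text() are hypothetical placeholders;
# in the real extension the text would be handed to the Riva TTS step.
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


def handle_text(text: str) -> None:
    # Placeholder: forward the text to Riva TTS / Audio2Face here.
    print(f"Received text for TTS: {text}")


class SpeakHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/speak":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        handle_text(payload.get("text", ""))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "ok"}')


def start_server(port: int = 8011) -> ThreadingHTTPServer:
    """Run the endpoint on a background thread so it doesn't block the host app."""
    server = ThreadingHTTPServer(("0.0.0.0", port), SpeakHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```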
Right now I have an Audio2Face extension that takes text input via an API call, runs TTS through Riva, and streams the resulting audio to A2F, which works perfectly. Is there any way for me to package this workflow, or at least to get the output of just the viewport in some format without having to stream it over WebRTC? Or is the only option to keep A2F running as an instance in the background and communicate with it?
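
For example, if there is a supported way to grab frames from the viewport, I imagine something roughly like the sketch below. This assumes a recent Kit build where omni.kit.viewport.utility is available; the output path and file naming are just placeholders, and I don’t know whether this is the intended approach.

```python
# Sketch: write the current viewport image to numbered PNG files from inside
# a Kit extension. Assumes omni.kit.viewport.utility is available in this
# Kit build; output_dir and the frame naming scheme are placeholders.
from omni.kit.viewport.utility import get_active_viewport, capture_viewport_to_file

_frame_index = 0


def capture_next_frame(output_dir: str = "/tmp/a2f_frames") -> None:
    """Capture the active viewport to disk as a numbered PNG."""
    global _frame_index
    viewport = get_active_viewport()
    if viewport is None:
        return
    capture_viewport_to_file(viewport, f"{output_dir}/frame_{_frame_index:06d}.png")
    _frame_index += 1
```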


Hello @martin.kjellberg! Wow! That’s incredible! I can’t wait to see your extension in action. I shared this post with the Audio2Face team for more help!
