I can export a blendshape animation JSON file manually using the Audio2Face A2F Data Conversion tab, but we need a backend service that exports the JSON file automatically. Is there any method or code that can do that?
Hi @zobnechopin1, welcome to Audio2Face.
We have this batch process function in A2F (once A2F is back on the Omniverse Launcher soon).
You can batch process many audio files and output .json files. Please have a look.
Thanks for your solution, yesol! It's a viable way to handle batch export, but it still requires clicking a button in the A2F UI. I want to achieve this with a Python script, like 'test_client.py' in the 'streaming_server' folder.
Hi again, thanks for the feedback!
At the moment we only support a UI-based batch process, and a script-based solution is not currently on our roadmap. Sorry about that.
Got it! Thanks for your reply.
@zobnechopin1, what are your use cases? We'd love to know more.
Technically, you can create scripts to automate this. But we don't have a headless mode at the moment, if that is what you are looking for. Let us know more details. Thanks.
My usage scenario: load a specified (or any) head model with blendshape parameters, let the user upload a piece of audio or text, drive it through A2F, automatically export the blendshape data, and feed it into Unreal for rendering. So I'm going to build UI-based automation scripts to offer this as a service, but there is not much documentation or code to refer to. This technology is very good, but without support for secondary development it can only serve as a demonstration and has limited industrial value.
You can easily extend it with Python. Most of the interface and some of the features are Python-based. We don't have enough documentation around this at the moment, but feel free to poke around. We have customers who have successfully built on top of the app.
You can do this through the live sync API from other engines like Unreal, as you mention. You do have to decide how to send the data over; we offer different streaming protocol options for that. Some people use simple TCP/IP to stream data out to their own apps for live ARKit-like sync.
Some applications are in the middle of integrating this. Stay tuned for that. In the meantime if you have specific questions after you start poking around the code, feel free to ask.
Thanks for your help and support. I have started studying the source code, and once the program is developed, I will make a video to help other developers.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.