I’m trying to use the JSON data with the MetaHuman rig, but found that the JSON values don’t match the USD values. Any advice or a mapping guide to reproduce the USD results? For example, lipSuck and pucker in the JSON don’t correspond to any animated curves in the USD file.
We now include 52 blendshapes matching ARKit; you can give that a try.
@yseol for vis. He is out of the office at the moment and will reply when he is back.
Hi @4omp,
I’m trying to use json data with metahuman rig, but found that json values doesn’t equal usd values
Can you please give more detail on what you mean by "usd values" or "animated curves in usd file"?
Which USD are you referring to? The USD from the Audio2Face export? Or did you somehow export a MetaHuman rig as a .usd file?
Hi, thank you for the response!
After struggling for some time, I found that I get the error "Error: Cannot find curve timesamples" when trying to export from the asset male_bs_arkit.usd to UE5.
The JSON exports fine, but the USD does not.
There are also some strange values in the JSON that don’t match the A2F preview (see the 2nd screenshot).
Hi @4omp, sorry for getting back late.
I’ve checked 2501_arkit.json and 2501_arkit.usd. They have some different values at the beginning of the animation, but the same output values after that. Depending on the temporal_smoothing value, they can differ slightly at the start of the animation because the solve depends on past frames.
Can you please explain how you got the MetaHuman face that you attached? Is it from the .json file? As far as I know, we cannot load the .json file into UE directly. Did you load the .json into Maya and then use an .fbx to load it into UE?
found that i have error: "Error: Cannot find curve timesamples " when trying to export from asset male_bs_arkit.usd to UE5.
I don’t clearly understand this part. Is this an error from A2F or from UE5?
This error happened in UE5, when I tried to import the USD as a face animation. It only happened when I used the export with the ARKit head from A2F.
I made a custom import based on a Maya script, so I have all the blendshapes, and they work correctly if I drive them manually (with sliders in UE5). But when I use the data from the A2F JSON file, some parts look very strange: for example, the brows always stick to the top, the mouth funnel and pucker look strange too, the mouth doesn’t close, the lips suck in, and so on.
I show all my steps in this Twitter thread, so if you have time you can look at those problems there.
Hi, I think I have a similar issue. Importing the generated .usd files into UE5 just does not seem to work.
The error message is the same: “LogOmniverseImporter: Error: Cannot find curve timesamples in…(file.usd)”
I’m using Audio2Face 2022.1.1 and UE 5.1.1.
My steps:
- Using the provided “male_bs_arkit.usd”, set up the blendshape solver. Visually it looks great.
- Export the .usd file
- In UE5, import a sample MetaHuman
- Using the UE5 connector, try to import a Facial Animation with the .usd file as the input and “Face_Archetype_Skeleton” as the skeleton. This throws the error.
I have attached the exported .usd file.
ue5test2.usd (153.1 KB)
In fact, we haven’t added the MetaHuman conversion part yet for the case of using "male_bs_arkit.usd" and exporting .usd.
We will add it in the next release. In the meantime, you will have to use "male_bs_46.usd" for importing the facial animation into MetaHuman.
Sorry about the inconvenience, we will address this soon!
@4omp wow, your work is amazing! So you have two issues?
- .usd import into UE5 not working (I answered above)
- some strange expressions when you use the .json file ← but how do you import the .json into UE5?
But when i use data from a2f json file, some parts looks very strange, for ex. brows always stick to the top, mouth funnel & pucker looks strange also, mouth not closing or sucking in lips and so on.
thank you!
I convert the JSON data to UDP packets and send them to UE5, frame by frame.
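A minimal sketch of that conversion step, in case it helps others debug the same pipeline. The field names (`facsNames`, `weightMat`, `exportFps`) follow the A2F JSON export format as I understand it, and the packet layout (one JSON dict per frame) is just my assumption about the UE5-side parser — adjust both to your actual setup.

```python
import json
import socket
import time

def stream_weights(json_path: str, host: str = "127.0.0.1", port: int = 9001) -> int:
    """Stream one UDP packet per animation frame; returns the frame count."""
    with open(json_path) as f:
        data = json.load(f)

    names = data["facsNames"]       # blendshape names, one per weight column
    frames = data["weightMat"]      # one row of weights per frame
    fps = data.get("exportFps", 30)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for row in frames:
        # One packet per frame: {name: weight} pairs, parsed on the UE5 side.
        payload = json.dumps(dict(zip(names, row))).encode("utf-8")
        sock.sendto(payload, (host, port))
        time.sleep(1.0 / fps)       # pace the stream at the export frame rate
    sock.close()
    return len(frames)
```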
@4omp, as I don’t know your conversion process, or how the converted values map to MH controls, it’s hard to see what causes the wrong expressions.
The JSON data with 46 shapes needs a conversion to be compatible with the MH controls.
If you use the ARKit blendshapes, you can try turning some poses on/off. For example, Mark usually has a brow-raise expression all the time, so it can make your MH have a brow raise all the time.
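For anyone doing this in a custom pipeline rather than via the solver UI, turning a pose off amounts to zeroing its weight before driving the MetaHuman. A small sketch, where the pose names are assumptions — use whatever names appear in your own export’s `facsNames` list:

```python
# Poses to suppress (e.g. a constant brow raise leaking from the actor).
# These ARKit-style names are assumptions -- match them to your export.
DISABLED_POSES = {"browInnerUp", "browOuterUpLeft", "browOuterUpRight"}

def filter_frame(names, weights, disabled=DISABLED_POSES):
    """Return the per-frame weights with the disabled poses forced to 0.0."""
    return [0.0 if n in disabled else w for n, w in zip(names, weights)]
```

Applying this per frame before sending the data to UE5 lets you test which pose is causing a stuck expression.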
This preset file is what we uploaded for the ARKit blendshapes for MH control.
omniverse://localhost/NVIDIA/Assets/Audio2Face/Samples/blendshape_solve/preset/preset_arkit_metahuman.json
If you load this file using the Load Preset button, you can test this preset.