I followed the character transfer tutorial on the NVIDIA Omniverse channel, but it does not work the same way for me. The material breaks, and the driver mesh does not drive my mesh. Here is a link to a video showing the problem: Nvidia omniverse driver mesh problem - YouTube
Hi @MrTegs, what if you just use the default parameters instead of 100? Can you try the default values? They work well for most cases.
Hi @MrTegs, this is very strange.
You selected mark before adding the A2F pipeline, but in the UI a different mesh is selected (Root/Pilot_Head_02/Mesh).
If you’re okay with sharing the face file, we can test it here too to see what causes the issue.
Here’s a link.
Hi @MrTegs, so your face mesh has the face skin, eyeball, and mouth meshes all combined into one mesh. If you separate this mesh into individual pieces and apply the character transfer to just the main face skin mesh, it works, as shown in the following image.
To separate the mesh, you can use your preferred DCC app and export to .usd, or you can use Audio2Face’s Mesh - Mesh Separate function.
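For anyone curious what a mesh-separate operation actually does under the hood: conceptually, it pulls out the faces belonging to one part (e.g. the face skin) and reindexes their vertices into a compact submesh. Here is a minimal pure-Python sketch of that idea; it is an illustration only, not Audio2Face’s actual implementation, and the function name `extract_submesh` is made up for this example.

```python
def extract_submesh(vertices, faces, face_ids):
    """Keep only the faces in face_ids and reindex their vertices.

    vertices: list of (x, y, z) tuples
    faces:    list of tuples of vertex indices
    face_ids: indices into `faces` to keep (e.g. the face-skin part)
    """
    kept_faces = [faces[i] for i in face_ids]
    remap = {}          # old vertex index -> new compact index
    new_vertices = []
    for face in kept_faces:
        for v in face:
            if v not in remap:
                remap[v] = len(new_vertices)
                new_vertices.append(vertices[v])
    new_faces = [tuple(remap[v] for v in face) for face in kept_faces]
    return new_vertices, new_faces

# Two triangles sharing an edge; keep only the second one.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
sub_verts, sub_faces = extract_submesh(verts, tris, [1])
print(sub_verts)  # [(0, 0, 0), (1, 1, 0), (0, 1, 0)]
print(sub_faces)  # [(0, 1, 2)]
```

This is also why materials and UVs can break after separating (as discussed further down the thread): any per-face or per-vertex data (UVs, material bindings) has to be carried over and reindexed along with the geometry.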
THANK YOU. Now I can animate it. But I can’t find a tutorial on transferring the animation to Unreal Engine, and I can’t set up blendshape conversion with it. Please show me if you can.
Hi, the Audio2Face to Unreal Engine tutorial is here.
And you can find all the Audio2Face tutorials in the playlist there. Hope it helps!
This only works for MetaHuman characters. I guess the only workaround is to use MetaHuman.
Hi,
I have the same problem. At which stage do I do the mesh separate?
Hi @qazs, as soon as you bring in your mesh, apply Mesh Separate, then work only with the face skin mesh.
If you want a clean preparation, you can also use another DCC app (e.g. Maya, Blender) to separate the mesh and bring in only the face skin mesh.
Thanks, I managed to get a clean material using the mesh separate method. However, I couldn’t achieve the realistic material result that I got from the original MetaHuman.
The model on the left is transferred correctly; the one on the right is the original prop.
I was looking at this video: AI-Powered Facial Animation with Omniverse Audio2Face - YouTube
How did he retain the original material?
Hi again, yeah, depending on how the original UVs and textures are configured, mesh separate might not give what you want in terms of materials. Doesn’t re-assigning or re-configuring the materials resolve your issue?
Our demo asset is not a MetaHuman; we made it internally.
I see, I thought it was from MetaHuman. I’m not sure how to configure materials…
It would be good if you could create a tutorial on fixing the material issue, so we can do everything in OV without exporting back to UE again.
I bet you have already seen this topic, but there are some valuable videos for you:
Especially, you should check the video from this post:
A lot of progress here :) Apr 23
Thanks for the suggestion @qazs. In fact, the mesh separate function will be updated in the next release, and this UV/texture problem will be addressed. Please stay tuned.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.