I have watched all the videos I can find on YouTube, and still can’t get this working a year later. I think if you explain this to me, I can make a video that explains this better for Character Creator users.
Thanks
Audio2Face should work on an asset from any other software, as long as the character meets the requirements, e.g. the upper teeth are separate from the lower teeth, the skin is one connected mesh, etc.
Please take a look at this tutorial and let us know if you have any questions: Audio2Face Setting up a Custom Character - YouTube
I don’t get the same results as the videos I have watched. The teeth and jaw end up outside the character. I just get stuck and don’t know how to continue.
Do the teeth have transform values? A2F expects all meshes to have no transform values on them.
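In practice "no transform values" means every mesh should carry an identity transform: translate and rotate at (0, 0, 0) and scale at (1, 1, 1), with any offsets frozen/baked into the geometry before export. A minimal sketch of that check, with made-up mesh names and values purely for illustration (in your DCC or in the exported USD file the transforms would be read via its own API):

```python
# Hypothetical sketch: flag meshes whose transforms deviate from identity.
# Audio2Face expects the head, teeth, and eye meshes to have no transform
# values, so any leftover offsets should be frozen before export.
# The mesh names and numbers below are invented for illustration only.

def needs_freezing(translate, rotate, scale, tol=1e-6):
    """Return True if this transform is not the identity transform."""
    return (
        any(abs(v) > tol for v in translate)
        or any(abs(v) > tol for v in rotate)
        or any(abs(v - 1.0) > tol for v in scale)
    )

meshes = {
    "head":        ((0, 0, 0),      (0, 0, 0), (1, 1, 1)),
    "upper_teeth": ((0, 0, 0),      (0, 0, 0), (1, 1, 1)),
    "lower_teeth": ((0, 1.2, -0.3), (0, 0, 0), (1, 1, 1)),  # leftover offset
}

for name, (t, r, s) in meshes.items():
    if needs_freezing(t, r, s):
        print(f"{name}: freeze transforms before sending to Audio2Face")
```

This is only a way to reason about the requirement; the actual fix is done in your modeling tool (e.g. "Freeze Transformations" in Maya or "Apply All Transforms" in Blender) before exporting the character.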
This time I have the teeth in the right place. I am at the point of creating A2F Pipeline. I click it, and I get this prompt.
I don’t see where to put in the left and right Eye XForms, or the Lower Denture XForms.
I set the Left and Right Mesh and the Lower Denture Mesh. Do I say ‘Yes, Attach’ here?
I clicked ‘Yes, Attach’, and it worked. Now I just have to learn how to export it.
The only thing is, the open-mouth version doesn’t move; only Mark and my character do.
Is the open-mouth character needed?
Glad you got this working. The green mesh is not supposed to move. It’s only used at the character transfer stage, to show Audio2Face which points on that model correspond to which points on your model.
You must select the Full Face (not the Regular Audio Player) to see the eye and teeth connections.