Digital Mark

Hi,

During the SIGGRAPH 2021 talk “Graphics, AI, and the Emergence of Shared Worlds,” Richard Kerris mentioned that tools to create digital humans and a sample “digital Mark” are available today. How can one access these tools in Omniverse?

Additionally, he mentioned that later this week there would be a talk dedicated to digital humans in Omniverse. Can you please share which talk he was referring to?

Thank you

Hello @bartloms! Thanks for your interest!

I am getting confirmation from the dev team on the release of the digital human tools.

Here is the information that I have for the SIGGRAPH talks:

Realistic Digital Human Rendering with Omniverse RTX Renderer (On-demand)
Morteza Ramezanali, Eugene d’Eon, Blago Taskov, Simon Yuen
This session explores the intricacies, innovations, and future of bringing realistic digital humans to life. It covers Omniverse’s RTX Renderer, OmniSurface, and various shader and renderer components.
(Coming Soon to NVIDIA Share)

The OmniSurface shader is live and available now.
I do not have a timeline for muscles or Audio2Gesture.

Thank you for the links! None of these talks come up on the SIGGRAPH site, which is strange.

If I may, I’d like to ask a few more questions:

Is the Mark model going to be available as a sample?
Can you share more information about Audio2Gesture? Searching NVIDIA On-Demand brings up nothing on this topic.

Thank you

We intend to make Digital Mark available at some point and will announce when that happens. Audio2Gesture is a technology we are using and previewing at the moment. Similar to Audio2Face, it uses voice audio to drive body motion, so the character moves and behaves in the context of the conversation, as shown with Virtual Jensen at the April GTC.
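
For anyone curious what “voice audio driving body motion” could look like in code, here is a minimal, purely illustrative Python sketch. This is not NVIDIA’s Audio2Gesture implementation, and every function name here is hypothetical; a real system would use a learned model that outputs full-body joint motion, while this toy simply maps an audio energy envelope to a per-frame gesture-intensity weight:

```python
import numpy as np

def audio_energy_envelope(samples: np.ndarray, sample_rate: int,
                          frame_ms: float = 33.3) -> np.ndarray:
    """Compute a per-animation-frame RMS energy envelope from mono audio."""
    frame_len = int(sample_rate * frame_ms / 1000.0)
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def energy_to_gesture_weights(envelope: np.ndarray) -> np.ndarray:
    """Map normalized energy to a 0..1 gesture-intensity weight per frame.

    A production system like Audio2Gesture would instead run a trained
    network producing joint rotations; this is only a toy mapping.
    """
    e = envelope / (envelope.max() + 1e-8)
    # Smooth the curve so gestures do not flicker from frame to frame.
    kernel = np.ones(5) / 5.0
    return np.convolve(e, kernel, mode="same")

if __name__ == "__main__":
    # One second of synthetic, speech-like audio at 16 kHz (placeholder data).
    rng = np.random.default_rng(0)
    audio = rng.normal(0.0, 0.3, 16000) * np.sin(np.linspace(0, 8 * np.pi, 16000))
    env = audio_energy_envelope(audio, 16000)
    weights = energy_to_gesture_weights(env)
    print(f"{len(weights)} animation frames, peak gesture weight {weights.max():.2f}")
```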

We will have more details at the upcoming GTC, so stay tuned.

“Digital humans can see and listen to users to understand the meaning behind the words. They can then use their own tone of voice and body language to create lifelike human conversations.”

Hi @siyuen, is there any timeline for releasing the Audio2Gesture feature?

Hi @user160248! I appreciate your interest in Audio2Gesture! I do not have any dates for when it will be released, but I do know they have a few bugs they are trying to knock out. I expect to hear something about this very soon.

Thanks a lot!