How can we use real-time motion capture data (a BVH file) to visualize a human in Omniverse?

I want to visualize the motion capture data in Omniverse, and I also want to run Python code and display the values on screen. Is this possible in Omniverse? If so, is there any resource that shows how this could be implemented?


There is no direct BVH support in Omniverse at this time. You could load the BVH in a supported application such as Maya or 3ds Max and bring your data into Omniverse from there, for example by exporting it to USD through the Omniverse Connector for that application.
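Since Omniverse scripting is Python-based, another option is to read the BVH data yourself and drive a skeleton from script. As a starting point, here is a minimal sketch of a BVH parser in plain Python (no Omniverse APIs involved): it extracts the joint names from the `HIERARCHY` section and the per-frame channel values from the `MOTION` section. This is a simplified illustration, not a full BVH implementation — it ignores offsets, channel layouts, and nesting, which you would need for real retargeting.

```python
def parse_bvh(text):
    """Parse a BVH string into (joint_names, frame_time, frames).

    joint_names: names from ROOT/JOINT lines, in file order.
    frame_time:  seconds per frame from the "Frame Time:" line.
    frames:      one list of floats per motion frame.
    """
    joints, frames, frame_time = [], [], 0.0
    in_motion = False
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        if line.startswith(("ROOT", "JOINT")):
            # e.g. "JOINT Spine" -> joint name is the second token
            joints.append(line.split()[1])
        elif line == "MOTION":
            in_motion = True
        elif in_motion:
            if line.startswith("Frames:"):
                continue  # frame count is implied by len(frames)
            if line.startswith("Frame Time:"):
                frame_time = float(line.split(":")[1])
            else:
                # a row of channel values for one frame
                frames.append([float(v) for v in line.split()])
    return joints, frame_time, frames
```

From there, each frame's channel values could be applied to a character rig (for example by writing them as transform time samples on a USD skeleton), and the same values could be drawn on screen from your Python script.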