Hi, I want to use Xsens mocap to live stream human motion to Isaac Sim, and I can think of two ways to do it. 1. Xsens supports live streaming to UE5, and UE5 has an Omniverse connector, but USD files don't support live sync of UE5 blueprints. 2. Use the Animation Graph's pose provider function, but the documentation only mentions that it can register an external input and doesn't explain how to use it. Are there any instructions on how to reach this goal?
Hi @hawkeex - I have followed up with the team internally. According to them, this capability will be available in a future release this year.
Hi, does that mean there is no official API yet to transmit pose data to each skeleton joint? I think I might be able to do this with the ROS Subscribe Joint State node in Action Graph, but I can't find a way to use the pose provider. Can you tell me its data structure, i.e. what the sequence of the data is?
Hi!
The ROS joint state subscriber specifically subscribes to ROS messages, so if you are not using ROS you won't get any use out of it. But you should be able to use the articulation controller if the skeleton is an articulation.
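In case it helps, here is a minimal sketch of what that could look like from a standalone Python script, assuming the skeleton is imported as an articulation. The prim path `/World/Humanoid` and the `receive_mocap_joint_positions()` receiver are placeholders for your own scene and your Xsens stream parsing; the target array must follow the articulation's DOF order.

```python
# Minimal sketch, not an official workflow: drive a skeleton's joints from an
# external stream by treating it as an articulation in standalone Isaac Sim.
# Assumptions: the rig is imported as an articulation at /World/Humanoid, and
# receive_mocap_joint_positions() is a hypothetical placeholder for your
# Xsens network receiver returning one joint angle (rad) per DOF.
import numpy as np
from omni.isaac.kit import SimulationApp

simulation_app = SimulationApp({"headless": False})

# These imports must come after SimulationApp is created.
from omni.isaac.core import World
from omni.isaac.core.articulations import Articulation
from omni.isaac.core.utils.types import ArticulationAction

world = World()
humanoid = world.scene.add(Articulation(prim_path="/World/Humanoid", name="humanoid"))
world.reset()  # initializes physics handles so the articulation can be driven

print(humanoid.dof_names)  # the targets below must follow this joint order

def receive_mocap_joint_positions(num_dofs: int) -> np.ndarray:
    # Placeholder: parse your Xsens live stream here; zeros keep the example runnable.
    return np.zeros(num_dofs)

while simulation_app.is_running():
    targets = receive_mocap_joint_positions(humanoid.num_dof)
    humanoid.apply_action(ArticulationAction(joint_positions=targets))
    world.step(render=True)

simulation_app.close()
```

Sending joint position targets through `apply_action` each frame lets the physics joint drives track the streamed pose, which is roughly what the articulation controller node does in Action Graph.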
Thanks! I'll try the articulation controller first!