Plans for a Live Stream Device for MotionBuilder?

Are there any plans to create a plugin Device for MotionBuilder so we can live-stream Audio2Face results?

I do a lot of live, real-time performance capture and would love to have audio drive the lip sync on my characters live, just as it does in the viewport in Omniverse.

Faceware and Facecap for ARKit both have Devices that stream live blendshape values back into MotionBuilder, where they can be used to drive other characters and so on. I would imagine you should be able to do the same? For reference, a rough sketch of the kind of integration I mean is below.
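In the meantime, here is a minimal sketch of the sort of stopgap I've been playing with: a Python script run inside MotionBuilder that listens on a UDP socket for blendshape weights and writes them onto a head mesh's shape properties on each UI idle tick. To be clear, this is all hypothetical, not an actual Audio2Face streaming API (which is what I'm asking for): the packet format, the port, the mesh name "HeadMesh", and the 0-1 to 0-100 weight scaling are all my own assumptions.

```python
import socket
from pyfbsdk import FBSystem, FBFindModelByLabelName

UDP_PORT = 12030  # hypothetical port; match whatever the sender uses

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", UDP_PORT))
sock.setblocking(False)

# The mesh carrying the blendshapes; "HeadMesh" is a placeholder name.
head = FBFindModelByLabelName("HeadMesh")

def on_idle(control, event):
    """Drain any pending datagrams and push the weights onto the mesh."""
    if head is None:
        return
    try:
        while True:
            data, _ = sock.recvfrom(4096)
            # Assumed packet format: "jawOpen=0.42;mouthSmileL=0.10;..."
            for pair in data.decode("utf-8").split(";"):
                name, _, value = pair.partition("=")
                prop = head.PropertyList.Find(name)
                if prop is not None:
                    # MotionBuilder exposes shape channels as 0..100 percent
                    # properties, while most trackers send 0..1 weights.
                    prop.Data = float(value) * 100.0
    except socket.error:
        pass  # no more packets this tick

FBSystem().OnUIIdle.Add(on_idle)
```

A proper plugin Device (like the Faceware one) would of course do this natively with recording support and correct timing, which is why I'd love to see an official one.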

Cheers!