Enabling / disabling WebRTC streaming results in high CPU load

We have an application that creates and manages Kit processes (KitAgent)…

The KitAgent starts a new process and sets “/app/runLoops/main/rateLimitFrequency” to 1 to bring up a pooled process that can be used later and that produces no CPU or GPU load…
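For reference, this is roughly how the pooled process is brought up (a minimal sketch; the executable name and app file are placeholders, and the `--/path=value` form is the usual Kit command-line settings override as far as I know):

```python
import subprocess

# Sketch of the KitAgent launching a pooled Kit process.
# "kit" and "my_app.kit" are placeholders for our actual paths.
pooled_proc = subprocess.Popen([
    "kit",
    "my_app.kit",
    # Throttle the main run loop to 1 Hz so the idle pooled
    # process produces (near) zero CPU/GPU load until needed.
    "--/app/runLoops/main/rateLimitFrequency=1",
])
```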


So far so good…

Next we load a stage (and set “/app/runLoops/main/rateLimitFrequency” to 120)…

Now we close the stage (and set “/app/runLoops/main/rateLimitFrequency” back to 1)…
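For context, the rate-limit changes around stage load/close look roughly like this (a sketch using carb.settings and omni.usd; the stage path is a placeholder):

```python
import carb.settings
import omni.usd

settings = carb.settings.get_settings()
usd_context = omni.usd.get_context()

# Load a stage and let the main run loop tick at full speed.
settings.set_int("/app/runLoops/main/rateLimitFrequency", 120)
usd_context.open_stage("omniverse://server/path/to/stage.usd")  # placeholder

# ... work with the stage ...

# Close the stage and throttle the loop again so the pooled
# process can drop back to (near) zero CPU/GPU load.
usd_context.close_stage()
settings.set_int("/app/runLoops/main/rateLimitFrequency", 1)
```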


It takes some time, but CPU and GPU load come down to 0% (as expected)

Next we enable WebRTC streaming (and set “/app/runLoops/main/rateLimitFrequency” to 60 to save some resources)…

Now we disable WebRTC streaming by disabling the omni.kit.streamsdk.plugins extension and set “/app/runLoops/main/rateLimitFrequency” back to 1…
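For completeness, we toggle streaming at runtime roughly like this (a sketch; `set_extension_enabled_immediate` is the extension-manager call I believe applies here, and the exact extension id may differ per Kit version):

```python
import carb.settings
import omni.kit.app

settings = carb.settings.get_settings()
ext_manager = omni.kit.app.get_app().get_extension_manager()

# Enable WebRTC streaming, capping the loop at 60 Hz to save resources.
settings.set_int("/app/runLoops/main/rateLimitFrequency", 60)
ext_manager.set_extension_enabled_immediate("omni.kit.streamsdk.plugins", True)

# Later: disable streaming again and throttle the loop back down.
ext_manager.set_extension_enabled_immediate("omni.kit.streamsdk.plugins", False)
settings.set_int("/app/runLoops/main/rateLimitFrequency", 1)
```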

As you can see, the CPU load remains very high, even though the stage is unloaded and the streaming extensions are disabled again. Without enabling WebRTC, the load comes back down to 0%, but as soon as streaming has been enabled once, the CPU load never comes down again…

…I think this is a bug - right?

Thanks

Carl