Does CloudXR support streaming screen touch events for iOS devices?

In my app, virtual joysticks should appear on devices with a touch screen, but when the app is launched through CloudXR they do not. It looks like my iPhone is detected as just a display device.
I would also like to know whether the iOS device collects data from the front camera, and if so, whether it is possible to get that data as well.

The default ARKit/ARCore samples do not currently pass any platform-agnostic ‘gestures’ to the server to interpret. You’ll need to forward the relevant events yourself as they flow through your client. The ARCore sample shows how to pass along a ‘tap’ as a rudimentary starting point:

    // Build a CloudXR touch input event for a finger-up at screen position (x, y)
    cxrInputEvent input;
    input.type = cxrInputEventType_Touch;
    input.event.touchEvent.type = cxrTouchEventType_FINGERUP;
    input.event.touchEvent.x = x;
    input.event.touchEvent.y = y;
    // Send the event to the server through the receiver handle
    cxrSendInputEvent(cloudxr_receiver_, &input);

But how you translate touch input for your application is up to you. You can also use generic input events and define your own data structure.