Is it possible in the AR/Gaze SDK to rotate the eyes by angular offsets I supply, rather than having the eyes' gaze always rotated toward the camera (as seems to be the case in the Gaze Redirection sample)?
In the sample app, the user's eyes are transformed so that they seem to be looking at the remote party even if they are not.
What I want instead is to rotate the gaze vector, so that if the user is looking at the remote party's eyes on screen, the remote party will perceive eye contact, but if they're looking elsewhere, eye contact won't be perceived. This is of course what happens in real-world conversations: you can tell when someone is looking at your eyes and when they are not, and this guides the conversation.
In my application, I know the physical relationship between the camera, the user's eyes, and the location of the remote party's eyes on screen, so I can compute the angular offsets needed to rotate the gaze vector correctly, rather than always showing the eyes as if they are looking at the remote party even when they are not.
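To make the geometry concrete, here is a minimal sketch of how those angular offsets could be computed from known positions. All positions and the coordinate frame are hypothetical assumptions for illustration, not values or APIs from the SDK: the offset is just the angle between the "eyes toward camera" vector and the "eyes toward remote party's on-screen eyes" vector.

```python
import numpy as np

def angle_between(v1, v2):
    """Angle in radians between two 3-D vectors."""
    v1 = v1 / np.linalg.norm(v1)
    v2 = v2 / np.linalg.norm(v2)
    return np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0))

# Hypothetical positions in a camera-centred frame, in metres (assumed layout).
camera_pos = np.array([0.0, 0.0, 0.0])                 # webcam at origin
user_eyes = np.array([0.0, -0.05, 0.60])               # user ~60 cm from camera
remote_eyes_on_screen = np.array([0.0, -0.15, 0.02])   # remote eyes rendered below the camera

# The sample redirects gaze along to_camera; the desired reference is to_remote.
to_camera = camera_pos - user_eyes
to_remote = remote_eyes_on_screen - user_eyes

# The correction needed is the angle between those two directions.
offset_rad = angle_between(to_camera, to_remote)
print(f"angular offset: {np.degrees(offset_rad):.1f} degrees")
```

With this offset in hand, gaze that lands on the remote party's on-screen eyes would map to apparent eye contact, while gaze anywhere else would be rendered as looking away by the corresponding angle.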
Is it possible to rotate apparent eye orientation to achieve this using the SDK?
Hi there! I understand what you’re looking for. I will reach out to the Maxine engineering team and get back to you when I have a response!
Terrific – very much appreciated Mark.
This is exactly what we need as well.
As soon as I know, you’ll know. I normally get a response in a day or so. If I don’t get anything back within 24 hours, I’ll ping again and keep you posted.
Just an update for you - engineering is aware but hasn’t responded with the information requested. I’ll get back on Monday, 24OCT22, with any additional updates. Thank you for your patience and have a great weekend!
Thanks again Mark. Looking forward to hearing if there’s a way to get the behavior we want.
Hi folks, I didn’t want to spam anyone. Engineering has been tied up. I’ll ping the Maxine product manager and see if I can get some movement.
Thanks Mark, very much appreciate the effort.
Hi folks, I haven’t heard back. I have put it on the radar. I will get back as soon as I hear any commentary from engineering. I apologize for not getting back to you more quickly!
Bumping. This would be enormously useful. There is good precedent for it here: GazeDirector: Fully Articulated Eye Gaze Redirection in Video (EG'18) - YouTube
I’ll follow up with engineering, been quite busy over the past few months and I appreciate the bumps.
@fusterclucks @Plumertle @cameronscwhite @pakoconk ---- Tagging all of you for visibility on this reply.
Could you elaborate on your use case in this thread? I bumped an email thread to the top of inboxes and engineering requires information about use cases.
First, thanks for your continued help with this.
For us, the application is gaze-correct videoconferencing. By "correct", I mean that a remote participant sees the local participant's gaze accurately: if the local participant is looking at the remote party's eyes, the remote party perceives eye contact; if the gaze is directed elsewhere, say at the desk, the remote party can perceive that instead.
This is important because gaze is a key social cue that helps regulate conversation. If you see that I’m looking at the desk in front of me, to use the example above, you understand that my attention is directed there (say to the widget we’re working on).
Maxine's current implementation is actually worse than nothing at all for the general case (understanding where the remote party's gaze is directed), since it falsely represents the gaze vector in situations where gaze is not actually directed at the other party's eyes.
Hope that helps.
I’ve provided that information to our engineering team, thank you!
I'm sure you've seen some of the recent reactions online to this feature, such as this one in Ars Technica, that highlight the issue at hand. I hope the engineering team decides to act on the feedback.