FaceTime Attention Correction, when enabled, adjusts the gaze of your eyes so that it looks like you're making eye contact with the person you're FaceTiming even when you're looking at the iPhone's screen rather than the camera itself. It's a little difficult to explain, so we've made a hands-on video to demo how it works.
https://www.youtube.com/watch?v=J6HA27izRp8
When you're using FaceTime, you naturally look at the display to see the person you're talking to rather than at the camera, which makes it appear as though you're not maintaining eye contact.
As can be seen in the video, iOS 13 corrects this: when you're looking at the iPhone's screen, your gaze appears to be directed at the camera, allowing eye contact to be maintained while still letting you keep your eyes on the friend or family member you're FaceTiming with.
In iOS 12 and with FaceTime Attention Correction disabled, FaceTime looks like it always does – with no direct eye contact.
FaceTime Attention Correction appears to use an ARKit depth map captured through the front-facing TrueDepth camera to adjust where your eyes are looking for a more personal and natural connection with the person that you’re talking to.
Twitter users have discovered the slight eye warping that Apple is using to enable the feature, which can be seen when an object like the arm of a pair of glasses is placed over the eyes.
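Apple hasn't published how the feature works, but the TrueDepth data it likely relies on is exposed to developers through ARKit's face tracking. As a rough illustration only (not Apple's implementation), the sketch below shows how an app can read the per-frame face anchor, including an estimated look-at point and per-eye transforms, which is the kind of depth-based gaze information a warp like this would need.

```swift
import ARKit

// Minimal sketch: read gaze-related data from the TrueDepth camera via ARKit.
// This is NOT FaceTime's attention-correction code, which is not public; it only
// shows the face-tracking data an app can access on TrueDepth devices.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (e.g. iPhone X and later, 2018 iPad Pro).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Estimated point the user is looking at, in the face anchor's coordinate space.
            let lookAt = face.lookAtPoint
            // Per-eye transforms that a rendering pass could use to re-aim the eyes.
            let leftEyePosition = face.leftEyeTransform.columns.3
            let rightEyePosition = face.rightEyeTransform.columns.3
            print("lookAt: \(lookAt), leftEye: \(leftEyePosition), rightEye: \(rightEyePosition)")
        }
    }
}
```

The slight warping that Twitter users spotted around the glasses arm is consistent with an image-space adjustment applied over the eye region rather than a full 3D re-render of the face.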
You can access FaceTime Attention Correction on iPhone XS, iPhone XS Max, iPhone XR, and 2018 iPad Pro models running the third developer beta of iOS 13. It’s a setting that’s available in the FaceTime section of the Settings app.