Apple has developed a new FaceTime feature, known as eye contact correction, that adjusts your gaze in real time during a call to make it seem like you're looking directly into the camera.
As most people already know, during video calls we typically look at the screen to see the person we're talking to, not directly at the camera. This means our contacts see us looking down, which many say makes it look like we're avoiding eye contact.
But with the release of iOS 13, all of this is going to change, as iPhones will come with a new feature that corrects your gaze to make it seem like you're actually looking into the camera.
Feature coming in iOS 13
Included in iOS 13 beta 3, FaceTime Attention Correction is based on ARKit, according to Dave Schukin, who has also published a demo of the feature (embedded below).
“How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly. Notice the warping of the line across both the eyes and nose,” he explains in his demo.
The warping, which you can also see in the photo attached to the article, makes your eyes seem like they are directed at the camera, and not down to the lower part of the screen.
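The idea Schukin describes (using the face's depth map and position to warp the eye region toward the camera axis) can be illustrated with a toy sketch. The Python snippet below is purely conceptual and not Apple's implementation: it shifts a rectangular "eye" region upward in a synthetic frame, where a real system would drive a smooth, depth-aware mesh warp from ARKit's face-tracking data. The function name, the box coordinates, and the hard rectangular shift are all illustrative assumptions.

```python
import numpy as np

def warp_eye_region(img, eye_box, dy):
    """Toy gaze correction: shift the pixels inside eye_box up by dy rows.

    A real implementation would use the depth map and face position from
    ARKit to compute a smooth, depth-aware warp rather than this hard
    rectangular shift (all names and values here are illustrative).
    """
    y0, y1, x0, x1 = eye_box
    out = img.copy()
    # Move the eye region toward the top of the frame, i.e. toward where
    # the front camera sits above the screen.
    out[y0 - dy:y1 - dy, x0:x1] = img[y0:y1, x0:x1]
    return out

# Synthetic 10x10 "frame" with a dark 2x2 "eye" positioned low in the frame,
# as if the user were looking down at the screen.
frame = np.zeros((10, 10))
frame[6:8, 3:5] = 1.0

# Shift the eye region up by 2 rows to simulate looking at the camera.
corrected = warp_eye_region(frame, eye_box=(6, 8, 3, 5), dy=2)
```

The visible warping of straight lines across the eyes and nose in Schukin's demo is the real-world analogue of this shift: pixels near the eyes are displaced, so anything passing through that region bends with them.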
While the feature appears to work pretty smoothly, there's one big limitation: because it relies on ARKit 3 APIs, it will only be available on the latest generation of iPhones, namely the iPhone XS, iPhone XS Max, and iPhone XR. The iPhone X, for instance, might not support it.
iOS 13, currently in beta, is projected to go live in September for all supported iPhone models, though, as we can see, only some of them will get the latest and greatest.
https://news.softpedia.com/news/apple-wants-you-to-make-eye-contact-when-using-facetime-526624.shtml