- Hi, this is Valentina Palladino for Ars Technica, and this is an iPhone X with its front-facing camera. [soft upbeat music] We generally think of these as selfie cameras, but Apple's newest iPhone uses this camera for much more. It houses something called the TrueDepth sensor array, a 3D camera that can recognize you and your facial expressions. There are a lot of little components in that notch above the display. They include a seven-megapixel camera, a proximity sensor, an ambient light sensor, a flood illuminator, an infrared camera, and a dot projector.

The dot projector projects over 30,000 infrared dots onto your face. A 3D map of your face is created and analyzed by the phone's processor. Apple trained a neural network with millions and millions of data points so that the phone can distinguish you from another person. Apple says there's only a one-in-one-million chance that a random person could get into your phone just by looking like you. TrueDepth works kind of like Microsoft's Kinect, but the difference is that it's portable, and advances in machine learning have made it far more powerful.

Apple has shown off two uses for this. The first is Face ID, which lets you unlock your phone just by looking at it. The second is Animoji, which maps your real facial expressions onto animated characters. But that's not all it can do. Apple made the technology available to app developers, and we're only just starting to see the things it could be used for. An app might be able to tell when you're getting flustered and change its behavior. You could find yourself playing an interactive game that reads your facial expressions so the characters in the game can respond to how you're feeling and acting. Hands-free interfaces could be developed for controlling an app based on your facial movements. You could chat live with someone over FaceTime or Skype wearing a fictional face. In certain situations, it could even be used to assist in medical diagnoses. We talked to an iOS developer who has made many popular games and apps using the TrueDepth sensor about how it works and what it could be used for in the future.

- I made an app called Rainbrow. It allows you to play a game using your eyebrows. You put your eyebrows up like you're surprised and the little character jumps up. You put your eyebrows down like you're mad and he jumps down. It's really cool because it's a special new way of interacting with a game, or interacting with your device in general. It almost feels like you're controlling it with your mind: when you're holding your phone, you're not tapping the screen, you're not moving it in any special way, but you can see it react to what you're doing. It's just like Animoji the first time people use it. It's just fun; something about it feels like the phone, the computer there, actually understands you.

There are a lot of cool applications of the technology. You could use it to determine how the user's feeling in a certain moment. If they're looking at some piece of content, you could figure out if they like that or not. You could also do it in reaction to certain events. Say they're playing some kind of game or interacting with something else; you might be able to offer some kind of help. If they look frustrated, you might offer a suggestion for how to solve the problem they might be facing. Looking far into the future, if this technology spreads to everyone, one of the big applications, I think, is accessibility.
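[Editor's note: for readers curious how an eyebrow-controlled game like Rainbrow might read the TrueDepth data, here is a minimal sketch using ARKit's face-tracking blend shapes. The class name, the 0.5 thresholds, and moveCharacter(_:) are assumptions for illustration; the ARKit types and blend-shape keys are real identifiers, but this is not Rainbrow's actual code.]

```swift
import ARKit

// A minimal sketch, assuming ARKit face tracking on a TrueDepth-equipped device.
// The class name, thresholds, and moveCharacter(_:) are hypothetical.
class BrowControl: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking only works on devices with the TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Blend shapes are coefficients from 0.0 to 1.0 describing how strongly
        // each facial feature is moved, driven by the depth map of your face.
        let browUp   = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        let browDown = face.blendShapes[.browDownLeft]?.floatValue ?? 0

        if browUp > 0.5 {           // eyebrows raised: character moves up
            moveCharacter(.up)
        } else if browDown > 0.5 {  // eyebrows furrowed: character moves down
            moveCharacter(.down)
        }
    }

    enum Direction { case up, down }
    func moveCharacter(_ direction: Direction) {
        // Game-specific: update the on-screen character.
    }
}
```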
People who might not have the fine motor skills needed to control current software might be able to use facial expressions to control their devices. Additionally, you could have other unique interfaces that are controlled with facial expressions. Things like games right now are just scratching the surface. They're just saying, here's a new input mechanism, let's see how we can make that fun. Beyond that, I think you could have apps that are controlled with your face. Maybe you can look at different things on the screen and then it'll react according to where you look. I saw a prototype recently where they created an app that allows you to look at different parts of the screen and press buttons by blinking. You'd look up to the top right or top left and blink, and it would press a button. I thought that was a really cool application of the technology.

- As with many new technologies, the TrueDepth camera raises some concerns that you should be aware of. Apple doesn't save any of your face data on its servers; it's all saved locally on your phone. Apple does allow developers to use some of that data, like facial expressions, and features like Face ID, but only if they get your explicit permission to do so first. In theory, this technology could be used for sentiment tracking when displaying ads. Apple tells developers that they cannot use the information for that. However, that might be hard to enforce across the many apps in the App Store.

- Privacy is extremely important. I think a camera being able to passively detect not only what you're doing but how you might be feeling or what you might be looking at is especially concerning to a lot of people. Right now, all apps have to go through app review, and you have to explicitly display a privacy policy to users that lets them know how you're using the camera data. I think it is something that's very concerning and that people should be aware of. If you were to use this to detect what someone's feeling while they're looking at an advertisement and correlate that with other data that they have, that's definitely a concern.

- With the new iPhones coming in the fall, we're gonna see a lot more of these cameras in people's hands. The more popular they become, the more app developers will do amazing things with them. But it's not just about Face ID. New interfaces, new games, new ways to communicate, and new privacy and security concerns are all gonna become a reality. [soft upbeat music]
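[Editor's note: the blink-to-press prototype described above could plausibly be built the same way as the eyebrow sketch earlier, swapping in ARKit's eye-blink blend shapes. The class name, the 0.8 threshold, and pressFocusedButton() are assumptions; the blend-shape keys are real ARKit identifiers. As the privacy discussion notes, any such app also has to declare camera use (NSCameraUsageDescription in Info.plist) and get the user's permission.]

```swift
import ARKit

// A minimal sketch of blink-to-press input, in the spirit of the prototype
// described above. The class name, threshold, and pressFocusedButton() are
// hypothetical. Apps using the TrueDepth camera must also include an
// NSCameraUsageDescription entry in Info.plist and obtain camera permission.
class BlinkInput: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let left  = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let right = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0

        // A deliberate blink of both eyes counts as a button press.
        if left > 0.8 && right > 0.8 {
            pressFocusedButton()
        }
    }

    func pressFocusedButton() {
        // App-specific: activate whichever control currently has focus.
    }
}
```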