
While Siri is the usual public face of Apple's artificial intelligence efforts, a new Backchannel story points out that much of the behavior iPhone users notice on their devices is also powered by the company's growing push to improve its AI. When the iPhone guesses which apps you'll want next as you swipe across your screen, that's Apple's AI at work, and the device also stitches together pieces of information from several sources to build a more complete picture. Machine learning helps Apple devices do everything from extending battery life between charges to identifying a caller who isn't in your contact list by cross-referencing your email.

It's also what lets the iPad Pro tell the difference between the Apple Pencil's tip and the palm dragged alongside it while a user draws, accepting the Pencil's input while rejecting the palm's. "If this doesn't work rock solid, this is not a good piece of paper for me to write on anymore — and Pencil is not a good product," said senior vice president of software engineering Craig Federighi. "If you love your Pencil, thank machine learning."
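Neither the story nor Apple explains how that touch classifier actually works. As a purely hypothetical sketch of the general shape of the problem, a classifier along these lines would score each contact from a few simple features and reject likely palms; the struct, the features, and every threshold below are invented for illustration.

```swift
// Hypothetical sketch only: Apple has not published its palm-rejection
// model. This toy shows the shape of the task: classify each touch from
// a few features, accept likely Pencil contacts, reject likely palms.
struct TouchSample {
    let contactRadius: Double    // mm; a stylus tip is far smaller than a palm
    let radiusVariance: Double   // a palm's contact patch shifts shape as it drags
}

enum TouchKind { case pencil, palm }

func classify(_ touch: TouchSample) -> TouchKind {
    // A shipping system would use a model trained on real touch traces;
    // these thresholds are made up purely to illustrate the idea.
    if touch.contactRadius < 4.0 && touch.radiusVariance < 0.5 {
        return .pencil
    }
    return .palm
}

// Example: a broad, shifting contact patch gets rejected as a palm.
let palm = TouchSample(contactRadius: 22.0, radiusVariance: 3.1)
assert(classify(palm) == .palm)
```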
The push to improve its AI has changed Apple's product design in other ways, too, with the speech-recognition needs of the Siri team even influencing fundamental aspects of the iPhone's hardware. "It's not just the silicon," Federighi said. "It's how many microphones we put on the device, where we place the microphones. How we tune the hardware and those mics and the software stack that does the audio processing. It's all of those pieces in concert. It's an incredible advantage versus those who have to build some software and then just see what happens."

The ability to mold its products to fit the needs of its software gives Apple an edge over competitors who only write software. But because the company makes a point of protecting user information and has no search engine's mass of data to draw on, it has to keep most of its AI local to the device. Starting with iOS 10, Apple will use a technique called Differential Privacy to collect data from a large pool of users without capturing anything that identifies an individual, though most of the iPhone's practical machine learning enhancements still live on the device itself. The company said an iPhone draws on app usage, interactions with other people, neural net processing, a speech modeler and "natural language event modeling" to reach its conclusions and offer suggestions, all of which takes up an average of 200 MB of space on the device. "It's a compact, but quite thorough knowledge base, with hundreds of thousands of locations and entities. We localize it because we know where you are," Federighi said. "And it's working continuously in the background."
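Apple hasn't said which Differential Privacy mechanism iOS 10 will actually use. A minimal sketch of randomized response, the textbook local differential privacy technique, shows the core idea: each device perturbs its own report before anything leaves the phone, and the server can still recover the aggregate statistic across a large pool of users. The function names here are invented for illustration.

```swift
// Illustrative sketch of local differential privacy via randomized
// response -- not Apple's actual mechanism, which it hasn't detailed.
// Any single report is deniable, yet the aggregate remains estimable.

/// Device side: with probability 1/2 report the truth, otherwise flip a coin.
func noisyReport(truth: Bool) -> Bool {
    return Bool.random() ? truth : Bool.random()
}

/// Server side: de-bias the aggregate.
/// P(report == true) = 0.5 * trueFraction + 0.25, so invert that line.
func estimateTrueFraction(reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2.0 * (observed - 0.25)
}

// Example: simulate 100,000 users, 30% of whom have the true attribute.
let reports = (0..<100_000).map { _ in
    noisyReport(truth: Double.random(in: 0..<1) < 0.3)
}
print(estimateTrueFraction(reports: reports))  // prints roughly 0.3
```

With these probabilities, either underlying truth explains a given report to within a factor of three, which makes the scheme differentially private with ε = ln 3; the cost is purely statistical noise, paid back by averaging over many users.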