Apple unveils iOS 12, doubles down on performance and adds new AR, Screen Time features

As expected, Apple took the wraps off iOS 12 today at its annual Worldwide Developers Conference, with Craig Federighi taking the stage to highlight what’s new in the company’s latest flagship operating system for the iPhone, iPad, and iPod touch. Federighi began by emphasizing that iOS 12 will be “doubling down on performance,” citing several metrics where iOS 12 improves performance, particularly on older devices. iOS 12 will be available on all of the same devices supported by iOS 11 — all the way back to the 2013 iPhone 5s — making it the largest device base ever supported by an Apple release.

Using the iPhone 6 Plus as a baseline, Federighi noted that apps will launch 40 percent faster, the keyboard will come up 50 percent faster, and the slide-to-camera gesture will open the camera up to 70 percent faster than in iOS 11. Work has also been done to optimize the system under heavy load, allowing the share sheet to come up twice as fast, and Federighi added that tight collaboration with the chip team has allowed the software engineers to optimize iOS across the entire range of A-series chips, detecting when more performance is required, ramping it up quickly to meet demand, and ramping it back down just as quickly to preserve battery life.

The changes in iOS 12, however, go beyond performance. Federighi introduced ARKit 2, a major update to Apple’s augmented reality framework that will make it easier to experience AR throughout the system. For iOS 12, Apple collaborated with Pixar to create a new file format called USDZ, a compact, single-file format optimized for sharing and developed as an open standard. USDZ files can be used throughout the system, shared through apps like Messages and Mail, and opened to place objects anywhere in a 3D AR environment. Abhay Parasnis, Executive VP & CTO of Adobe, also took the stage to explain how Adobe will be bringing native USDZ support to its entire suite of Creative Cloud apps, allowing designers and developers to use familiar apps like Photoshop to create AR content and bring it easily into apps via USDZ; Adobe apps will also support WYSIWYG editing in AR. USDZ assets can also be included in Apple News or in web pages, allowing 3D objects to be experienced interactively and even placed in real-world environments using the AR camera.
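For developers, viewing a USDZ asset in-app is expected to go through the existing Quick Look framework. Here’s a minimal sketch of what that might look like on iOS 12 — the file name “chair.usdz” is a placeholder for any USDZ asset bundled with the app:

```swift
import QuickLook
import UIKit

// Hypothetical view controller that presents a bundled USDZ file
// with AR Quick Look (a sketch, not Apple's sample code).
class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // Quick Look recognizes the .usdz extension and offers its
        // object and AR viewing modes automatically.
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        return url as QLPreviewItem
    }
}
```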

Federighi also introduced a new app, Measure, that will be included in iOS 12. It uses Apple’s sophisticated AR engine and precisely calibrated sensors to measure objects and dimensions, including lines along surfaces and even in three dimensions. Rectangles will be automatically detected and their measurements quickly displayed.

ARKit 2 will also improve face tracking and add more realistic rendering, 3D object detection, persistent experiences, and shared experiences; the last of these will deliver true multi-user AR, allowing users to see their own perspectives on a common virtual environment, such as a game. Martin Sanders, Director of Innovation at LEGO, took the stage to show how the new features can combine the physical and the digital for more creative play with LEGO kits. 3D object detection lets ARKit 2 recognize models and bring them to life, turning a LEGO set into an interactive adventure through an AR screen, dropping in LEGO characters, playing games, and completing challenges. Multi-user support also allows up to four friends to play in the same space, each on their own iOS device.
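Under the hood, shared experiences work by one device capturing its mapped world and sending it to peers. A hedged sketch of that flow, assuming the app supplies its own transport (Apple’s demos use Multipeer Connectivity; `sendToPeers` here is a placeholder):

```swift
import ARKit

// One device captures its ARWorldMap and sends it to peers.
func shareWorldMap(from session: ARSession,
                   sendToPeers: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map not available yet: \(String(describing: error))")
            return
        }
        if let data = try? NSKeyedArchiver.archivedData(
                withRootObject: map, requiringSecureCoding: true) {
            sendToPeers(data)
        }
    }
}

// A receiving device uses the map as the starting state of its own session,
// so both users see the same virtual content anchored to the same space.
func joinSharedSession(with data: Data, session: ARSession) {
    if let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                         from: data) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.initialWorldMap = map
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }
}
```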

The Photos app has also been improved in iOS 12 with better search features and a new “For You” tab similar in concept to those found in Apple Music and Apple News. The enhanced search now provides suggestions highlighting people, places, and categories that users may want to search for, and offers the ability to search for places by business name or by broad categories such as museums. Photos also now indexes over four million events by time and place, so users will be able to search for events by name or location and find photos that they took at those events. Search terms can also now be combined to search on multiple conditions.

The new “For You” tab will present information in a single place, such as memories, featured photos, “on this day” suggestions, effects suggestions, shared album activity, and more. Sharing suggestions will also appear, recommending who to share photos with, such as from an event or even just an evening out, along with a recommended set of photos to share. Users receiving shared photos will also see a “Share back” prompt suggesting photos from their own library to share back with the originator — great when several friends are snapping pictures while out together at an event.

A new Siri Shortcuts feature will allow third-party apps to go beyond SiriKit to expose quick actions through any user-defined custom phrase. Any iOS 12 app will now be able to allow users to add a custom Siri shortcut phrase that can be used to call up that app and display content based on a key phrase of their choice. Siri Suggestions will also allow third-party apps to make suggestions based on routine activities and locations, similar to what was done in iOS 9’s “Proactive Assistant” for internal apps like Calendar and Maps. So for example the Starbucks app could be opened automatically to remind users to pre-order their coffee each morning, or a workout app could suggest starting a workout when the user arrives at the gym.
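For apps, exposing a quick action is expected to build on the existing `NSUserActivity` API, which gains prediction-related properties in iOS 12. A minimal sketch — the activity type string and phrase are illustrative placeholders, and the type would also need to be declared in the app’s Info.plist:

```swift
import UIKit

// Hypothetical "donation" of a shortcut so Siri can suggest it and
// the user can attach a custom phrase to it (a sketch, not Apple's
// sample code; identifiers are placeholders).
func donateOrderShortcut(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.order")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // lets Siri suggest it
    activity.suggestedInvocationPhrase = "Coffee time" // hint when recording a phrase
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```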

iOS 12 also makes it clear what Apple has done with last year’s acquisition of Workflow, which has been morphed into a new Siri Shortcuts Editor for a new user-based shortcuts feature for Siri. The new built-in feature is structured almost identically to the Workflow app, and offers a tie-in to Siri that will allow users to create a custom phrase to chain multiple tasks into a routine. A gallery of pre-made shortcuts will be available, and users can customize these or create their own, so for example a shortcut could be set up to say “I’m coming home” that would automatically provide travel time and directions, send a text message to somebody, adjust a thermostat, and play a radio station.

Several of the core apps in iOS 12 are also getting some enhancements. Apple News will be gaining a new “Browse” tab that will feature channels and topics, making it easier to jump to favourites, and the iPad version of News gets a new sidebar to make it easier and more fun to navigate and dig into areas that users are more interested in. There was no word, however, on Apple News availability outside of the U.S., U.K., and Australia, suggesting it will still remain confined to those countries for now.

The Stocks app has been completely rebuilt with a whole new design, more charts, and Apple News integration for business news related to stocks. Stocks will also be coming to iPad with a user interface that takes advantage of the larger display. Voice Memos has been similarly rebuilt, with a new iPad version as well, and iCloud sync to share voice memos across all devices. CarPlay is also being enhanced to add support for third-party navigation apps, such as Google Maps and Waze.

Following reports earlier this year, Apple also previewed its revamped iBooks app, which as expected will now be called Apple Books and feature an all-new design, including “Reading Now” with a preview to pick up reading at the last point, a new store design, and more.

iOS 12 will also enhance the Do Not Disturb feature, including a “Bedtime” mode that will hide all screen notifications so users aren’t distracted by them when picking up an iPhone in the middle of the night, saving them until morning when the user chooses to view them. Do Not Disturb will also now allow users to set an ending time based on a fixed time interval, the end of the current calendar event, or when the user leaves their current location.

iOS 12 will support instant tuning for notifications right from the lock screen. Users can press into a notification to turn it off or adjust it directly from the lock screen without having to visit the Settings app, and Siri will also suggest turning off notifications for apps that are very infrequently used. Notifications can also now be grouped by app, topic, and thread, and stacked so that users can tap to look at a particular group, or swipe away a whole group with a single gesture.
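For developers, grouping by thread presumably builds on the existing notification content API, with a new summary field in iOS 12. A hedged sketch, with placeholder identifiers and text:

```swift
import UserNotifications

// Hypothetical example: notifications sharing a threadIdentifier are
// stacked together, and summaryArgument feeds the "more from …"
// summary line on the stack (a sketch, not Apple's sample code).
func scheduleGroupedMessage(from sender: String, body: String) {
    let content = UNMutableNotificationContent()
    content.title = sender
    content.body = body
    content.threadIdentifier = "chat-\(sender)"   // group by conversation
    content.summaryArgument = sender              // shown in the stack summary

    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: nil) // deliver immediately
    UNUserNotificationCenter.current().add(request)
}
```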

A new Screen Time feature will allow users to view reports of how they’ve used their iPhone or iPad: a full activity report showing how much time they’re spending, where they’re spending it, how their use breaks down during the day or night, how many times per hour they’re picking up their iPhone, which apps they’re using during those times, which apps are sending the most notifications, and more.

Users can also now set limits to notify them when they’ve been using an app for too long. When the limit is reached, a blocking message appears, which users can still choose to ignore. All limits will sync across devices via iCloud, and Screen Time can also be used with kids: parents can see activity from kids’ devices and create allowances to control how much time kids spend on their devices or within certain apps. A “downtime” mode will force kids to unplug altogether, limiting time in apps by category or by individual app. Certain apps can always be allowed regardless of other limits, and content can also be controlled more granularly, such as limiting specific movies, apps, and websites. All of this can be managed remotely from the parents’ devices.

Messages is taking Animoji to a new level with “Tongue Detection,” which will allow the front camera to detect tongue gestures and mirror them on an Animoji. Four new Animoji are also being added in iOS 12 — Ghost, Koala, Tiger, and T-Rex — along with a new “Memoji” feature that will allow users to create a personalized Animoji of themselves. The Messages camera will also gain the ability to use Animoji and apply other effects, similar to what’s found in the Clips app.

FaceTime in iOS 12 will gain the ability to create group video calls of up to 32 simultaneous participants. It’s also going to be integrated into Messages, so users will be able to go from a group chat into a group FaceTime call with a single tap, and members of the group can join in and drop out at any time. A tiled display will be used to show all participants, with tiles enlarging and shrinking based on who is talking. Four tiles appear to be shown for the last four people who spoke, while the rest remain in a roster at the bottom. The user can also tap to bring a specific person forward. Effects will also be coming to the FaceTime camera, allowing access to Animoji, Memoji, filters, sticker packs, and so on to be used in a FaceTime call, and users will also be able to answer FaceTime video calls — in audio — on an Apple Watch so that they can participate in a group FaceTime video call.