Apple’s software is generally very good. Even as the company has expanded its focus to more platforms than ever – macOS and iOS and iPadOS and tvOS and watchOS and whatever software Apple is building for its maybe-possibly-coming car and its almost-certainly-upcoming AR/VR headset – these platforms continue to excel. It’s been a while since we’ve had an Apple Maps-style fiasco. The biggest mistakes Apple is making right now are more on the order of putting the Safari URL bar in the wrong part of the screen.
What all of that success and maturity brings, though, is a sense that Apple’s software is… finished, or at least very close to it. In recent years, the company’s software announcements at WWDC have been almost entirely iterative and additive, with few major departures. The big iOS announcements of the last year, for example, were some FaceTime quality-of-life improvements and some new types of IDs that work in Apple Wallet. Otherwise, Apple has mostly just introduced new settings menus: new controls for notifications, Focus mode options, privacy tools, that kind of thing.
That’s not a bad thing! Nor is the fact that Apple is the best fast follower in the software business, remarkably quick at adopting and improving everyone else’s new software ideas. Apple devices are as feature-rich, durable, stable, and easy to use as anything you’ll find anywhere. Too many companies try to reinvent everything for no reason and end up creating problems where none existed. Apple is nothing if not a ruthlessly efficient machine, and that machine works hard to improve every pixel its devices produce.
But we’re at a technological tipping point that will demand more from Apple. It’s pretty clear now that AR and VR are Apple’s next big thing, the next supposedly world-changing industry after smartphones. Apple probably won’t show a headset at WWDC, but as augmented and virtual reality move further into our lives, everything about how we experience and interact with technology will have to change.
Apple has, of course, been showing off AR for years. But all it has shown are demos, things you can see or do on the other side of a camera. We’ve heard very little from the company about how it thinks AR devices will work and how we’ll use them. The company that loves to rave about its input devices needs some new ones, and a new software paradigm to go with them. We should start to see that at WWDC this year.
Remember last year, when Apple showed that you could point your iPhone’s camera at a piece of paper and it would automatically scan and recognize any text on the page? Live Text is an AR feature through and through: it’s a way to use your phone’s camera and AI to understand and catalog information in the real world. The entire tech industry thinks this is the future; it’s what Google is doing with Maps and Lens, and Snapchat with its lenses and filters. Apple needs a lot more where Live Text came from.
From a pure UI perspective, AR needs a much more efficient system for getting information and completing tasks. Nobody’s going to wear AR glasses that blast them with Apple Music ads and message notifications every six minutes, right? And full-screen apps that demand your undivided attention are becoming a thing of the past.
Maybe we’ll get some clues as to what that’s going to look like: it sounds like “use your phone without getting lost in your phone” will be a theme at this year’s WWDC. According to Bloomberg’s Mark Gurman, iOS 16 is expected to include a lock screen that displays useful information without requiring you to unlock your phone. A more glanceable iPhone seems like an excellent idea, and a good way to stop people from opening their phone to check the weather, only to find themselves deep in a TikTok hole three and a half hours later. The same goes for the rumored “interactive widgets” that let you do basic tasks without having to open an app. And Focus mode is rumored to get some improvements; if Apple can make it easier to set up and use, it could be a really useful tool on your phone and an absolutely must-have tool on your AR glasses.
I’d also expect Apple to continue bringing its devices much closer together in terms of what they do and how they do it, to make the whole ecosystem more user-friendly. With nearly the full line of Macs and iPads running on Apple’s M-series chips – and perhaps the whole line after WWDC, when the long-awaited Mac Pro finally arrives – there’s no reason for the devices not to share more DNA. Universal Control, probably iOS 15’s most exciting announcement even though it only shipped in February, is a good example of what it looks like for Apple to treat its many screens as parts of one ecosystem. If iOS 16 brings true freeform multitasking to the iPad (and boy do I hope it does), an iPad in a keyboard dock is essentially a Mac. Apple used to resist that kind of convergence; now it seems to embrace it. And if it ultimately sees all of these devices as companions and accessories to a pair of AR glasses, it needs them all to work well together.
The last time Apple – heck, the last time anyone – had a really new take on how we use gadgets was in 2007, when the iPhone came out. Since then, the industry has been on a yes-and road, improving and tweaking without ever really deviating from the fundamentals of multitouch. But AR will break all of that. It can’t work otherwise. That’s why companies are working on neural interfaces, trying to perfect gesture controls, and figuring out how to display everything from translated text to maps and games on a tiny screen in front of your face. Meta is already showing off and selling its best ideas; Google’s come in the form of Lens features and sizzle videos. Now Apple needs to start showing the world how it thinks an AR future works. Headset or no headset, that will be the story of WWDC 2022.