A series of reports in The Information paints a detailed picture of the progress, politics, and problems surrounding Apple's plan to develop a virtual, augmented, or mixed reality headset since the initiative gained momentum in 2015.
Citing several people familiar with the product, including some who worked on it directly, the reports describe a battle of wills over the direction of the device: a standoff between Apple's mixed reality product team (dubbed the "Technology Development Group") and famed Apple designer Jony Ive and his industrial design team. The reports also shed light on Apple's direction for the device, which Bloomberg recently reported is close to launch.
They also allege that Apple CEO Tim Cook has been relatively distant from the product compared to others like the iPhone, and that the Technology Development Group's location in an office separate from Apple's headquarters was a source of problems and frustration.
The Information's sources say that Apple's mixed reality efforts began almost by accident when the company bought a German AR startup called Metaio to use some of its technology for Project Titan, its self-driving car project. Another key moment came when Apple hired Mike Rockwell away from Dolby Laboratories to lead the AR/VR project team. Beginning in 2015, Rockwell built a team that included Metaio co-founder Peter Meier and Apple Watch executive Fletcher Rothkopf.
In 2016, Apple board members were shown several AR demos. In one, a tiny triceratops grew to life size before the board members' eyes. In another, a room was transformed into an immersive green environment. But the board wasn't Rockwell's biggest barrier inside the company. According to The Information, that was Ive, who led both the industrial design and human interface teams at Apple.
Ive and his team argued against a VR headset because they believed VR cuts users off from the people and the world around them, and because VR headsets look outdated. But the Technology Development Group won the industrial design team's support by introducing a concept: an outward-facing screen on the front of the headset that would show people nearby images of the wearer's facial expressions and eyes. The wearer, in turn, could see the people around them through an external camera feed.
Rockwell and his colleagues developed and released ARKit in 2017, an application development suite that allowed developers to create AR apps for iPhone and iPad using technologies and techniques that could later be adapted to a headset.
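For context on what that developer-facing piece looks like, here is a minimal sketch of an ARKit world-tracking session as it exists on today's iPhones and iPads; it uses only public ARKit APIs, and the class name is illustrative rather than taken from Apple's headset project.

```swift
import UIKit
import ARKit

// Minimal ARKit world-tracking setup on iOS. Class name is illustrative.
class MinimalARViewController: UIViewController {
    let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with device motion so virtual
        // content can be anchored to real-world surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        arView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView.session.pause()
    }
}
```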
Initially, Rockwell and the rest of the mixed reality team wanted the headset to be tethered to a base station to deliver maximum visual fidelity and performance, and some on the team envisioned it primarily as a tool for professionals and creatives to use at their desks. But Ive didn't like either of those ideas and wanted it to be a mass-market lifestyle product that consumers could take with them on the go. Apple's top leadership sided with Ive, and he still plays an active role in the headset's development, although he now works with Apple as a consultant.
The decision to make the headset a standalone device reportedly caused significant headaches. For example, it would have been better to cram more features onto a single chip so the device could work well on its own, but with the silicon work already complete, the team had to find ways to combat the latency that comes from multiple chips inside the device communicating with one another. They had also developed software on the assumption that the base station plan would go forward.
Nevertheless, the device is in the final phase of development. Bloomberg reported last week that Apple's board of directors was recently shown an advanced version of the product and that Apple has "ramped up" development of the headset's software, an iOS fork called rOS. (The "r" stands for "reality.")
The Information's coverage reveals many details about the upcoming headset. It would feature at least 4K resolution per eye, which the team felt was the bare minimum needed for users not to perceive the image as pixelated, a threshold most current consumer VR headsets fall short of. Its built-in processor would be closely related to the M2 chip, which is expected to arrive in Macs and iPads in the coming months.
The headset would also have 14 cameras, some facing outward and some facing inward. These would let the wearer see the outside world and let people nearby see a video representation of the wearer's eyes. They would also track the wearer's facial and body movements in real time and map them to a 3D avatar (probably similar to the iPhone's Memoji) that could be used for remote meetings and social gatherings with other distant headset wearers.
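To make the expression-tracking idea concrete, the sketch below shows how ARKit's existing face tracking on iPhone exposes the kind of signal that drives Memoji-style avatars; it relies only on current public APIs and is an analogy, not the headset's actual avatar code.

```swift
import ARKit

// Reads facial-expression weights from ARKit face tracking; an avatar rig
// would map these blend-shape values onto matching morph targets.
final class FaceExpressionReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend shapes are normalized 0...1 expression weights.
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            print("smile: \(smile), blink: \(blink)")
        }
    }
}
```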
Because of the limited processing capability of the M2 chip in a headset without a tethered base station (the canceled base station is said to have used the ultra-high-performance M1 Ultra), the avatars would be cartoonish. The Information's sources also say that more photorealistic avatars were tried when the base station was still part of the plan, but the uncanny valley was a problem.
Apple originally planned to launch the headset in 2019, but it now looks like it could be announced later this year or in 2023 instead. Apple also plans to introduce more natural-looking AR glasses as a follow-up product, but that device may still be years away from shipping.