San Francisco — With Apple’s highly anticipated Vision Pro headset hitting stores on Friday, you’ll probably start to see more people wearing the futuristic goggles that are supposed to usher in the era of “spatial computing.”
It’s an esoteric term that Apple executives and their marketing gurus are trying to bring into the mainstream, even as they avoid more commonly used labels such as “augmented reality” and “virtual reality” to describe the transformative powers of a product promoted as potentially as monumental as the iPhone, which came onto the market in 2007.
“We can’t wait for people to experience the magic,” Apple CEO Tim Cook gushed Thursday while discussing the Vision Pro with analysts.
The Vision Pro will also be among Apple’s most expensive products, with a price tag of $3,500 that has most analysts predicting the company will sell 1 million or fewer devices during its first year. But Apple sold only about 4 million iPhones during that device’s first year on the market and now sells more than 200 million of them annually, so there’s a history of what initially appears to be a niche product becoming entangled in the way people live and work.
If that happens with the Vision Pro, references to spatial computing could become as ingrained in the modern vernacular as mobile and personal computing, two previous technological revolutions that Apple played an integral role in creating.
So what is spatial computing? It is a way of describing the intersection between the physical world around us and a virtual world manufactured by technology, one that allows humans and machines to harmoniously manipulate objects and spaces. Performing these tasks often incorporates elements of augmented reality, or AR, and artificial intelligence, or AI, two subsets of technology that are helping make spatial computing a reality, said Cathy Hackl, a longtime industry consultant who now runs a startup working on applications for the Vision Pro.
“This is a crucial moment,” Hackl said. “Spatial computing will allow devices to understand the world in a way they have never been able to before. It is going to change the interaction between humans and computers and eventually every interface (be it a car or a watch) will become spatial computing devices.”
In a sign of the excitement surrounding the Vision Pro, more than 600 newly designed apps will be available to use on the headset right away, according to Apple. The range of apps includes a wide selection of television networks, streaming video services (although Netflix and Google’s YouTube are notably absent from the list), video games and various educational options. On the work front, the video conferencing service Zoom and other companies that offer online meeting tools have also created apps for the Vision Pro.
But the Vision Pro could expose another troubling side of the technology if its use of spatial computing is so compelling that people start to see the world differently when they’re not wearing the headset, and come to believe that life is far more interesting when viewed through its lenses. That scenario could worsen the screen addictions that have become endemic since the iPhone’s debut and deepen the isolation that digital dependency tends to cultivate.
Apple is far from the only prominent technology company working on spatial computing products. For the past few years, Google has been working on a three-dimensional video conferencing service called “Project Starline,” which relies on “photorealistic” images and a “magic window” to make two people sitting in different cities feel as if they are in the same room together. But Starline has not yet been widely released. Facebook’s parent company, Meta Platforms, has also been selling its Quest headsets for years, and they could be seen as a platform for spatial computing, although that company has so far not positioned the devices that way.
Vision Pro, on the other hand, is backed by a company with the marketing prowess and customer loyalty that tend to spark trends.
Although it could be considered a breakthrough if Apple realizes its vision with the Vision Pro, the concept of spatial computing has been around for at least 20 years. In a 132-page research paper on the subject published in 2003 at the Massachusetts Institute of Technology, Simon Greenwold argued that self-flushing toilets were a primitive form of spatial computing. Greenwold supported his reasoning by noting that the toilet “senses the user’s movement to activate the flush” and that “the system’s participation space is a real human space.”
The Vision Pro, of course, is much more sophisticated than a toilet. One of its most attractive features is its high-resolution displays, which can play back three-dimensional video recordings of events and people to make it appear as if the encounters are happening again. Apple has already laid the groundwork for selling the Vision Pro by including the ability to record what it calls “spatial video” in the premium iPhone 15 models released in September.
Apple’s headset also reacts to the user’s hand gestures and eye movements in an attempt to make the device seem like another piece of human physiology. While wearing the headset, users will be able to use just their hands to lift and arrange a series of virtual computer screens, similar to a scene featuring Tom Cruise in the 2002 film “Minority Report.”
Spatial computing “is a technology that is starting to adapt to the user rather than requiring the user to adapt to the technology,” Hackl said. “Everything is supposed to be very natural.”
It remains to be seen how natural it will feel to sit at dinner across from someone wearing the headset instead of intermittently checking their smartphone.