Why Apple is pushing the term ‘spatial computing’ with its new Vision Pro headset

SAN FRANCISCO (AP) — As Apple’s much-anticipated Vision Pro headset hits store shelves Friday, you’ll likely start seeing more people wearing the futuristic goggles that are supposed to usher in the age of “spatial computing.”

It’s an esoteric form of technology that Apple executives and their marketing gurus are trying to push into the mainstream, eschewing other more widely used terms like “augmented reality” and “virtual reality” to describe the transformative powers of a product they suggest could be as significant as the iPhone that came out in 2007.

“We can’t wait for people to experience the magic,” Apple CEO Tim Cook announced Thursday while discussing the Vision Pro with analysts.

The Vision Pro will be among Apple’s most expensive products at $3,500 – a price point that leads most analysts to predict the company will sell no more than 1 million of the devices in the first year. But Apple sold only about 4 million iPhones during that device’s first year on the market and now sells more than 200 million of them annually, so it has a history of turning what starts out as a niche product into something that changes the way people live and work.

If that happens with the Vision Pro, references to spatial computing could become as ingrained in the modern vernacular as mobile and personal computing — two previous technological revolutions that Apple played a key role in creating.

So what is spatial computing? It is a way of describing the intersection between the physical world around us and a virtual world created by technology, one that allows people and machines to manipulate objects and spaces in harmony. It often incorporates elements of augmented reality, or AR, and artificial intelligence, or AI — two subsets of technology that help make spatial computing possible, said Cathy Hackl, a longtime industry consultant who runs a firm that recently started working on apps for the Vision Pro.

“This is a critical moment,” Hackl said. “Spatial computing will enable devices to understand the world in ways they have never been able to before. It will transform human-to-computer interaction, and eventually all interfaces – whether it’s a car or a watch – will be spatial computing devices.”

As a sign of the excitement surrounding the Vision Pro, more than 600 newly designed apps will be available for use on the headset immediately, according to Apple. The range of apps will include a wide selection of TV networks, video streaming services (although Netflix and Google’s YouTube are notably absent from the list), video games and various educational options. On the work side, video conferencing service Zoom and other companies that provide online meeting tools have built apps for the Vision Pro, too.

But the Vision Pro could reveal another, more disturbing side of the technology if its use of spatial computing is so powerful that people start to see life differently when they’re not wearing the headset and begin to believe that life is far more interesting when viewed through the goggles. That scenario could exacerbate the screen addictions that have been endemic since the iPhone’s inception and deepen the isolation that digital dependency usually fosters.

Apple is far from the only prominent technology company working on spatial computing products. In recent years, Google has been working on a three-dimensional video conferencing service called “Project Starline” that draws on “photorealistic” images and a “magic window” so that two people sitting in different cities feel as if they are in the same room together. But Starline has not yet been widely released. Facebook’s corporate parent, Meta Platforms, has also spent years selling the Quest headset, a device that could be seen as a platform for spatial computing, although that company hasn’t positioned it that way yet.

In contrast, Apple is backing the Vision Pro with the marketing prowess and customer loyalty that drive trends.

Although it would be heralded as a breakthrough if Apple realizes its vision for the Vision Pro, the concept of spatial computing has been around for at least 20 years. In a 132-page research paper on the subject published by the Massachusetts Institute of Technology in 2003, Simon Greenwold made the case that automatically flushing toilets are a primitive form of spatial computing. Greenwold supported his reasoning by pointing out that the toilet “senses the user’s movement to trigger a flush” and that “the system’s engagement space is a real human space.”

The Vision Pro is, of course, much more sophisticated than a toilet. One of the most impressive features of the Vision Pro is its high-resolution screens that can play three-dimensional video recordings of events and people to make the encounters seem like they are happening again. Apple has already laid the groundwork to sell the Vision Pro with the ability to record “spatial video” on the premium iPhone 15 models released in September.

Apple’s headset also reacts to user movements through hand and eye gestures in an effort to make the device feel like an extension of the human body. While wearing the headset, users will be able to use their hands to pull up and arrange a series of virtual computer screens, similar to a scene starring Tom Cruise in the 2002 film, “Minority Report.”

“Spatial computing is a technology that is adapting to the user instead of requiring the user to adapt to the technology,” said Hackl. “It’s all supposed to be very natural.”

It remains to be seen how natural it might seem if you’re sitting down to have dinner with someone wearing goggles instead of glancing at their smartphone from time to time.
