Apple Vision Pro


The Apple Vision Pro is a head-worn computer developed by Apple. It was announced on June 5, 2023, at Apple's Worldwide Developers Conference and was released first in the US, then in other territories throughout 2024. Apple Vision Pro uses 3D tracking and camera passthrough to give an augmented reality experience of the user's environment. It is Apple's first major new product category since the release of the Apple Watch in 2015.
Apple markets Apple Vision Pro as a spatial computer in which digital media is integrated with the real world. Inputs such as motion gestures, eye tracking, and speech recognition can be used to interact with the system. Apple has avoided describing the device as a virtual reality headset in presentations and marketing.
The device runs visionOS, a mixed-reality operating system derived from iPadOS frameworks using a 3D user interface; it supports multitasking via windows that appear to float within the user's surroundings, as seen by cameras built into the headset. A dial on the top of the headset can be used to mask the camera feed with a virtual environment to increase immersion. The OS supports avatars, which are generated by scanning the user's face; a screen on the front of the headset displays a rendering of the avatar's eyes, which indicate the user's level of immersion to bystanders and assist in communication.
On October 15, 2025, Apple announced an updated Apple Vision Pro featuring the M5 chip, which delivers improved performance, enhanced display rendering, extended battery life, and support for up to 120 Hz refresh rates. The updated model also introduced the Dual Knit Band, a redesigned headband option designed for improved comfort and fit.

History

Development

In May 2015, Apple acquired the German augmented reality company Metaio, originally spun off from Volkswagen. That year, Apple hired Mike Rockwell from Dolby Laboratories. Rockwell formed a team called the Technology Development Group, including Metaio's Peter Meier and Apple Watch manager Fletcher Rothkopf. The team developed an AR demo in 2016 but was opposed by chief design officer Jony Ive and his team. Augmented reality and virtual reality expert and former NASA specialist Jeff Norris was hired in April 2017. Rockwell's team helped deliver ARKit in 2017 with iOS 11. Rockwell's team sought to create a headset and worked with Ive's team; the decision to reveal the wearer's eyes through a front-facing eye display was well received by the industrial design team.
The headset's development experienced a period of uncertainty with the departure of Ive in 2019. His successor, Evans Hankey, left the company in 2023. Senior engineering manager Geoff Stahl, who reports to Rockwell, led the development of its visionOS operating system, after previously working on games and graphics technology at Apple. Apple's extended reality headset is meant as a bridge to future lightweight AR glasses, which are not yet technically feasible. In November 2017, Apple acquired Canadian MR company Vrvana, founded by Bertrand Nepveu, for $30 million. The Vrvana Totem was able to overlay fully opaque, true-color animations on top of the real world, rather than the ghost-like projections of other AR headsets, which cannot display the color black. It did this while avoiding the often-noticeable lag between the cameras capturing the outside world and the image reaching the displays, and while maintaining a 120-degree field of view at 90 Hz.
Vrvana's innovations, including IR illuminators and infrared cameras for spatial and hand tracking, were integral to the development of the headset. According to reporter Wayne Ma, Apple originally planned to allow macOS software to be dragged from a Mac's display into the user's environment, but the feature was scrapped early on due to the limitations of basing the system on iPadOS; Ma also noted that the hand-tracking system was not precise enough for games. Workers also discussed collaborations with brands such as Nike for working out with the headset, and others investigated face cushions better suited to sweaty, high-intensity workouts, but the idea was scrapped because of the external battery pack and the fragile screen. A feature called "co-presence", a projection of a FaceTime user's full body, was also scrapped for unknown reasons.

Unveiling and release

In May 2022, Apple's board of directors previewed the device. The company began recruiting directors and creatives to develop content for the headset in June. One such director, Jon Favreau, was enlisted to bring the dinosaurs of his Apple TV+ show Prehistoric Planet to life. By April 2023, Apple was also working to bring developers on board to create software and services for the headset. Apple filed over 5,000 patents for technologies that contributed to the development of Apple Vision Pro.
Apple Vision Pro was announced at Apple's 2023 Worldwide Developers Conference on June 5, 2023, to launch in early 2024 in the United States at a starting price of US$3,499.
On June 6, the day after the announcement, Apple acquired the AR headset startup Mira, whose technology is used in Super Nintendo World's Mario Kart ride and which holds contracts with the United States Air Force and Navy. Eleven of Mira's employees joined Apple.
On January 8, 2024, Apple announced that the release date of Apple Vision Pro in the United States would be on February 2, 2024. Estimates of initial shipments ranged from 60,000 to 80,000 units. Pre-orders began on January 19, 2024, at 5:00 a.m. PST and the launch shipments sold out in 18 minutes. Apple sold up to 200,000 units in the two-week pre-order period, a majority of which were to be shipped five to seven weeks after launch day.
It also became available for purchase in China, Hong Kong, Japan, and Singapore on June 28, 2024, in Australia, Canada, France, Germany, and the UK on July 12, 2024, and in South Korea and the UAE on November 15, 2024.

Specifications

Hardware

Apple Vision Pro comprises approximately 300 components. It has a curved laminated glass display on the front, an aluminum frame on its sides, a flexible cushion on the inside, and a removable, adjustable headband. The frame contains five sensors, six microphones, and 12 cameras. Through the lenses, users see two 3660 × 3200 pixel micro-OLED displays with a total of 23 megapixels, normally running at 90 Hz but automatically adjusting to 96 or 100 Hz to match the content being shown. The eyes are tracked by a system of LEDs and infrared cameras, which form the basis of the device's iris scanner, Optic ID. Horizontally mounted motors adjust the lenses to individual eye positions to ensure clear, focused images that precisely track eye movements. Sensors such as accelerometers and gyroscopes track head movements, minimizing discrepancies between the real world and the projected image.
Custom optical inserts, developed in partnership with Zeiss, are available for users with prescription glasses and attach magnetically to the lenses. The device's two speakers sit inside the headband in front of the user's ears and can virtualize surround sound. Two cooling fans placed near the eye positions help dissipate the heat generated by high-speed data processing, and an active noise control function counters distracting noises, including the fan sounds. During the ordering process, users must scan their face using an iPhone or iPad with Face ID for fitting purposes; this can be done via the Apple Store app or at an Apple Store retail location.
Apple Vision Pro uses the Apple M2 system on a chip, accompanied by a co-processor, the Apple R1, which handles real-time sensor input processing. The device can be purchased in three internal storage configurations: 256 GB, 512 GB, and 1 TB. It is powered by an external battery pack that connects through a locking connector on the left side of the headband, twisting into place; the connector is a 12-pin variant of the Lightning connector and can be removed with a SIM-eject tool.
The user's face is scanned by the headset during setup to generate a persona, a realistic avatar used by OS features. One such feature is "EyeSight", an outward-facing screen that displays the eyes of the user's persona. The persona's eyes appear dimmed when the user is in AR and obscured when in full immersion, indicating the user's level of environmental awareness. When someone else approaches or speaks, even if the user is fully immersed, EyeSight shows the persona's eyes normally and makes the other person visible to the user.
A Digital Crown dial on the headset controls the amount of virtual environment occupying the user's field of view, ranging from a mixed-reality view in which apps and media appear to float in the user's real-world surroundings to one that completely hides those surroundings. It can alternatively control the device's speaker volume.

Accessories

First-party consumer accessories for Apple Vision Pro include a US$199 travel case, $99 or $149 Zeiss-manufactured lens inserts for users with vision prescriptions, a $199 light seal, and a $29 light seal cushion. The only official third-party accessory available at launch was a battery holder made by Belkin.
A first-party adapter costing $299, which can only be purchased by registered, paid Apple Developer accounts, replaces the right head-strap connection and adds a USB-C port for use by developers. Code in diagnostic tools has revealed that the adapter is capable of interacting with Apple Vision Pro in a diagnostic mode.
In November 2024, it was announced that Apple would sell a Belkin head strap for use with the Solo Knit Band.

Software

Apple Vision Pro runs visionOS, which is derived primarily from iPadOS core frameworks, with added MR-specific frameworks for foveated rendering and real-time interaction.
The operating system uses a 3D user interface navigated via finger tracking, eye tracking, and speech recognition. Users can select an element by looking at it and pinching two fingers together, move it by moving their pinched fingers, and scroll by flicking their wrist. Apps are displayed in floating windows that can be arranged in 3D space. visionOS supports a virtual keyboard for text input, the Siri virtual assistant, and external Bluetooth peripherals including Magic Keyboard, Magic Trackpad, and gamepads. visionOS supports screen mirroring to other Apple devices using AirPlay. It can also mirror the primary display of a macOS device via the "Mac Virtual Display" feature; the Mac can then be controlled using peripherals paired with the headset.
visionOS supports visionOS apps from the App Store and is backward compatible with selected iOS and iPadOS apps; developers may opt out of visionOS compatibility. Netflix, Spotify, and YouTube notably announced that they would not release visionOS apps at launch, nor support their iOS apps on the platform, and directed users to their web versions in Safari. Analysts suggested that this may have resulted from the companies' strained relationships with Apple over App Store policies such as mandatory 30% revenue sharing, including associated antitrust allegations. In an interview, Netflix co-CEO Greg Peters stated that Apple Vision Pro was too niche for the company to support at the time, but that "we're always in discussions with Apple to try and figure that out". A YouTube spokesperson later stated to The Verge that the service had plans to develop a visionOS app in the future.