Apple’s Virtual and Augmented Reality Headset

Apple’s upcoming headset will reportedly focus on AR capabilities rather than VR, but it will also include some virtual reality features; that blend of the two is often referred to as mixed reality.

Reports have claimed the headset could cost up to $3,000, and rumors suggest it may include iris-scanning technology to verify users’ identities.

Optical Sensors

Most current AR headsets rely on optical tracking systems to estimate the device’s 6-degree-of-freedom pose: three degrees of freedom for position (x, y, z) and three for orientation (roll, pitch, yaw). They employ different kinds of sensors, such as time-of-flight cameras, binocular depth sensors, and structured light. Some of them require markers to be placed in the scene to guide the headset’s cameras.
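
For readers unfamiliar with the jargon, the sketch below shows one common way to represent such a pose in Swift: the three translational degrees of freedom as a position vector and the three rotational ones as a quaternion. The type and field names are illustrative, not taken from any shipping SDK.

```swift
import simd

// A minimal sketch of a 6-DoF pose: 3 translational degrees of freedom
// (x, y, z) plus 3 rotational degrees of freedom (roll, pitch, yaw),
// stored here as a quaternion to avoid gimbal lock.
struct HeadsetPose {
    var position: SIMD3<Float>   // metres, relative to a world origin
    var orientation: simd_quatf  // rotation from the world frame to the head frame

    // Compose position and orientation into a single 4x4 transform,
    // the form most tracking APIs hand back.
    var transform: simd_float4x4 {
        var m = simd_float4x4(orientation)
        m.columns.3 = SIMD4<Float>(position.x, position.y, position.z, 1)
        return m
    }
}

// Example: head 1.6 m above the origin, yawed 90 degrees to the left.
let pose = HeadsetPose(
    position: SIMD3<Float>(0, 1.6, 0),
    orientation: simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 1, 0))
)
```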

New sensors are bringing down the size, weight, and power consumption of these systems, enabling the construction of low-cost, high-performance AR headsets. For example, startup AdHawk Microsystems uses a low-power laser that sweeps across the user’s eye thousands of times per second and captures the reflected light with simple photodetectors. The company claims that its system can track eye movements to within about 1 degree of accuracy and operates at up to 500 Hz, significantly faster than conventional camera-based eye trackers.

Other sensor and display technologies are also making it possible for augmented reality to be used in more real-world applications. For instance, Tilt Five’s AR glasses project images onto a retroreflective game board, so virtual content such as a model castle appears anchored to the physical tabletop. The optical path is designed to keep the image in focus even as the user moves their head.

LiDAR Sensors

Unlike optical cameras, which need a clear view of the user to track hand movements and facial features, LiDAR sensors can operate in complete darkness. They can also scan faster, delivering 3-D data in milliseconds. This is especially useful for analyzing and mapping environments that change quickly, such as forest canopies or city streets.
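
As a concrete illustration of how those pulses become range measurements, here is a minimal Swift sketch of the standard time-of-flight calculation; the function name is ours, and real sensors layer calibration and noise handling on top of this.

```swift
// Each LiDAR return is timed: the sensor fires a pulse, measures how long
// the reflection takes to come back, and converts that round-trip time to
// range. A millisecond-scale scan is simply many of these measurements.
let speedOfLight = 299_792_458.0  // metres per second

func range(fromRoundTripTime t: Double) -> Double {
    // The pulse travels out and back, so halve the total path length.
    speedOfLight * t / 2
}

// A return arriving about 66.7 ns after the pulse left puts the surface ~10 m away.
let metres = range(fromRoundTripTime: 66.7e-9)   // ≈ 10.0
```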

LiDAR sensors can record the return of each pulse to create a point cloud and determine a surface’s elevation. This information is then interpreted to identify different classes such as ground, vegetation (low, medium, and high), buildings, and water. Some systems also flag points that fall into more than one category. The American Society for Photogrammetry and Remote Sensing assigns a numeric code to each class for easy reference.
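
The sketch below lists the standard ASPRS/LAS classification codes mentioned above and shows how a processed point cloud might be filtered to a single class. The Swift type names are illustrative rather than taken from any particular LiDAR toolkit.

```swift
// ASPRS LAS point-classification codes for the classes discussed above.
enum LASClass: UInt8 {
    case neverClassified  = 0
    case unclassified     = 1
    case ground           = 2
    case lowVegetation    = 3
    case mediumVegetation = 4
    case highVegetation   = 5
    case building         = 6
    case lowPointNoise    = 7
    case water            = 9
}

struct LidarPoint {
    var x, y, z: Double          // position in metres; z carries the elevation
    var classification: LASClass // class assigned during post-processing
}

// Keep only ground returns, the points used to build a bare-earth elevation model.
func groundPoints(from cloud: [LidarPoint]) -> [LidarPoint] {
    cloud.filter { $0.classification == .ground }
}
```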

This data is helpful in identifying surface type, enabling features like paved roads and parking lots to stand out in the point cloud. It can also help pinpoint a location when there are no visible landmarks nearby, such as when navigating an unfamiliar town or city. LiDAR has even helped researchers identify a lost network of Mayan roads deep in Central America.

Apple’s headset will reportedly have two Mac-level chips that will provide unprecedented computing power for a wearable device. It will be able to operate independently of an iPhone or Mac, though it will have the ability to pair with both. xrOS will run the headset and will feature an iOS-like interface that will feel familiar to Apple users. It will have a Home Screen with app icons that can be rearranged, and it will support a physical keyboard and mouse when paired with a Mac.

Cameras

The augmented reality headset’s cameras will help the device sense its position in space, which is known as Simultaneous Localization and Mapping (SLAM). This technology will make it possible to place computer-generated graphics on top of existing scenes without needing special markers. Early AR headsets used barcode-like markers or predetermined 2D images to guide the placement of graphical elements.
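
Apple hasn’t published the headset’s APIs, but ARKit on iPhone and iPad already exposes the same markerless idea the paragraph describes: world tracking builds a map of the surroundings, and content is pinned to it with anchors rather than printed markers. A minimal sketch:

```swift
import ARKit

// World tracking (ARKit's SLAM) maps the environment; anchors pin virtual
// content to that map so no barcode-like markers are needed.
final class PlacementController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]  // detect real surfaces to attach content to
        session.delegate = self
        session.run(config)
    }

    // Drop an anchor 0.5 m in front of the current camera pose; a renderer
    // (RealityKit, SceneKit, or Metal) would then draw a virtual object at it.
    func placeObjectInFrontOfCamera() {
        guard let frame = session.currentFrame else { return }
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.5
        let anchorTransform = frame.camera.transform * translation
        session.add(anchor: ARAnchor(name: "placedObject", transform: anchorTransform))
    }
}
```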

The headset’s cameras will also be able to help users find objects inside closed boxes or behind occlusions. Apple is reportedly using a combination of technologies to do this, including new antenna designs, wireless signal-processing algorithms, and AI-based fusion of different sensors. The company is also said to be building an H2 chip into the headset to provide an ultra-low-latency audio link with the second-generation AirPods Pro.

In addition to providing an immersive AR experience, the headset’s cameras will support features such as FaceTime and Skype video calls. They will also allow customers to create and use augmented reality apps that can be controlled with Siri voice commands. Apple is reportedly developing a software platform, called xrOS, to run the headset.

xrOS is reportedly optimized for wireless data transmission, video compression and decompression, and power efficiency. It will also feature a dedicated image signal processor (ISP), which is important for correcting image distortion and keeping memory requirements down.

Audio

Apple’s AR headset is expected to feature a high-resolution display, six-degrees-of-freedom (6DoF) tracking, and a powerful processor. It will be able to run iOS apps and access the metaverse, a virtual world where users can interact with 3D graphics and characters. The headset will also include an external battery pack that will provide about two hours of use per charge.

A small dial on the side of the headset will be used to switch between virtual reality and the physical world. The headset will feature a custom streaming codec and an ISP that can translate the distorted images captured by its external cameras into a faithful, ultra-low-latency video representation of the user’s surroundings. It will also include microphones for audio capture and built-in speakers for output, and it is expected to support the second-generation AirPods Pro for better spatial audio.
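
How that dial will be exposed to developers is unknown, so the following is purely a toy sketch: it assumes the dial reports a normalized value between 0 and 1 and maps it to how much camera passthrough is blended over the virtual scene. Every name in it is hypothetical.

```swift
// Hypothetical sketch only: the real control API has not been documented.
struct ImmersionState {
    /// 0 = fully virtual scene, 1 = full passthrough of the physical room.
    var passthroughOpacity: Double = 0
}

func updateImmersion(dialValue: Double, state: inout ImmersionState) {
    // Clamp the dial reading, then ease it so small turns near either
    // end of the range produce gentle changes.
    let t = min(max(dialValue, 0), 1)
    state.passthroughOpacity = t * t * (3 - 2 * t)   // smoothstep easing
}
```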

The headset will use Apple silicon to provide performance on par with a Mac, making it a stand-alone device that doesn’t require an iPhone or iPad for operation. It will also include accessibility settings designed for people with visual or other impairments.

Apple is said to be working on a special version of the ARKit software that will be used on the headset. The operating system has been referred to internally as “reality OS” and xrOS, with references to the latter appearing in App Store upload logs spotted by eagle-eyed developers.