Foveated rendering

Foveated rendering is a rendering technique that concentrates image quality in the area you are currently looking at while reducing it elsewhere. It works by mimicking the natural behavior of the human eye, which sees objects at the center of the gaze more clearly than those in peripheral vision. It is sometimes more specifically referred to as ‘dynamic foveated rendering’. More on this later.

Rendering at high quality only where the viewer's gaze is focused, and at reduced quality in the periphery, rather than at full quality across the entire screen as a traditional television does, cuts the processing resources needed to render each frame. This in turn can increase frame rates, extend battery life, and reduce heat on head-mounted devices, amongst other performance benefits. It's a key spatial computing innovation, enabling far more realistic and immersive digital environments within the constraints of current hardware.
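To make the idea concrete, here is a minimal sketch, not tied to any particular device or graphics API, of how a renderer might pick a shading rate for each region of the screen based on how far that region sits from the gaze point. The 10° and 25° cut-offs and the rate values are purely illustrative assumptions, not figures from any real headset.

    // Relative shading rates: 1.0 = full resolution, smaller values = coarser shading.
    enum ShadingRate: Double {
        case full = 1.0      // foveal region, rendered at native quality
        case half = 0.5      // near periphery
        case quarter = 0.25  // far periphery
    }

    // Choose a shading rate from the eccentricity: the angle (in degrees) between
    // a screen region's direction and the current gaze direction.
    // The thresholds below are illustrative only.
    func shadingRate(forEccentricityDegrees eccentricity: Double) -> ShadingRate {
        switch eccentricity {
        case ..<10.0: return .full
        case ..<25.0: return .half
        default:      return .quarter
        }
    }

    // Example: a region 30 degrees away from where the user is looking is shaded
    // at a quarter of full quality, freeing GPU time for the area in focus.
    print(shadingRate(forEccentricityDegrees: 5))    // full
    print(shadingRate(forEccentricityDegrees: 30))   // quarter

In practice the savings come from the periphery covering most of the screen: only a small foveal region ever needs full-quality shading.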

What devices support foveated rendering?

Whilst the idea of foveated rendering is over 30 years old, it has only recently become available on virtual reality and spatial computing devices. Devices that currently support foveated rendering include:

  • Apple Vision Pro
  • HTC Vive Pro Eye
  • Meta Quest Pro
  • PlayStation VR2
  • Varjo Aero, XR-3, and XR-4

Fixed vs Dynamic Foveated Rendering

Dynamic foveated rendering, described above, uses eye tracking to determine where the viewer's gaze is focused and concentrates quality there. It is distinct from fixed foveated rendering, a simpler technique that has long been used in VR hardware.

Fixed foveated rendering is generally used on lower-cost VR devices like the Meta Quest 2 and 3 and the Pico 4. Without gaze information from eye tracking, it is limited to optimizing rendering around a fixed point that follows the device's head tracking rather than the user's eyes.
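The difference between the two modes comes down to where the high-quality region is centered. Continuing the illustrative sketch above (again, no real device API; the gaze reading here is a hypothetical placeholder for whatever the eye tracker reports):

    // Normalized screen coordinates: (0.5, 0.5) is the center of the display.
    struct ScreenPoint {
        var x: Double
        var y: Double
    }

    // Fixed foveated rendering: the sharp region is pinned to the lens center,
    // so it moves only when the whole headset (and therefore the head) moves.
    func fixedFoveationCenter() -> ScreenPoint {
        ScreenPoint(x: 0.5, y: 0.5)
    }

    // Dynamic foveated rendering: the sharp region follows the tracked gaze.
    // `trackedGaze` is a placeholder, not a real eye-tracking API.
    func dynamicFoveationCenter(trackedGaze: ScreenPoint) -> ScreenPoint {
        trackedGaze
    }

    // With eye tracking, the high-quality region follows the eyes even when the
    // head stays still; without it, glancing toward a screen edge means looking
    // into a lower-quality zone.
    let gaze = ScreenPoint(x: 0.8, y: 0.3)
    print(fixedFoveationCenter())                     // always (0.5, 0.5)
    print(dynamicFoveationCenter(trackedGaze: gaze))  // follows the eyes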

Spatial Applications 

Foveated rendering in VR has significantly enhanced user experiences, both in gaming and in practical everyday applications. Consider the impact on an application like Microsoft Flight Simulator, which requires huge computing resources to render its detailed environments. Foveated rendering makes it possible to increase both frame rates and resolution.

In VR gaming, titles like "Half-Life: Alyx" and "No Man's Sky" have showcased the technology's impact. Foveated rendering brings intricate environmental details and complex interactions to life, offering players a vivid and immersive experience. Gamers especially notice sharper visuals and smoother performance in the areas where they focus their gaze, contributing to a more engaging gaming experience.

A breakthrough application of foveated rendering is seen in the Apple Vision Pro. Here, the rich passthrough video experience comes remarkably close to natural vision, allowing you to view and engage with real-world tools and content, such as reading text on a computer screen or talking with colleagues in the same room. The passthrough, combined with foveated rendering, ensures high clarity in focused areas while smoothly rendering the periphery, maintaining a natural and comfortable visual experience.

Augmented Reality and Passthrough

In augmented reality (AR), there are two major ways to blend digital images with real-world vision: passthrough and projection-based AR. Foveated rendering is starting to play a crucial role in enhancing the passthrough approach.

Passthrough captures the real world using cameras on the device and displays it inside the headset or on screens, overlaying digital elements onto this real-time feed. This allows users to see and interact with their physical surroundings while engaging with virtual spatial content. Passthrough content can be incredibly rich and dense, and it benefits from high frame rates for realistic rendering quality. This richness is partly enabled by foveated rendering, which reduces the processing required without sacrificing perceived realism.

In contrast, projection-based AR tends to be used on heads-up displays (HUDs) that project data directly onto a transparent surface within the user's line of sight, such as a car windshield or a headset like Microsoft's HoloLens 2. Projection-based AR content tends to be simple data readouts (think of a car's speedometer) or text instructions overlaid on the real world.

By concentrating high-quality rendering in the user's gaze area and reducing it in the periphery, foveated rendering allows devices to provide more detailed and responsive digital overlays without overwhelming the device's processing capabilities.
