Essential Points
- visionOS 26.4 beta 3 (build 23O5225f) shipped March 2, 2026 and is available to developers with a free Apple Developer account
- Foveated streaming via NVIDIA CloudXR enables high-resolution, low-latency immersive content on Apple Vision Pro
- Apple Podcasts gains HLS video support, enabling adaptive streaming and dynamic ad insertion for video podcasts
- All major features arrived in beta 1 on February 16, 2026; betas 2 and 3 focus on stability
Apple shipped visionOS 26.4 beta 3 on March 2, 2026, carrying a feature that changes how Vision Pro receives high-resolution streamed content from remote computers and servers. Foveated streaming, introduced in beta 1 and now stabilizing across the beta cycle, lets developers pipe graphics-intensive environments directly from cloud or desktop hardware to the headset, without requiring the Vision Pro chip to render them locally. This piece covers exactly what is confirmed, how the technology works, and what developers need to do before public release.
Foveated Streaming: What Apple Actually Confirmed
Apple’s official visionOS 26.4 release notes state: “visionOS 26.4 supports NVIDIA CloudXR with foveated streaming, enabling apps to display high-resolution, low-latency immersive content on Apple Vision Pro.” This is not a general rendering improvement to the OS. It is a developer-facing framework that apps and games must explicitly adopt to deliver the benefit to users.
The mechanism works by sending full-quality video only to the area where the user is actively looking, compressing everything in peripheral vision where the eye cannot detect fine detail. Apple’s documentation describes a hybrid rendering model: native spatial content, such as a cockpit dashboard or UI controls, renders locally on Vision Pro via RealityKit, while processor-intensive environments like outdoor landscapes or large game worlds stream from a remote computer. A two-way messaging channel lets visionOS apps share custom data with the streaming endpoint, allowing developers to adjust streamed content in real time using a native SwiftUI interface.
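The bandwidth argument behind this design can be made concrete with a back-of-the-envelope model. The sketch below is purely illustrative; the region sizes and compression ratio are assumptions for the example, not figures from Apple or NVIDIA, and real foveated encoders use far more sophisticated schemes than a two-zone split.

```python
import math

def foveated_bandwidth(full_rate_mbps, fovea_deg=10.0, fov_deg=100.0,
                       peripheral_quality=0.1):
    """Estimate the bitrate of a foveated stream relative to a uniform one.

    full_rate_mbps:     bitrate needed to stream the whole field of view
                        at full quality.
    fovea_deg:          diameter of the region streamed at full quality.
    fov_deg:            total field of view covered by the stream.
    peripheral_quality: fraction of full bitrate spent per unit of area
                        outside the foveal region.
    All parameter values here are illustrative assumptions.
    """
    # Approximate both regions as circles and compare their areas.
    fovea_area = math.pi * (fovea_deg / 2) ** 2
    total_area = math.pi * (fov_deg / 2) ** 2
    fovea_fraction = fovea_area / total_area
    # Full quality inside the fovea, heavy compression everywhere else.
    effective = fovea_fraction + (1 - fovea_fraction) * peripheral_quality
    return full_rate_mbps * effective

# With a 10-degree fovea inside a 100-degree field, only 1% of the
# streamed area carries full quality.
print(round(foveated_bandwidth(100.0), 1))  # → 10.9
```

Under these assumed numbers, a stream that would cost 100 Mbps at uniform quality drops to roughly 11 Mbps, which is why eye tracking makes remote rendering of large environments practical at all.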
Apple’s own example from developer documentation is precise: a racing game can render interior gauges with RealityKit locally, while streaming the outdoor track environment from a remote server. A flight simulator can render a cockpit natively and stream the landscape. Both examples share the same principle: keep latency-sensitive UI local, offload heavy rendering to a remote machine.
Apple Podcasts HLS Support: Confirmed Details
visionOS 26.4 beta 1 also introduced support for Apple Podcasts’ new video features, specifically HLS-based video podcast delivery. HLS (HTTP Live Streaming) is the adaptive streaming standard already used by major video platforms. For podcast publishers, this opens the door to dynamic ad insertion without re-encoding episode files. For Vision Pro users, adaptive bitrate means the stream degrades gracefully on slower connections rather than buffering and breaking spatial immersion.
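The "adaptive" part of adaptive bitrate comes down to a selection rule: a master playlist advertises the same episode at several bitrates, and the player picks the highest one its measured throughput can sustain. The sketch below is a simplified illustration of that idea; the bitrate ladder and headroom factor are assumed values, and real players such as AVPlayer use considerably more sophisticated heuristics.

```python
def select_variant(variants_kbps, measured_kbps, headroom=0.8):
    """Pick the highest-bitrate HLS variant that fits the measured
    throughput, leaving headroom so the playback buffer can stay ahead.

    variants_kbps: available stream bitrates from the master playlist.
    measured_kbps: recently observed download throughput.
    headroom:      fraction of throughput the player is willing to
                   spend (an illustrative value, not spec-defined).
    """
    budget = measured_kbps * headroom
    # Keep the variants the connection can sustain; fall back to the
    # lowest variant if even that one exceeds the budget.
    affordable = [v for v in sorted(variants_kbps) if v <= budget]
    return affordable[-1] if affordable else min(variants_kbps)

ladder = [800, 2500, 6000, 12000]  # example bitrate ladder in kbps
print(select_variant(ladder, measured_kbps=9000))  # fast link → 6000
print(select_variant(ladder, measured_kbps=1500))  # slow link → 800
```

Because the player re-evaluates this choice continuously, a Vision Pro user on a congested network drops to a lower rung rather than stalling, which is the "degrades gracefully" behavior described above.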
Apple has not published granular release notes detailing which specific HLS capabilities are scoped to visionOS versus iOS or iPadOS. The confirmed fact is that HLS video podcast support is part of the visionOS 26.4 feature set introduced at beta 1.
Beta Cycle Timeline: All Three Builds Confirmed
| Beta | Build Number | Release Date | Key Changes |
|---|---|---|---|
| Beta 1 | 23O5209m | February 16, 2026 | Foveated streaming via NVIDIA CloudXR, HLS Apple Podcasts video support |
| Beta 2 | 23O5220e | February 23, 2026 | Stability refinements; no major new features reported |
| Beta 3 | 23O5225f | March 2, 2026 | Continued refinements; no major new features reported |
All three build numbers are confirmed against Apple’s official developer releases page and verified third-party sources.
What Developers Must Do Before Public Release
If you build apps for visionOS, three confirmed action items apply now:
- Implement the Foveated Streaming framework using FoveatedStreamingSession to connect to an external streaming source and display content inside a FoveatedStreamingSpace
- Layer native spatial content over streamed content using RealityKit, keeping UI and responsive elements local to the device
- Test HLS video delivery pipelines if your app surfaces video content through Apple Podcasts or similar frameworks
A free Apple Developer account is required to install the beta via Settings, General, Software Update, Beta Updates on Apple Vision Pro.
visionOS 26.4 Within the Broader visionOS 26 Arc
visionOS 26 launched September 15, 2025, introducing spatial widgets, redesigned Personas with volumetric rendering, PSVR2 Sense controller support, 180-degree and 360-degree video from action cameras like GoPro and Insta360, and shared spatial experiences for multiple Vision Pro users in the same room. visionOS 26.4 continues Apple’s pattern of using point releases to expand developer infrastructure rather than add consumer-visible features.
Foveated streaming specifically targets the gap between Vision Pro’s onboard M-series chip capability and the demands of PC-grade or cloud-rendered VR content. By supporting NVIDIA CloudXR, Apple opens a direct integration path for VR games and experiences already running on CloudXR infrastructure on other platforms. That significantly lowers the porting barrier for developers whose VR content was built for desktop or cloud hardware.
Considerations
visionOS 26.4 delivers no changes users can experience until third-party apps adopt the new APIs; foveated streaming’s benefits depend entirely on developer uptake. Apple has not confirmed a public release date for visionOS 26.4. Additionally, Apple has not published a detailed changelog for beta 3, so it is impossible to confirm what specific fixes or refinements it contains beyond stability work.
Frequently Asked Questions (FAQs)
What is visionOS 26.4 beta 3’s build number?
The build number for visionOS 26.4 beta 3 is 23O5225f, released March 2, 2026. Beta 2 carried build 23O5220e, released February 23, 2026. Beta 1 was build 23O5209m, released February 16, 2026. All three are confirmed on Apple’s official developer releases page.
What is foveated streaming in visionOS 26.4?
visionOS 26.4 supports NVIDIA CloudXR with foveated streaming, enabling apps to stream high-resolution, low-latency immersive content to Apple Vision Pro. The framework renders full quality only at the user’s focal point, compresses peripheral areas, and allows native spatial content to layer over streamed environments using RealityKit.
Does visionOS 26.4 beta 3 add new features over beta 2?
No major new features have been reported in beta 3. The headline features, foveated streaming via NVIDIA CloudXR and HLS Apple Podcasts video support, both arrived in beta 1 on February 16, 2026. Beta 2 and beta 3 address stability and refinements.
When will visionOS 26.4 publicly release?
Apple has not announced a public release date for visionOS 26.4. Based on the weekly cadence of the three betas shipped so far and Apple’s prior 26.x update timelines, a release candidate and public release are anticipated in the coming weeks.
How does HLS support in visionOS 26.4 benefit Apple Podcasts users on Vision Pro?
HLS (HTTP Live Streaming) enables adaptive bitrate delivery for video podcasts in Apple Podcasts, so playback quality adjusts automatically based on connection speed. It also enables dynamic ad insertion for publishers. For Vision Pro users, this reduces buffering interruptions in video or spatial podcast content.
Do I need a paid Apple Developer account to test visionOS 26.4 beta 3?
No. A free Apple Developer account is sufficient to access the beta. Enroll at developer.apple.com, then navigate to Settings, General, Software Update, Beta Updates on your Apple Vision Pro to install it.
What is NVIDIA CloudXR and why does it matter for Vision Pro?
NVIDIA CloudXR is a platform that streams VR and XR content from powerful remote computers or cloud servers to headsets. By supporting CloudXR in visionOS 26.4, Apple allows VR games and experiences already built for CloudXR on other platforms to stream directly to Apple Vision Pro, lowering the development effort required to bring PC-grade VR content to the headset.

