Dexter Yang ᯅノ🌐🔗🧙🏻‍♂️👾
@dexteryy.bsky.social
?/acc | Building Spatial/Modern Web, Web Platform/SDK product lead at PICOXR OS. Previously founder & tech leader of Web Dev Engine at ByteDance. Made metaverse apps & TTRPG tools | https://linktr.ee/dexteryy
That falls under accessibility: provide fallback alternatives, like head movement in place of eye gaze, or ray-based hand tracking. Another option is going back to classic XR controllers (like AVP + PS VR2 controller)
October 2, 2025 at 4:18 AM
According to visionOS interaction guidelines, far eye–hand input = indirect interaction, near touch input = direct interaction, both are "natural interactions." The OS handles them and maps both to the same spatial gesture events, so developers can write one set of code to support both.
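A minimal sketch of that "one set of code" idea (hypothetical event type and field names for illustration, not the actual visionOS or WebSpatial API): because the OS normalizes near-field touch and far-field eye-hand input into the same spatial gesture event, the app registers a single handler and never branches on input hardware.

```typescript
// Hypothetical event shape: the OS delivers the same spatial tap event
// whether it came from near touch (direct) or eye gaze + pinch (indirect).
type SpatialTapEvent = {
  mode: "direct" | "indirect"; // filled in by the OS, not by app code
  target: string;              // id of the element that was tapped
  position: { x: number; y: number; z: number };
};

// The app registers ONE handler; it ignores which input mode fired it.
function onSpatialTap(event: SpatialTapEvent): string {
  return `activated ${event.target}`;
}

// Both interaction styles flow through the same code path:
const direct = onSpatialTap({
  mode: "direct",
  target: "play-button",
  position: { x: 0.1, y: 1.2, z: -0.4 },
});
const indirect = onSpatialTap({
  mode: "indirect",
  target: "play-button",
  position: { x: 0.1, y: 1.2, z: -0.4 },
});
// Same result regardless of input mode.
```

The point of the sketch is that the `mode` field exists only as OS metadata; none of the app logic needs to read it.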
October 2, 2025 at 3:04 AM
3. Multiple videos or livestreams can be watched and interacted with at the same time, including cross-room interactions.
October 1, 2025 at 4:41 PM
What OS architecture and development paradigm shifts follow?

Which new Web APIs are essential for the mainstream Web to get a boarding pass to such devices?
September 5, 2025 at 7:21 AM
We need new Web APIs like WebSpatial to bring the new XR development paradigm to the Web. Only then can web-based solutions have a real shot at replacing native apps in many scenarios, especially in AI-related use cases and those that blend with mobile real-world environments.
August 7, 2025 at 3:30 PM
It also breaks away from mainstream web development, missing out on another big Web advantage (the significantly larger pool of web developers compared to 3D developers).

Just like native spatial apps on visionOS don't prioritize using 3D engines but instead adopt 2D frameworks like SwiftUI,
August 7, 2025 at 3:30 PM
This kind of heavy, all-in immersive experience not only fails to leverage the Web's strengths (like URL-based on-demand access) but also repeats the limitations of native XR apps (where each app creates its own isolated space instead of blending into the existing space).
August 7, 2025 at 3:30 PM
Then step into the shadows. Join us. We're recruiting in San Jose: jobs.bytedance.com/en/mobile/po...
June 18, 2025 at 3:40 PM
it's still a 3D engine that uses WebGL to render everything - 2D and 3D - on the canvas or dual-eye screens. It doesn’t support visionOS's Shared Space. Its development mindset is based on 3D graphics APIs - unfamiliar to most web developers who think in terms of 2D GUIs. See the doc for details.
github.com
June 11, 2025 at 1:20 PM
I like R3F. But on one hand, R3F is like React Native - uses React-like syntax but isn't real React, so it's disconnected from the regular HTML/CSS-based React code used on standard websites. It needs its own ecosystem and can't integrate with the mainstream React world.

On the other hand,
June 11, 2025 at 1:20 PM
How can Web Apps have this kind of capability too, and achieve the UI demonstrated in my first three screenshots: bsky.app/profile/dext...
Don't miss this talk at AWE on June 11 - we'll showcase some WebSpatial app demos.

🎟️ awexr.com/usa-2025/age...

WebSpatial is an open-source React SDK that lets you turn regular HTML/CSS-based websites into spatial apps on Vision Pro.

💻 github.com/webspatial/w...
June 11, 2025 at 11:45 AM
where the TikTok camera interface becomes the default Home screen. App distribution wouldn't just rely on icon grids, but also come from the context of the live environment (as part of multimodal input). Unless glasses or lightweight headsets completely replace phones before that happens.
June 11, 2025 at 11:45 AM
The physical screen's background is inevitably trending toward dynamic content - XR headsets also use solid, opaque screens with cameras, so many of their functions are also possible on smartphones.

Its final form might look like a "TikTok OS,"
June 11, 2025 at 11:45 AM
After the iPhone GUI switched to Liquid Glass, the Home/Launcher screen background could be a live camera feed instead of a static wallpaper, making the phone feel almost transparent (though sadly, they didn't roll out that feature this time).
June 11, 2025 at 11:45 AM
So the Glass Material / Liquid Glass not only affects where GUI software can be used and what it can do, but also impacts the whole system architecture and how apps are developed.
June 11, 2025 at 11:45 AM
Android XR can't really do this. It only performs alpha blending. The same limitation keeps its Home Space from mixing 2D and 3D apps the way visionOS Shared Space does (bsky.app/profile/dext...).
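For intuition, here is a toy single-pixel model (my own sketch, not Android XR's or visionOS's actual compositor) of why alpha-only compositing can't mix apps the way unified rendering can: alpha blending stacks finished layers in a fixed order and ignores depth, while a depth-aware merge can decide per pixel which app's content is actually in front.

```typescript
// Toy single-pixel model. Each app hands the compositor a finished
// pixel: color, alpha, and (for the depth-aware case) a depth value.
type LayerPixel = { color: number; alpha: number; depth: number };

// Alpha-only compositing (Oculus-style): layers are stacked in a fixed
// order; depth is ignored, so true per-pixel occlusion between apps
// is impossible.
function alphaBlend(bottom: LayerPixel, top: LayerPixel): number {
  return top.color * top.alpha + bottom.color * (1 - top.alpha);
}

// Depth-aware merge (what unified rendering enables): the compositor
// decides, per pixel, which app's content is nearer to the viewer.
function depthMerge(a: LayerPixel, b: LayerPixel): number {
  const [near, far] = a.depth <= b.depth ? [a, b] : [b, a];
  return near.color * near.alpha + far.color * (1 - near.alpha);
}

// A "3D app" pixel that is physically nearer (depth 1) than a
// "2D app" pixel (depth 2), but sits lower in the layer stack:
const app3d: LayerPixel = { color: 100, alpha: 1, depth: 1 };
const app2d: LayerPixel = { color: 200, alpha: 1, depth: 2 };

const fixedOrder = alphaBlend(app3d, app2d); // layer order wins: 2D app (200)
const byDepth = depthMerge(app3d, app2d);    // nearer 3D app wins (100)
```

In the fixed-order case the 2D app covers the 3D app even though the 3D content is closer; a depth-aware compositor gets the occlusion right, which is why it needs more than finished pixels from each app.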
Android XR Quick Facts

It's not the same type of XR OS as visionOS, but rather a continuation of the Oculus-style XR OS.

(1/9)
June 11, 2025 at 11:45 AM
- pre-raster layout info in browser engines, ECS data in 3D engines, node graph data for building shaders - so the OS can render all apps and environments together (bsky.app/profile/dext...).
At Siggraph Asia, the head of PICO's foundation engineering gave a detailed presentation on the "unified rendering" capability that visionOS has but Android XR doesn't, which I mentioned before. Here's the full text of what was shared: developer.picoxr.com/news/multi-a...
June 11, 2025 at 11:45 AM