Research Agenda
Three interconnected research branches validated through Harmony One. Together they form a complete framework for how spatial intelligence systems perceive, adapt, and earn trust.
Multimodal Context Inference
How XR systems perceive and represent human state, task progress, and environment through fused multimodal sensing.
Focuses on continuous, non-intrusive observation of physiological, behavioral, and environmental signals — building a live representation of what the user is doing, how they're doing it, and what they need next. Tackles the signal fusion problem that must be solved for adaptive XR to be viable at scale.
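As an illustration of what such a fused representation might look like, here is a minimal sketch: a few normalized physiological and behavioral channels blended into a live user-state estimate. All names, channels, and weights are hypothetical assumptions for illustration, not part of the Harmony framework; a real system would learn or calibrate the fusion rather than hand-weight it.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    """A live estimate of what the user is doing and how hard it is."""
    cognitive_load: float  # 0.0 (idle) .. 1.0 (overloaded)
    task_progress: float   # 0.0 .. 1.0

def fuse_signals(heart_rate_norm: float,
                 pupil_dilation_norm: float,
                 error_rate: float,
                 steps_done: int,
                 steps_total: int) -> UserState:
    # Placeholder weighted blend of normalized signal channels.
    load = 0.4 * heart_rate_norm + 0.3 * pupil_dilation_norm + 0.3 * error_rate
    progress = steps_done / steps_total if steps_total else 0.0
    return UserState(cognitive_load=min(max(load, 0.0), 1.0),
                     task_progress=progress)

state = fuse_signals(heart_rate_norm=0.6, pupil_dilation_norm=0.5,
                     error_rate=0.2, steps_done=3, steps_total=10)
```

The point of the sketch is the shape of the output, not the arithmetic: downstream interface logic consumes a single continuously updated `UserState` rather than raw sensor streams.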
Cognitive Load-Aware Interface Adaptation
How spatial interfaces change layout, complexity, and visibility to match human attention and cognitive load in real time.
Investigates how the interface layer of an XR system can respond intelligently — reducing complexity when load is high, surfacing context when it's needed, and disappearing when it isn't. Builds on the inference signals from Multimodal Context Inference to drive concrete, measurable interface decisions.
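A load-aware adaptation policy can be as simple as mapping the inferred load estimate to a discrete interface detail level. The function name, levels, and thresholds below are illustrative assumptions, not the framework's actual policy:

```python
def detail_level(cognitive_load: float) -> str:
    """Map an estimated cognitive-load score (0..1) to a UI detail level."""
    if cognitive_load > 0.75:
        return "minimal"  # high load: hide everything non-essential
    if cognitive_load > 0.4:
        return "reduced"  # moderate load: trim secondary panels
    return "full"         # low load: full context is safe to surface
```

Because the policy's input and output are both explicit values, each adaptation is measurable: a study can log the load estimate, the chosen level, and the resulting task performance side by side.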
Explainable AI Mediation in Spatial Computing
How AI systems in XR share control with humans in a transparent, trustworthy, and user-governed way.
Addresses the trust gap that prevents AI-driven XR from being deployed at scale. Examines how systems communicate their reasoning, how users maintain agency over adaptive behavior, and how explainability requirements shape architecture from the ground up.
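One architectural consequence of designing for explainability is that every adaptive decision carries its own rationale and can be rejected by the user. The structure below is a hypothetical sketch of that idea, not the Harmony API:

```python
from dataclasses import dataclass

@dataclass
class MediatedDecision:
    """An AI-initiated interface change, with its reasoning attached."""
    action: str            # what the system changed
    rationale: str         # human-readable reason, shown on request
    user_override: bool = False

    def explain(self) -> str:
        # Transparency: the system can always say why it acted.
        return f"{self.action}: {self.rationale}"

    def revert(self) -> None:
        # Agency: any adaptive change can be rejected, and the rejection
        # is recorded so the system can learn the user's preferences.
        self.user_override = True

decision = MediatedDecision("hid notification panel",
                            "estimated cognitive load exceeded threshold")
decision.revert()
```

Making the rationale a first-class field — rather than reconstructing it after the fact — is one way explainability requirements shape the architecture from the ground up.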
Full framework: Harmony: Adaptive Human-Centered Spatial Intelligence for XR + AI Systems · View on Google Scholar