ResXR

An open-source, end-to-end toolkit for conducting behavioral XR experiments on Meta standalone head-mounted displays

Addressing technical barriers and the lack of data standardization in XR research

About ResXR

The Challenge

Widespread adoption of XR in behavioral research is hindered by high technical barriers and the absence of standardized data formats. Creating immersive experiments demands specialized programming skills, and researchers lack accessible templates designed for scientific applications.

The Solution

ResXR provides a Unity-based experiment template for multimodal data capture alongside a Python processing pipeline that automates validation, preprocessing, and quality reporting. The pipeline is inspired by established neuroimaging tools such as fMRIPrep.

Target Platform

Meta standalone headsets (Quest 2, Quest Pro, Quest 3)

Output Format

Motion-BIDS-compatible data for reproducible research
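To make the export format concrete, the sketch below shows a Motion-BIDS-style layout and how the resulting files can be read back with pandas. The subject, task, and tracksys labels are placeholders chosen for illustration; only the naming pattern and the split between a headerless motion.tsv and a descriptive channels.tsv follow the BIDS motion specification, and ResXR's exact file names may differ.

    import pandas as pd

    # Illustrative Motion-BIDS layout (entity labels are hypothetical; the naming
    # pattern follows the BIDS motion specification):
    #
    #   sub-01/motion/sub-01_task-maze_tracksys-headset_motion.tsv
    #   sub-01/motion/sub-01_task-maze_tracksys-headset_motion.json
    #   sub-01/motion/sub-01_task-maze_tracksys-headset_channels.tsv

    base = "sub-01/motion/sub-01_task-maze_tracksys-headset"

    # channels.tsv describes each recorded channel (name, component, type,
    # tracked point, units)
    channels = pd.read_csv(f"{base}_channels.tsv", sep="\t")

    # motion.tsv is a headerless time series; its columns are defined by channels.tsv
    motion = pd.read_csv(f"{base}_motion.tsv", sep="\t", header=None,
                         names=channels["name"].tolist())

    print(motion.describe())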

ResXR Architecture

ResXR comprises a Unity-based base template and a Python-based data pipeline

Stage 1

Base Template

Available

Unity-based experiment template for multimodal data capture

  • Head, hand, eye, and face tracking
  • ResXRPlayer, ResXREyeTracker, ResXRDataManager components
  • Automatic trial logging and synchronized data collection (see the alignment sketch after this list)
Demo Experiments Included:
  • Binary Choice (2AFC paradigm with paired visual stimuli)
  • Maze Navigation (spatial wayfinding task with trajectory capture)
  • Museum Viewing (free-exploration gallery for eye-tracking studies)
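As a rough illustration of what synchronized multimodal capture enables downstream, the sketch below aligns two independently sampled tracking streams by timestamp. The file names, column names, and the 10 ms tolerance are assumptions made for this example, not ResXR's actual output schema.

    import pandas as pd

    # Hypothetical per-stream recordings; files and columns are illustrative only.
    head = pd.read_csv("head_tracking.tsv", sep="\t")  # time, pos_x, pos_y, pos_z, ...
    eye = pd.read_csv("eye_tracking.tsv", sep="\t")    # time, gaze_x, gaze_y, gaze_z, ...

    # The streams may be logged at different rates: match each eye sample with the
    # nearest head sample in time, within a 10 ms tolerance.
    aligned = pd.merge_asof(eye.sort_values("time"), head.sort_values("time"),
                            on="time", direction="nearest", tolerance=0.010,
                            suffixes=("_eye", "_head"))
    print(aligned.head())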
Stage 2

Data Pipeline

Available

Quality checks, preprocessing, reports, and export

  • Automated validation: configurable quality checks (e.g., tracking-loss detection) and quality-flag generation (see the sketch after this list)
  • Automated preprocessing: exclusion of flagged segments and computation of derived measures
  • Visual quality assurance: comprehensive HTML reports with diagnostic plots
  • Analysis-ready export
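The sketch below illustrates how a tracking-loss check of this kind can work: runs of missing samples longer than a threshold are flagged, the flagged segments are excluded, and a derived measure is computed on the retained data only. The sampling rate, column names, and gap threshold are assumptions for the example; ResXR's actual checks and thresholds are configurable and may differ.

    import numpy as np
    import pandas as pd

    def flag_tracking_loss(df, signal_cols, fs, max_gap_s=0.1):
        """Flag contiguous runs of missing samples longer than max_gap_s seconds."""
        missing = df[signal_cols].isna().any(axis=1).to_numpy()
        flags = np.zeros(len(df), dtype=bool)
        # indices where the missing-sample mask switches on or off (run boundaries)
        edges = np.flatnonzero(np.diff(np.concatenate(([0], missing.astype(int), [0]))))
        for start, stop in zip(edges[::2], edges[1::2]):
            if (stop - start) / fs > max_gap_s:
                flags[start:stop] = True  # gap too long to keep: mark for exclusion
        return flags

    # Hypothetical 90 Hz head-position recording with a simulated ~0.3 s dropout
    fs = 90.0
    t = np.arange(0, 5, 1 / fs)
    df = pd.DataFrame({"time": t, "pos_x": np.sin(t), "pos_y": 0.0, "pos_z": np.cos(t)})
    df.loc[100:130, ["pos_x", "pos_y", "pos_z"]] = np.nan

    df["excluded"] = flag_tracking_loss(df, ["pos_x", "pos_y", "pos_z"], fs)

    # Derived measure (head speed, m/s); diffs that touch the gap are NaN and ignored
    pos = df[["pos_x", "pos_y", "pos_z"]].to_numpy()
    speed = np.concatenate(([np.nan], np.linalg.norm(np.diff(pos, axis=0), axis=1) * fs))
    mean_speed = np.nanmean(speed[~df["excluded"].to_numpy()])
    print(f"excluded {df['excluded'].mean():.1%} of samples, "
          f"mean head speed {mean_speed:.2f} m/s")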
Supported by
ERC · Minerva Center for Mixed Realities · Minerva Stiftung (MPG) · Samueli