Capture • Dataset • Evaluation

EgoVerse: Human data from around the world, built for robot learning

EgoVerse is an ecosystem for curating, accessing, and learning from human data for robot learning. It hosts a "living" dataset that is continuously expanded by the consortium and driven by a research community advancing a rigorous science of human-to-robot transfer across tasks and embodiments.

What EgoVerse Captures

Dense annotations and diverse task coverage

Ready for Policy Learning: Includes accurate camera poses, 3D head tracking, and dense language annotations.
Standardized and Open-Ended Tasks: EgoVerse contains standardized flagship tasks alongside open-ended scenarios spanning diverse tasks, scenes, objects, and operators.
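To make the annotation types above concrete, here is a minimal sketch of what a single episode record could look like. All field names, shapes, and values are illustrative assumptions, not the released EgoVerse schema.

```python
from dataclasses import dataclass

# Hypothetical episode record; fields mirror the annotations listed above
# (camera poses, 3D head tracking, dense language), but names and layouts
# are assumptions for illustration only.
@dataclass
class Episode:
    episode_id: str
    task: str                # e.g. "object-in-container"
    scene: str
    demonstrator_id: str
    camera_poses: list       # per-frame 4x4 camera extrinsics
    head_track: list         # per-frame 3D head position (x, y, z)
    language: list           # (timestamp_s, annotation) pairs

ep = Episode(
    episode_id="ep_0001",
    task="object-in-container",
    scene="kitchen_01",
    demonstrator_id="demo_042",
    camera_poses=[[[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]],
    head_track=[(0.0, 0.0, 1.6)],
    language=[(0.0, "pick up the cup")],
)
print(ep.task)  # → object-in-container
```

A flat per-episode record like this keeps the dense per-frame signals (poses, head track) aligned with the timestamped language annotations that policy-learning pipelines typically consume.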

Built for Human-to-Robot Transfer

Validated across diverse robot embodiments and tasks

Georgia Tech (RL2 Lab) Task: object-in-container • ID/OOD
Stanford University (REAL Lab) Task: cup-on-saucer • bimanual precision
UC San Diego (Wang Lab) Task: bag-grocery • long-horizon
ETH Zurich (CVG & SRL Labs) Task: bag-grocery • long-horizon

Consortium Partners

Built across academic and industrial partners

Georgia Tech
Stanford
UC San Diego
ETH
Mecka AI
Scale AI
Meta

A Living Dataset

Data from around the world, continuously growing.

Dataset Snapshot

1,362
Hours of human demos
~80k
Episodes
1,965
Tasks
240
Scenes
2,087
Unique demonstrators

Stats from the current release.
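A quick sanity check on the snapshot numbers: 1,362 hours over roughly 80k episodes works out to about a minute of demonstration per episode. A minimal calculation, using only the figures stated above (the episode count is approximate):

```python
# Back-of-envelope average episode length from the snapshot stats.
hours = 1362          # hours of human demos
episodes = 80_000     # "~80k" episodes (approximate)

avg_episode_s = hours * 3600 / episodes
print(f"{avg_episode_s:.0f} s per episode")  # → 61 s per episode
```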