Gibson Environment

Real-World Perception for Embodied Agents

(Page under construction -- Gibson Environment is in beta release. See GitHub.)

Get Started | View on GitHub

Overview


Bridging the Gap between Real World and Simulation

"You shouldn't play video games all day, and neither should your AI." Gibson is a virtual environment built with real-world complexity, using RGBD scans of real spaces.

Perception and Action

"We must perceive in order to move, but we must also move in order to perceive." --- James J. Gibson

Gibson enables training AI algorithms that couple perception and action.

Get Started

Use our open-source platform to explore active perception. If you find it interesting, we encourage you to read about how we made it.

Gibson Platform

Gibson Environment for real-world perception learning. The platform is open-sourced on GitHub. Check it out, deploy it, and start training perceptual agents.
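To give a feel for what training in Gibson looks like, below is a minimal sketch of a gym-style interaction loop. The `HuskyNavigateEnv` class, its import path, and the config file path are assumptions for illustration only; consult the GitHub README for the actual entry points and configuration.

```python
# Minimal sketch of a gym-style interaction loop with a Gibson-like
# environment. The import path, env class, and config path below are
# illustrative assumptions; see the Gibson GitHub README for the real API.
from gibson.envs.husky_env import HuskyNavigateEnv  # assumed import path

env = HuskyNavigateEnv(config='examples/configs/husky_navigate.yaml')  # assumed config

obs = env.reset()
for _ in range(1000):
    action = env.action_space.sample()           # random policy as a placeholder
    obs, reward, done, info = env.step(action)   # obs carries rendered RGB(-D) frames
    if done:
        obs = env.reset()
env.close()
```

In practice the random action sampler would be replaced by a learned policy consuming the rendered observations.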

Model Database

Up to 1000 RGBD-scanned models of indoor scenes, collected with RGBD cameras. State-of-the-art model complexity.

How We Did it

Learn about our deep view synthesis pipeline, physics integration, and experimental results in the Gibson Environment.

Domain Adaptation

We propose a domain adaptation mechanism called "Goggles", which ensures that results obtained in Gibson can transfer to the real world.
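Conceptually, Goggles adapts at deployment time: rather than making the renderer perfectly photo-realistic, a learned image-to-image network maps real camera frames into Gibson's rendering domain, so a trained policy sees the distribution it was trained on. The sketch below illustrates this idea only; the network, weights file, and `perceive` helper are placeholders, not the released implementation.

```python
# Conceptual sketch of the "Goggles" idea: translate real camera frames
# into the rendering domain before the policy consumes them. The network
# and weights below are illustrative placeholders, not the released code.
import torch

goggles_net = torch.jit.load('goggles.pt')  # assumed: a pretrained image-to-image model
goggles_net.eval()

def perceive(real_frame: torch.Tensor) -> torch.Tensor:
    """Map a real RGB frame (1x3xHxW, values in [0, 1]) into the rendering domain."""
    with torch.no_grad():
        return goggles_net(real_frame)

# A deployed agent would then act on the adapted observation:
# action = policy(perceive(camera_frame))
```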

Team

Amir Zamir

Stanford, UC Berkeley

Fei Xia

Stanford

Zhiyang He

Stanford

Sasha Sax

Stanford

Jitendra Malik

UC Berkeley

Silvio Savarese

Stanford

Paper

Gibson Env: Real-World Perception for Embodied Agents
CVPR 2018. [Spotlight]
F. Xia*, A. R. Zamir*, Z. He*, S. Sax, J. Malik, S. Savarese.
(*equal contribution)
[Paper] [Supplementary] [Code] [Bibtex]