Gibson Environment

Real-World Perception for Embodied Agents

(page under construction -- Gibson Environment is in beta release. See GitHub)

Get Started · View on GitHub


Bridging the Gap between

Real World and Simulation

Perception and Action

"You shouldn't play video games all day, so shouldn't your AI." Gibson is a virtual environment built with real world complexity, using RGBD scanning.

"We must perceive in order to move, but we must also move in order to perceive." --- James J. Gibson Gibson enables training AI algorithms that explores both perception and action.

Get Started

Use our open-source platform to explore active perception. Above: (b) physics, (c) RGB, (d) depth, (e) semantics, and (f) surface normals rendered in real time inside (a) a 3D model. If you find it interesting, we encourage reading about how we made it.

Gibson Platform

Gibson Environment for real-world perception learning. We are open-sourced on GitHub. Check it out, deploy it, and start training perceptual agents.

Model Database

Up to 1000 models of indoor scenes, scanned with RGBD cameras. State-of-the-art model complexity.

How We Did it

Learn about our deep view-synthesis pipeline, physics integration, and experimental results in the Gibson Environment.

Transfer to Real World

We propose a synthesis mechanism called "Goggles", which ensures that results in Gibson transfer to the real world.

Check out our database page for the state-of-the-art 3D model dataset.


Amir Zamir

Stanford, UC Berkeley

Fei Xia


Zhiyang He


Sasha Sax


Jitendra Malik

UC Berkeley

Silvio Savarese



Gibson Env: Real-World Perception for Embodied Agents
CVPR 2018. [Spotlight]
F. Xia*, A. R. Zamir*, Z. He*, S. Sax, J. Malik, S. Savarese.
(*equal contribution)
[Paper] [Supplementary] [Code] [Bibtex]