Bridging the Gap between
Real World and Simulation
Perception and Action
"You shouldn't play video games all day, so shouldn't your AI." Gibson is a virtual environment built with real world complexity, using RGBD scanning.
"We must perceive in order to move, but we must also move in order to perceive." --- James J. Gibson Gibson enables training AI algorithms that explores both perception and action.
Use our open-source platform to explore active perception. Above: real-time rendering of (b) Physics, (c) RGB, (d) Depth, (e) Semantics, and (f) Surface Normals inside (a) a 3D model. If you find it interesting, we encourage reading about how we made it.
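To give a concrete sense of how an agent consumes these modalities, here is a minimal sketch of stepping a Gibson-style environment through a Gym-like interface and reading the per-frame sensor dictionary. The class name, import path, config file, and observation keys below are assumptions for illustration, not the confirmed API; please consult the repository documentation for the exact interface and for which modalities your config enables.

```python
# Minimal sketch of driving a Gibson-style environment through a Gym-like loop.
# Names below (HuskyNavigateEnv, the YAML path, observation keys) are assumptions
# made for illustration; check the open-source repository for the exact API.
from gibson.envs.husky_env import HuskyNavigateEnv  # assumed import path

env = HuskyNavigateEnv(config='examples/configs/husky_navigate.yaml')  # assumed config
obs = env.reset()

for _ in range(100):
    action = env.action_space.sample()          # random exploration for illustration
    obs, reward, done, info = env.step(action)  # obs: dict of enabled modalities
    rgb = obs.get('rgb_filled')                 # rendered RGB frame (key depends on config)
    depth = obs.get('depth')                    # per-pixel depth map
    if done:
        obs = env.reset()

env.close()
```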
Up to 1000 3D models, scanned from real indoor scenes with an RGBD camera. State-of-the-art model complexity.
Learn about our deep view synthesis pipeline, physics integration, and experimental results in the Gibson Environment.
We propose a synthesis mechanism called "Goggle", which ensures that results in Gibson transfer to the real world.
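As a rough illustration of where such a mechanism sits in the loop, the stand-in below applies a small learned image-to-image correction to a frame so that rendered and real observations land in a shared domain before reaching the agent. This is not the authors' architecture or training procedure, only a generic, hypothetical placeholder for a "Goggle"-style transfer function.

```python
# Conceptual stand-in for a "Goggle"-style corrective transform: a tiny
# image-to-image network applied to frames so rendered and real observations
# share one domain. This is NOT the authors' model; it only marks where such
# a learned transfer function would sit in the perception pipeline.
import torch
import torch.nn as nn

class GoggleStub(nn.Module):
    """Tiny residual image-to-image network (hypothetical placeholder)."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, kernel_size=3, padding=1),
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        # Predict a correction and add it back, keeping image content intact.
        return frame + self.body(frame)

goggle = GoggleStub()
frame = torch.rand(1, 3, 128, 128)   # a rendered (or real) RGB frame, NCHW in [0, 1]
corrected = goggle(frame)            # frame mapped toward the shared domain
policy_input = corrected             # the agent's perception stack consumes this
```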
Check out our database page for the state-of-the-art 3D model dataset.