Indoor Segmentation and Support Inference from RGBD Images
ECCV 2012
Nathan Silberman,
Pushmeet Kohli,
Derek Hoiem,
Rob Fergus
[PDF][Bib][PPT]
Abstract
We present an approach to interpret the major surfaces,
objects, and support relations of an indoor scene from an RGBD image.
Most existing work ignores physical interactions or is applied only to tidy
rooms and hallways. Our goal is to parse typical, often messy, indoor scenes
into floor, walls, supporting surfaces, and object regions, and to recover
support relationships. One of our main interests is to better understand how
3D cues can best inform a structured 3D interpretation. We also contribute a
novel integer programming formulation to infer physical support relations.
We offer a new dataset of 1449 RGBD images, capturing 464 diverse indoor scenes,
with detailed annotations. Our experiments demonstrate our ability to infer
support relations in complex scenes and verify that our 3D scene cues and
inferred support lead to better object segmentation.
Note: The code was written and compiled in the following environment: Red Hat Enterprise Linux Server 6.3, MATLAB R2011a, compiled with g++.
Downloads