Paper: Silhouette-Assisted 3D Object Instance Reconstruction from a Cluttered Scene

June 17th, 2020

The objective of our work is to reconstruct 3D object instances from a single RGB image of a cluttered scene. 3D object instance reconstruction is an ill-posed problem due to heavily occluded and truncated objects, as well as self-occlusions that leave substantial regions unseen.

Previous works on 3D reconstruction take cues from object silhouettes to carve reconstructed outputs. In this paper, we explore two ways to make silhouettes learnable within the network for 3D instance reconstruction from a single image of a cluttered scene.

To this end, in the first approach, we automatically generate instance-specific silhouettes that are compactly encoded within our network design and used to improve the reconstructed 3D shapes; in the second approach, we find an efficient design that uses explicit silhouette projections to regularize object reconstruction.

Experimental results on the SUNCG dataset show that our methods outperform the state of the art.

Figure 4: The overall network architecture of our work. The input is a single RGB image of a cluttered scene. The orange branch performs instance-centered implicit silhouette estimation and encoding; the blue branch performs instance-centered feature extraction; the green branch encodes bounding boxes; the yellow branch performs coarse feature extraction. All of the above features are concatenated into a latent feature space, from which shape and pose predictors estimate the 3D instance shape and pose separately. The gray branch generates explicit silhouette projections. Best viewed in color.
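The caption above describes several feature branches whose outputs are concatenated into one latent vector, which separate shape and pose predictors then read. A minimal sketch of that fusion pattern is below; all feature sizes, branch names, and the linear stand-in heads are illustrative assumptions, not the paper's actual dimensions or layers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative feature sizes for each branch (assumed, not from the paper).
silhouette_feat = rng.standard_normal(64)   # orange: implicit silhouette encoding
instance_feat = rng.standard_normal(128)    # blue: instance-centered features
bbox_feat = rng.standard_normal(16)         # green: bounding-box encoding
coarse_feat = rng.standard_normal(64)       # yellow: coarse features

# Concatenate all branch outputs into a single latent feature vector.
latent = np.concatenate([silhouette_feat, instance_feat, bbox_feat, coarse_feat])

def linear_head(x, out_dim, seed):
    """A stand-in for a learned predictor head (random weights here)."""
    w = np.random.default_rng(seed).standard_normal((out_dim, x.size)) * 0.01
    return w @ x

# Separate predictors read the shared latent code.
shape_logits = linear_head(latent, 32 ** 3, seed=1)  # e.g. a 32^3 voxel grid
pose_params = linear_head(latent, 7, seed=2)         # e.g. rotation, translation, scale

print(latent.shape, shape_logits.shape, pose_params.shape)
```

The key design point the caption conveys is that all branches share one latent space, so both predictors condition on the same fused evidence, including the silhouette encoding.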

Li, Lin; Khan, Salman; Barnes, Nick. Silhouette-Assisted 3D Object Instance Reconstruction from a Cluttered Scene. In: The 2nd Workshop on 3D Reconstruction in the Wild (3DRW2019), in conjunction with the International Conference on Computer Vision; 2019; Seoul, South Korea. pp. 2080-2088.

Download the full paper here.

For more information, contact us.
