Poster

GenSDF: Two-Stage Learning of Generalizable Signed Distance Functions

Gene Chou · Ilya Chugunov · Felix Heide

Hall J #109

Keywords: [ Generalization ] [ Implicit Neural Representations ] [ Meta-Learning ] [ Signed Distance Functions ] [ 3D Object Reconstruction ]

Thu 1 Dec 2 p.m. PST — 4 p.m. PST
 
Spotlight presentation: Lightning Talks 6A-3
Thu 8 Dec 6 p.m. PST — 6:15 p.m. PST

Abstract:

We investigate the generalization capabilities of neural signed distance functions (SDFs) for learning 3D object representations from unseen and unlabeled point clouds. Existing methods can fit SDFs to a handful of object classes and boast fine detail or fast inference speeds, but do not generalize well to unseen shapes. We introduce a two-stage semi-supervised meta-learning approach that transfers shape priors from labeled to unlabeled data to reconstruct unseen object categories. The first stage uses an episodic training scheme to simulate training on unlabeled data and meta-learns initial shape priors. The second stage then introduces unlabeled data with disjoint classes in a semi-supervised scheme to diversify these priors and achieve generalization. We assess our method on both synthetic data and real, collected point clouds. Experimental results and analysis validate that our approach outperforms existing neural SDF methods and is capable of robust zero-shot inference on 100+ unseen classes. Code can be found at https://github.com/princeton-computational-imaging/gensdf.
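To make the two-stage recipe concrete, below is a minimal, self-contained sketch of the training structure the abstract describes: an episodic (Reptile-style) meta-learning stage on labeled shapes, followed by adaptation to an unlabeled shape using only surface samples (where the SDF is zero). This is an illustrative toy, not the GenSDF implementation: shapes are 2D circles with analytic SDFs, the network is replaced by a linear model on hand-picked features, and all names (`phi`, `sgd_steps`, `w_meta`) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(p):
    """Toy feature map for 2D query points: [distance to origin, 1].
    For a circle of radius r, the true SDF is linear in these features."""
    return np.stack([np.linalg.norm(p, axis=-1), np.ones(len(p))], axis=-1)

def sgd_steps(w, X, y, lr=0.1, steps=20):
    """A few plain least-squares SGD steps on SDF regression."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Stage 1: episodic meta-learning on *labeled* shapes (circles with known SDF).
# Each episode adapts a copy of the meta-weights to one shape, then the
# meta-weights move toward the adapted weights (Reptile-style outer update).
w_meta = np.zeros(2)
labeled_radii = rng.uniform(0.5, 2.0, size=50)
for episode in range(200):
    r = rng.choice(labeled_radii)
    p = rng.uniform(-3, 3, size=(64, 2))
    X, y = phi(p), np.linalg.norm(p, axis=-1) - r   # analytic SDF targets
    w_task = sgd_steps(w_meta.copy(), X, y)
    w_meta += 0.2 * (w_task - w_meta)               # outer meta-update

# Stage 2 (sketch): an *unlabeled* shape provides only surface samples.
# The self-supervised constraint is sdf(surface point) = 0; the meta-learned
# prior supplies the rest of the field.
r_unseen = 1.3
theta = rng.uniform(0, 2 * np.pi, size=64)
surface = r_unseen * np.stack([np.cos(theta), np.sin(theta)], axis=-1)
w_adapt = sgd_steps(w_meta.copy(), phi(surface), np.zeros(len(surface)))

# Evaluate the adapted SDF on random queries against the analytic SDF.
q = rng.uniform(-3, 3, size=(256, 2))
err = np.abs(phi(q) @ w_adapt - (np.linalg.norm(q, axis=-1) - r_unseen)).mean()
print(f"mean |sdf error| on unseen shape: {err:.3f}")
```

The key point the sketch shows is the division of labor: supervised episodes bake a prior into the initialization, so that the zero-level-set constraint alone is enough to recover an unseen shape's SDF at inference time.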
