PolyGNN: Polyhedron-based graph neural network for 3D building reconstruction from point clouds

Zhaiyu Chen, Yilei Shi, Liangliang Nan, Zhitong Xiong, Xiao Xiang Zhu

Research output: Contribution to journal › Article › peer-review

Abstract

We present PolyGNN, a polyhedron-based graph neural network for 3D building reconstruction from point clouds. PolyGNN learns to assemble primitives obtained by polyhedral decomposition via graph node classification, achieving a watertight and compact reconstruction. To effectively represent arbitrary-shaped polyhedra in the neural network, we propose a skeleton-based sampling strategy to generate polyhedron-wise queries. These queries are then combined with inter-polyhedron adjacency to enhance the classification. PolyGNN is end-to-end optimizable and is designed to accommodate variable-size input points, polyhedra, and queries with an index-driven batching technique. To address the abstraction gap between existing city-building models and the underlying instances, and to provide a fair evaluation of the proposed method, we develop our method on a large-scale synthetic dataset with well-defined ground truths of polyhedral labels. We further conduct a transferability analysis across cities and on real-world point clouds. Both qualitative and quantitative results demonstrate the effectiveness of our method, particularly its efficiency for large-scale reconstructions. The source code and data are available at https://github.com/chenzhaiyu/polygnn.
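The abstract describes the core idea: each candidate polyhedron from the decomposition becomes a graph node, and the network classifies it as interior or exterior using polyhedron-wise query features together with inter-polyhedron adjacency, so the selected polyhedra assemble into the building surface. The sketch below is a minimal, hypothetical illustration of that node-classification step in plain PyTorch; the module name, feature dimensions, and mean-aggregation message passing are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not PolyGNN itself): classify each polyhedron
# (graph node) as interior/exterior from pooled query features and an
# inter-polyhedron adjacency matrix. Dimensions and names are assumed.
import torch
import torch.nn as nn


class PolyhedronClassifier(nn.Module):
    def __init__(self, in_dim=64, hidden=128):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.message = nn.Linear(hidden, hidden)   # transform neighbor features
        self.classify = nn.Linear(2 * hidden, 1)   # interior-vs-exterior logit

    def forward(self, node_feats, adj):
        # node_feats: (N, in_dim) pooled features of polyhedron-wise queries
        # adj: (N, N) binary inter-polyhedron adjacency matrix
        h = self.encode(node_feats)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = (adj @ self.message(h)) / deg      # mean aggregation over neighbors
        return self.classify(torch.cat([h, neigh], dim=-1)).squeeze(-1)


# Toy usage: a positive logit marks a polyhedron as part of the building interior.
model = PolyhedronClassifier()
logits = model(torch.randn(10, 64), (torch.rand(10, 10) > 0.7).float())
```

In the paper's pipeline, the interior/exterior labels decide which decomposed polyhedra are kept, and their union yields the watertight, compact reconstruction described in the abstract.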

Original language: English
Pages (from-to): 693-706
Number of pages: 14
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 218
State: Published - Dec 2024

Keywords

  • 3D reconstruction
  • Building model
  • Graph neural network
  • Point cloud
  • Polyhedron
