Joint Deep Multi-Graph Matching and 3D Geometry Learning from Inhomogeneous 2D Image Collections

Zhenzhang Ye, Tarun Yenamandra, Florian Bernard, Daniel Cremers

Publication: Contribution to book/report/conference proceedings › Conference paper › Peer-reviewed

4 citations (Scopus)

Abstract

Graph matching aims to establish correspondences between vertices of graphs such that both the node and edge attributes agree. Various learning-based methods have recently been proposed for finding correspondences between image key points based on deep graph matching formulations. While these approaches mainly focus on learning node and edge attributes, they completely ignore the 3D geometry of the underlying 3D objects depicted in the 2D images. We fill this gap by proposing a trainable framework that takes advantage of graph neural networks for learning a deformable 3D geometry model from inhomogeneous image collections, i.e., a set of images that depict different instances of objects from the same category. Experimentally, we demonstrate that our method outperforms recent learning-based approaches for graph matching in terms of both accuracy and cycle-consistency error, while additionally recovering the underlying 3D geometry of the objects depicted in the 2D images.
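To illustrate the correspondence problem the abstract refers to (not the paper's method, which additionally learns attributes and 3D geometry with graph neural networks): in its simplest form, matching reduces to a linear assignment over node-attribute similarities. The sketch below, using SciPy's Hungarian-algorithm solver, recovers a ground-truth permutation from node features alone; full graph matching would also score edge-attribute agreement, which makes the problem a quadratic assignment.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_nodes(feat_a, feat_b):
    """Match vertices of two graphs by node-attribute similarity.

    feat_a, feat_b: (n, d) arrays of node features.
    Returns perm such that node i of graph A corresponds to
    node perm[i] of graph B.
    """
    # Assignment cost = negative similarity (dot product),
    # so minimizing cost maximizes total similarity.
    cost = -feat_a @ feat_b.T
    _, col = linear_sum_assignment(cost)
    return col

# Toy example: graph B is graph A with its nodes permuted.
feat_a = np.eye(5)                     # 5 nodes, distinctive features
perm_true = np.array([2, 0, 4, 1, 3])  # ground-truth correspondence
feat_b = np.empty_like(feat_a)
feat_b[perm_true] = feat_a             # node i of A becomes node perm_true[i] of B

pred = match_nodes(feat_a, feat_b)
print(pred)  # → [2 0 4 1 3]
```

Cycle consistency, mentioned in the abstract, is the requirement that for three or more graphs the pairwise matchings compose: matching A→B followed by B→C should equal the direct matching A→C. Independent pairwise solvers do not guarantee this, which is why multi-graph methods measure a cycle-consistency error.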

Original language: English
Title: AAAI-22 Technical Tracks 3
Publisher: Association for the Advancement of Artificial Intelligence
Pages: 3125-3133
Number of pages: 9
ISBN (electronic): 1577358767, 9781577358763
Publication status: Published - 30 June 2022
Event: 36th AAAI Conference on Artificial Intelligence, AAAI 2022 - Virtual, Online
Duration: 22 Feb 2022 - 1 March 2022

Publication series

Name: Proceedings of the 36th AAAI Conference on Artificial Intelligence, AAAI 2022
Volume: 36

Conference

Conference: 36th AAAI Conference on Artificial Intelligence, AAAI 2022
Location: Virtual, Online
Period: 22/02/22 - 1/03/22
