EGST: Enhanced Geometric Structure Transformer for Point Cloud Registration

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics, Vol. 30, No. 9, pp. 6222-6234
Main Authors: Yuan, Yongzhe, Wu, Yue, Fan, Xiaolong, Gong, Maoguo, Ma, Wenping, Miao, Qiguang
Format: Journal Article
Language: English
Published: United States: IEEE, 01-09-2024
Description
Summary: We explore the effect of geometric structure descriptors on extracting reliable correspondences and obtaining accurate alignment in point cloud registration. The point cloud registration task involves estimating a rigid transformation between unorganized point clouds, so it is crucial to capture the contextual features of the geometric structure in a point cloud. Recent coordinates-only methods ignore much of the geometric information in the point cloud, which weakens their ability to express the global context. We propose the Enhanced Geometric Structure Transformer to learn enhanced contextual features of the geometric structure in a point cloud and to model the structure consistency between point clouds for extracting reliable correspondences; it encodes three explicit enhanced geometric structures and provides significant cues for point cloud registration. More importantly, we report empirical results showing that the Enhanced Geometric Structure Transformer can learn meaningful geometric structure features using none of the following: (i) explicit positional embeddings, (ii) additional feature exchange modules such as cross-attention, which simplifies the network structure compared with a plain Transformer. Extensive experiments on a synthetic dataset and on real-world datasets show that our method achieves competitive results.
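The abstract does not spell out the three enhanced geometric structures, so the following is only a minimal PyTorch sketch of the general idea: feeding explicit structure cues (here, assumed to be k-NN distances and neighbor-direction angles, which are invariant to rigid motion) into a self-attention layer, with no positional embeddings and no cross-attention module. All names (GeometricStructureEmbedding, proj_d, proj_a, sigma_d, k) are hypothetical and not taken from the authors' code.

```python
import torch
import torch.nn as nn

class GeometricStructureEmbedding(nn.Module):
    """Hypothetical sketch: embed explicit geometric structure cues
    (k-NN distances and neighbor angles) instead of raw coordinates,
    so the Transformer needs no explicit positional embeddings."""

    def __init__(self, d_model=256, k=8, sigma_d=0.2):
        super().__init__()
        self.k = k
        self.sigma_d = sigma_d                # assumed distance normalizer
        self.proj_d = nn.Linear(1, d_model)   # projects the distance cue
        self.proj_a = nn.Linear(k, d_model)   # projects the angular cue
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)

    def forward(self, pts):                   # pts: (B, N, 3)
        B, N, _ = pts.shape
        d = torch.cdist(pts, pts)             # (B, N, N) pairwise distances
        # distance cue: mean distance to the k nearest neighbors (index 0 is self)
        knn_d, knn_idx = d.topk(self.k + 1, largest=False)
        dist_feat = knn_d[..., 1:].mean(-1, keepdim=True) / self.sigma_d
        # angular cue: cosine of the angle between each neighbor direction
        # and the mean neighbor direction of that point
        nbr = torch.gather(
            pts.unsqueeze(1).expand(B, N, N, 3), 2,
            knn_idx[..., 1:, None].expand(B, N, self.k, 3))
        vec = nbr - pts.unsqueeze(2)                      # (B, N, k, 3)
        ref = vec.mean(2, keepdim=True)                   # (B, N, 1, 3)
        cos = torch.cosine_similarity(vec, ref, dim=-1)   # (B, N, k)
        # fuse the structure cues and apply self-attention only
        # (no positional embeddings, no cross-attention between clouds)
        x = self.proj_d(dist_feat) + self.proj_a(cos)     # (B, N, d_model)
        out, _ = self.attn(x, x, x)
        return out

if __name__ == "__main__":
    pts = torch.randn(2, 1024, 3)             # two clouds of 1024 points
    feats = GeometricStructureEmbedding()(pts)
    print(feats.shape)                         # torch.Size([2, 1024, 256])
```

Because the inputs are relative distances and angles rather than raw coordinates, the features are invariant to rigid motion, which is one plausible reason such a design can drop explicit positional embeddings; the paper's actual descriptors and attention design may differ.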
ISSN: 1077-2626
EISSN: 1941-0506
DOI: 10.1109/TVCG.2023.3329578