Box2Poly: Memory-Efficient Polygon Prediction of Arbitrarily Shaped and Rotated Text

Xuyang Chen, Dong Wang, Konrad Schindler, Mingwei Sun, Yongliang Wang, Nicolo Savioli, Liqiu Meng

Publication: Journal contribution › Conference article › Peer-reviewed

Abstract

Recently, Transformer-based text detection techniques have sought to predict polygons by encoding the coordinates of individual boundary vertices with distinct query features. However, this approach incurs significant memory overhead and struggles to capture the intricate relationships between vertices belonging to the same instance. Consequently, irregular text layouts often lead to the prediction of outlying vertices, diminishing the quality of the results. To address these challenges, we present an innovative approach rooted in Sparse R-CNN: a cascade decoding pipeline for polygon prediction. Our method ensures precision by iteratively refining polygon predictions, conditioning each stage on both the scale and location of the preceding result. Leveraging this stabilized regression pipeline, even a single feature vector guiding polygon instance regression yields promising detection results. At the same time, the use of instance-level feature proposals substantially improves memory efficiency (> 50% less vs. the SOTA method DPText-DETR) and reduces inference time (> 40% less vs. DPText-DETR) with comparable performance on benchmarks. The code is available at https://github.com/Albertchen98/Box2Poly.git.
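The box-to-polygon cascade described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the box-corner initialization, and the scale-relative offset rule are illustrative assumptions about how "refining predictions relative to the scale and location of preceding results" might look.

```python
import numpy as np

def refine_polygon(polygon, offsets):
    """One refinement stage (hypothetical update rule).

    polygon: (V, 2) current vertex coordinates.
    offsets: (V, 2) per-vertex deltas from a regression head, expressed
             relative to the current polygon's width/height, so each stage
             refines the preceding prediction at its own scale.
    """
    scale = polygon.max(axis=0) - polygon.min(axis=0)  # (width, height)
    return polygon + offsets * scale

def cascade_decode(init_box, stage_offsets):
    """Decode a polygon from an initial box through several cascade stages.

    init_box: (4,) [x0, y0, x1, y1] instance-level box proposal.
    stage_offsets: list of (V, 2) offset arrays, one per decoding stage.
    """
    x0, y0, x1, y1 = init_box
    # Start from the box corners as a coarse 4-vertex polygon.
    poly = np.array([[x0, y0], [x1, y0], [x1, y1], [x0, y1]], dtype=float)
    for offsets in stage_offsets:
        poly = refine_polygon(poly, offsets)
    return poly
```

In this sketch, zero offsets leave the polygon unchanged, so later stages only perturb the previous estimate; this is one way iterative refinement can be kept stable even when a single instance-level feature vector drives the regression.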

Original language: English
Pages (from-to): 1219-1227
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 38
Issue number: 2
DOIs
Publication status: Published - 25 March 2024
Event: 38th AAAI Conference on Artificial Intelligence, AAAI 2024 - Vancouver, Canada
Duration: 20 Feb 2024 - 27 Feb 2024
