Fast depth map compression and meshing with compressed tritree

Michel Sarkis, Waqar Zia, Klaus Diepold

Publication: Contribution to book/report/conference proceedings › Conference contribution › Peer-reviewed

18 citations (Scopus)

Abstract

In this paper, we propose a new method based on binary space partitioning to simultaneously mesh and compress a depth map. The method adaptively divides the map into a mesh that has the form of a binary triangular tree (tritree). The nodes of the mesh are sparse, non-uniform samples of the depth map that interpolate the remaining pixels with minimal error. We then apply differential coding to represent the sparse disparities at the mesh nodes, followed by entropy coding to compress the encoded disparities. Finally, we exploit the binary tree structure and compress the mesh via binary tree coding to condense its representation. The results obtained on various depth images show that the proposed scheme achieves a lower depth error rate at higher compression ratios than standard compression techniques such as JPEG 2000. Moreover, with our method a depth map is represented by a compressed adaptive mesh that can be applied directly to render the 3D scene.
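The two coding stages mentioned in the abstract (differential coding of the sparse node disparities, then entropy coding) can be illustrated with a minimal sketch. The disparity values, function names, and the Shannon-entropy estimate below are illustrative assumptions, not the authors' implementation:

```python
from collections import Counter
from math import log2

def delta_encode(values):
    """Differential coding: keep the first value, then store successive differences."""
    return values[:1] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas):
    """Invert the differential coding by cumulative summation."""
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out

def shannon_entropy(symbols):
    """Empirical bits per symbol; lower means an entropy coder compresses better."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical disparities at mesh nodes of a smooth surface (illustrative values).
disparities = [100, 102, 104, 105, 106, 108, 110, 111]
deltas = delta_encode(disparities)

assert delta_decode(deltas) == disparities  # differencing is lossless
# Neighbouring nodes have similar disparities, so the deltas concentrate on a
# few small symbols and the empirical entropy drops well below that of the raw values.
print(shannon_entropy(disparities), ">", shannon_entropy(deltas))
```

In the paper's pipeline these stages operate on the tritree node samples; the sketch only shows why differencing smooth depth data before entropy coding shrinks the effective symbol alphabet.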

Original language: English
Title: Computer Vision, ACCV 2009 - 9th Asian Conference on Computer Vision, Revised Selected Papers
Pages: 44-55
Number of pages: 12
Edition: PART 2
DOIs
Publication status: Published - 2010
Event: 9th Asian Conference on Computer Vision, ACCV 2009 - Xi'an, China
Duration: 23 Sept. 2009 - 27 Sept. 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 5995 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 9th Asian Conference on Computer Vision, ACCV 2009
Country/Territory: China
City: Xi'an
Period: 23/09/09 - 27/09/09
