Fast depth map compression and meshing with compressed tritree

Michel Sarkis, Waqar Zia, Klaus Diepold

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

18 Scopus citations

Abstract

We propose in this paper a new method based on binary space partitions to simultaneously mesh and compress a depth map. The method adaptively divides the map into a mesh that has the form of a binary triangular tree (tritree). The nodes of the mesh are sparse, non-uniform samples of the depth map that interpolate the remaining pixels with minimal error. We then apply differential coding to represent the sparse disparities at the mesh nodes and compress the encoded disparities with entropy coding. Finally, we exploit the binary tree structure and compress the mesh itself via binary tree coding to condense its representation. The results obtained on various depth images show that the proposed scheme yields a lower depth error rate at higher compression ratios than standard compression techniques such as JPEG 2000. Moreover, with our method a depth map is represented as a compressed adaptive mesh that can be directly used to render the 3D scene.
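
The pipeline outlined in the abstract (adaptive tritree subdivision, differential coding of the node disparities, followed by entropy coding and binary tree coding of the mesh structure) can be illustrated with a short sketch. The Python code below is not the authors' implementation: the split criterion (maximum planar interpolation error against a threshold `tau`), the helper names, and the example parameters are illustrative assumptions, and the entropy-coding stage is only indicated in comments.

```python
# Minimal sketch of a tritree-style depth map meshing/compression pipeline,
# under the assumptions stated in the lead-in (not the paper's exact method).

import numpy as np

def plane_error(depth, tri):
    """Max |error| when the triangle interior is predicted by the plane
    through its three vertex depths."""
    (r0, c0), (r1, c1), (r2, c2) = tri
    A = np.array([[r0, c0, 1.0], [r1, c1, 1.0], [r2, c2, 1.0]])
    z = np.array([depth[r0, c0], depth[r1, c1], depth[r2, c2]], float)
    try:
        a, b, d = np.linalg.solve(A, z)           # plane z = a*r + b*c + d
    except np.linalg.LinAlgError:
        return 0.0                                 # degenerate triangle
    rmin, rmax = min(r0, r1, r2), max(r0, r1, r2)
    cmin, cmax = min(c0, c1, c2), max(c0, c1, c2)
    rr, cc = np.mgrid[rmin:rmax + 1, cmin:cmax + 1]
    # Sign test: keep only pixels inside (or on the border of) the triangle.
    d1 = (rr - r1) * (c0 - c1) - (r0 - r1) * (cc - c1)
    d2 = (rr - r2) * (c1 - c2) - (r1 - r2) * (cc - c2)
    d3 = (rr - r0) * (c2 - c0) - (r2 - r0) * (cc - c0)
    inside = ~(((d1 < 0) | (d2 < 0) | (d3 < 0)) & ((d1 > 0) | (d2 > 0) | (d3 > 0)))
    pred = a * rr + b * cc + d
    return float(np.max(np.abs(depth[rmin:rmax + 1, cmin:cmax + 1] - pred)[inside]))

def split(tri):
    """Bisect the triangle at the midpoint of its longest edge."""
    p = list(tri)
    edges = [(0, 1), (1, 2), (2, 0)]
    i, j = max(edges, key=lambda e: (p[e[0]][0] - p[e[1]][0]) ** 2 +
                                    (p[e[0]][1] - p[e[1]][1]) ** 2)
    k = 3 - i - j                                  # vertex opposite the longest edge
    mid = ((p[i][0] + p[j][0]) // 2, (p[i][1] + p[j][1]) // 2)
    return (p[i], mid, p[k]), (mid, p[j], p[k])

def build_tritree(depth, tri, tau, min_edge=2):
    """Return (bits, leaves): 1 = split, 0 = leaf, in depth-first order.
    The bit string is what binary tree coding would later compress."""
    (r0, c0), (r1, c1), (r2, c2) = tri
    longest = max((r0 - r1) ** 2 + (c0 - c1) ** 2,
                  (r1 - r2) ** 2 + (c1 - c2) ** 2,
                  (r2 - r0) ** 2 + (c2 - c0) ** 2)
    if longest <= min_edge ** 2 or plane_error(depth, tri) <= tau:
        return [0], [tri]
    bits, leaves = [1], []
    for child in split(tri):
        b, l = build_tritree(depth, child, tau, min_edge)
        bits += b
        leaves += l
    return bits, leaves

def delta_code(depth, leaves):
    """Differentially code the vertex disparities in traversal order; an
    entropy coder (e.g. arithmetic or Huffman) would compress the residuals."""
    seen, vals = set(), []
    for tri in leaves:
        for v in tri:
            if v not in seen:
                seen.add(v)
                vals.append(int(depth[v]))
    vals = np.asarray(vals)
    return np.concatenate(([vals[0]], np.diff(vals)))

# Example: two root triangles covering a (2^n + 1)-sized map, split along one diagonal.
depth = np.random.randint(0, 256, (65, 65)).astype(float)   # stand-in depth map
H, W = depth.shape
roots = [((0, 0), (0, W - 1), (H - 1, 0)),
         ((H - 1, W - 1), (H - 1, 0), (0, W - 1))]
bits, leaves = [], []
for root in roots:
    b, l = build_tritree(depth, root, tau=4.0)
    bits += b
    leaves += l
residuals = delta_code(depth, leaves)
```

Decoding would mirror these steps: the binary tree bits reproduce the subdivision, a cumulative sum over the residuals restores the vertex disparities, and planar interpolation over each leaf triangle fills in the remaining pixels.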

Original language: English
Title of host publication: Computer Vision, ACCV 2009 - 9th Asian Conference on Computer Vision, Revised Selected Papers
Pages: 44-55
Number of pages: 12
Edition: PART 2
DOIs
State: Published - 2010
Event: 9th Asian Conference on Computer Vision, ACCV 2009 - Xi'an, China
Duration: 23 Sep 2009 - 27 Sep 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 5995 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 9th Asian Conference on Computer Vision, ACCV 2009
Country/Territory: China
City: Xi'an
Period: 23/09/09 - 27/09/09
