Open Access
Guided optimization framework for the fusion of time-of-flight with stereo depth
Abstract

Fusing actively acquired depth with passively estimated depth has proven to be an effective strategy for improving depth quality. The combination lets the two modalities complement each other. To fuse the two sensors' data into a more accurate depth map, the limitations of active sensing, such as low lateral resolution, must be accounted for when combining it with a passive depth map. We present an approach for accurately fusing active time-of-flight depth with passive stereo depth. We propose a multimodal sensor fusion strategy based on a weighted energy optimization problem, where the weights are generated by combining edge information from a texture map and from the active and passive depth maps. An objective evaluation of our fusion algorithm shows that the generated depth map is more accurate than the depth map of each single modality and than the results of other fusion methods. A visual comparison additionally shows better recovery at edges where passive stereo estimates incorrect depth values. Moreover, a left-right consistency check on the result demonstrates that our approach fuses the sensors consistently.
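To illustrate the general idea of weighted energy optimization for depth fusion, the following is a minimal NumPy sketch, not the paper's actual method: it solves a per-pixel energy with two weighted data terms (time-of-flight and stereo) plus a uniform 4-neighbor smoothness term via Jacobi iteration. The guided, edge-derived weights described in the abstract are replaced here by user-supplied weight maps, and the smoothness strength `lam` is an illustrative parameter.

```python
import numpy as np

def neighbor_sum(d):
    # Sum of the 4 neighbors, with replicated borders
    # (border pixels count themselves for missing neighbors).
    p = np.pad(d, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]

def fuse_depth(d_tof, d_stereo, w_tof, w_stereo, lam=0.1, iters=200, eps=1e-8):
    """Minimize, per pixel i with neighbors j:
        w_tof[i]*(d[i]-d_tof[i])**2 + w_stereo[i]*(d[i]-d_stereo[i])**2
        + lam * sum_j (d[i]-d[j])**2
    via Jacobi iterations on the normal equations."""
    # Weighted average of the two data terms as initialization.
    d = (w_tof * d_tof + w_stereo * d_stereo) / (w_tof + w_stereo + eps)
    for _ in range(iters):
        # Jacobi update:
        # (w_tof + w_stereo + 4*lam) * d = w_tof*d_tof + w_stereo*d_stereo
        #                                  + lam * neighbor_sum(d)
        d = (w_tof * d_tof + w_stereo * d_stereo + lam * neighbor_sum(d)) / (
            w_tof + w_stereo + 4.0 * lam + eps
        )
    return d
```

In the paper's setting the weight maps would be derived from texture and depth edges so that the more reliable modality dominates near discontinuities; here any nonnegative arrays of the image shape can be passed in.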

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Faezeh Sadat Zakeri, Mårten Sjöström, and Joachim Keinert "Guided optimization framework for the fusion of time-of-flight with stereo depth," Journal of Electronic Imaging 29(5), 053016 (22 October 2020). https://doi.org/10.1117/1.JEI.29.5.053016
Received: 12 May 2020; Accepted: 22 September 2020; Published: 22 October 2020
CITATIONS
Cited by 4 scholarly publications.
KEYWORDS
Volume rendering, Fusion energy, Sensors, Cameras, Reliability, Stereoscopic cameras, Visualization

