Abstract
In augmented reality applications, consistent illumination between virtual and real objects is important for creating an immersive user experience. Consistent illumination can be achieved by parameterising the virtual illumination model so that it matches real-world lighting conditions. In this study, we developed a method to reconstruct the general light direction from red-green-blue (RGB) images of real-world scenes using a modified VGG-16 neural network. We reconstructed the general light direction as azimuth and elevation angles. To avoid inaccurate results caused by coordinate uncertainty at steep elevation angles, we further introduced stereographically projected coordinates. Unlike recent deep-learning-based approaches for reconstructing the light source direction, our approach does not require depth information and thus does not rely on special red-green-blue-depth (RGB-D) images as input.
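The abstract does not spell out the stereographic parameterisation, but the underlying idea can be sketched as follows: near the zenith (elevation close to 90°) the azimuth angle becomes ill-defined, whereas a stereographic projection of the unit light-direction vector onto the equatorial plane stays smooth there. The function names and the choice of projection pole below are assumptions for illustration, not the paper's exact formulation.

```python
import math

def angles_to_stereo(azimuth, elevation):
    """Map azimuth/elevation (radians) to stereographic plane coordinates.

    The unit light-direction vector is projected from the pole opposite
    the zenith onto the plane z = 0, so directions near 90 deg elevation
    map smoothly to points near the origin instead of having an
    ill-defined azimuth. (Illustrative convention, not the paper's.)
    """
    x = math.cos(elevation) * math.cos(azimuth)
    y = math.cos(elevation) * math.sin(azimuth)
    z = math.sin(elevation)
    # Project from the south pole (0, 0, -1) onto the equatorial plane.
    return x / (1.0 + z), y / (1.0 + z)

def stereo_to_angles(u, v):
    """Inverse mapping from plane coordinates back to azimuth/elevation."""
    r2 = u * u + v * v
    z = (1.0 - r2) / (1.0 + r2)
    x = 2.0 * u / (1.0 + r2)
    y = 2.0 * v / (1.0 + r2)
    return math.atan2(y, x), math.asin(z)
```

Round-tripping an arbitrary direction recovers the original angles, and the zenith maps to the origin of the plane, which is why such coordinates behave better than raw angles as a regression target at steep elevations.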
Original language | English |
---|---|
Title | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 - Proceedings |
Editors | Vaclav Skala |
Publisher | Vaclav Skala - Union Agency |
Pages | 31-40 |
Number of pages | 10 |
Volume | 3101 |
ISBN (electronic) | 9788086943343 |
DOIs | |
Publication status | Published - 2021 |
Event | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 - Plzen, Czech Republic Duration: 17 May 2021 → 20 May 2021 |
Conference
Conference | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 |
---|---|
Country/Territory | Czech Republic
City | Plzen
Period | 17/05/21 → 20/05/21