Abstract
In augmented reality applications, consistent illumination between virtual and real objects is important for creating an immersive user experience. It can be achieved by parameterising the virtual illumination model so that it matches the real-world lighting conditions. In this study, we developed a method to reconstruct the general light direction from red-green-blue (RGB) images of real-world scenes using a modified VGG-16 neural network. We reconstructed the general light direction as azimuth and elevation angles. To avoid inaccurate results caused by the coordinate uncertainty that occurs at steep elevation angles, we further introduced stereographically projected coordinates. Unlike recent deep-learning-based approaches to reconstructing the light source direction, our approach does not require depth information and thus does not rely on special red-green-blue-depth (RGB-D) images as input.
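The two key ideas are sketched below: a stereographic projection that maps an (azimuth, elevation) light direction to planar coordinates, and a VGG-16 backbone whose classifier is swapped for a two-value regression head. This is a minimal illustration under assumed conventions (z-up hemisphere, projection from the south pole, PyTorch/torchvision), not the authors' implementation.

```python
import math

import torch.nn as nn
from torchvision import models

def stereographic_project(azimuth, elevation):
    """Map an (azimuth, elevation) light direction to planar (u, v).

    Angles are in radians on a z-up unit hemisphere; the projection is
    taken from the south pole (0, 0, -1) onto the equatorial plane. At
    elevation = pi/2, every azimuth maps to the same point (0, 0), which
    removes the azimuth ambiguity at steep elevations.
    """
    x = math.cos(elevation) * math.cos(azimuth)
    y = math.cos(elevation) * math.sin(azimuth)
    z = math.sin(elevation)
    return x / (1.0 + z), y / (1.0 + z)

def make_light_direction_model():
    """VGG-16 with its classifier replaced by a 2-value regression head.

    The two outputs regress the projected coordinates (u, v); the exact
    head layout here is an assumption.
    """
    net = models.vgg16(weights=None)
    net.classifier = nn.Sequential(
        nn.Linear(512 * 7 * 7, 512),
        nn.ReLU(inplace=True),
        nn.Linear(512, 2),  # (u, v)
    )
    return net
```

Regressing (u, v) instead of the raw angles sidesteps both the 0/2π wrap-around of the azimuth and its growing instability as the elevation approaches 90°, since the projection remains continuous there.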
Original language | English |
---|---|
Title of host publication | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 - Proceedings |
Editors | Vaclav Skala |
Publisher | Vaclav Skala - Union Agency |
Pages | 31-40 |
Number of pages | 10 |
Volume | 3101 |
ISBN (Electronic) | 9788086943343 |
DOIs | |
State | Published - 2021 |
Event | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 - Plzen, Czech Republic |
Duration | 17 May 2021 → 20 May 2021 |
Conference
Conference | 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, WSCG 2021 |
---|---|
Country/Territory | Czech Republic |
City | Plzen |
Period | 17/05/21 → 20/05/21 |
Keywords
- Deep learning
- Direction
- Estimation
- Light
- RGB
- Reconstruction
- Source