TY - GEN
T1 - Semantics-Controlled Gaussian Splatting for Outdoor Scene Reconstruction and Rendering in Virtual Reality
AU - Schieber, Hannah
AU - Young, Jacob
AU - Langlotz, Tobias
AU - Zollmann, Stefanie
AU - Roth, Daniel
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Advancements in 3D rendering like Gaussian Splatting (GS) allow novel view synthesis and real-time rendering in virtual reality (VR). However, GS-created 3D environments are often difficult to edit. For scene enhancement or to incorporate 3D assets, segmenting Gaussians by class is essential. Existing segmentation approaches are typically limited to certain types of scenes, e.g., "circular" scenes, to determine clear object boundaries. However, this method is ineffective when removing large objects in non-"circling" scenes such as large outdoor scenes. We propose Semantics-Controlled GS (SCGS), a segmentation-driven GS approach, enabling the separation of large scene parts in uncontrolled, natural environments. SCGS allows scene editing and the extraction of scene parts for VR. Additionally, we introduce a challenging outdoor dataset, overcoming the "circling" setup. We outperform the state-of-the-art in visual quality on our dataset and in segmentation quality on the 3D-OVS dataset. We conducted an exploratory user study, comparing a 360-video, plain GS, and SCGS in VR with a fixed viewpoint. In our subsequent main study, users were allowed to move freely, evaluating plain GS and SCGS. Our main study results show that participants clearly prefer SCGS over plain GS. Overall, we present an innovative approach that surpasses the state-of-the-art both technically and in user experience.
AB - Advancements in 3D rendering like Gaussian Splatting (GS) allow novel view synthesis and real-time rendering in virtual reality (VR). However, GS-created 3D environments are often difficult to edit. For scene enhancement or to incorporate 3D assets, segmenting Gaussians by class is essential. Existing segmentation approaches are typically limited to certain types of scenes, e.g., "circular" scenes, to determine clear object boundaries. However, this method is ineffective when removing large objects in non-"circling" scenes such as large outdoor scenes. We propose Semantics-Controlled GS (SCGS), a segmentation-driven GS approach, enabling the separation of large scene parts in uncontrolled, natural environments. SCGS allows scene editing and the extraction of scene parts for VR. Additionally, we introduce a challenging outdoor dataset, overcoming the "circling" setup. We outperform the state-of-the-art in visual quality on our dataset and in segmentation quality on the 3D-OVS dataset. We conducted an exploratory user study, comparing a 360-video, plain GS, and SCGS in VR with a fixed viewpoint. In our subsequent main study, users were allowed to move freely, evaluating plain GS and SCGS. Our main study results show that participants clearly prefer SCGS over plain GS. Overall, we present an innovative approach that surpasses the state-of-the-art both technically and in user experience.
KW - Gaussian Splatting
KW - Novel View Synthesis
KW - Semantic Gaussian Splatting
KW - Virtual Reality
UR - http://www.scopus.com/inward/record.url?scp=105002719907&partnerID=8YFLogxK
U2 - 10.1109/VR59515.2025.00056
DO - 10.1109/VR59515.2025.00056
M3 - Conference contribution
AN - SCOPUS:105002719907
T3 - Proceedings - 2025 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2025
SP - 318
EP - 328
BT - Proceedings - 2025 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2025
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 32nd IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2025
Y2 - 8 March 2025 through 12 March 2025
ER -