
Publication

Gaze-Based Menu Navigation in Virtual Reality: A Comparative Study of Layouts and Interaction Techniques

László Kopácsi; Albert Klimenko; Abdulrahman Mohamed Selim; Michael Barz; Daniel Sonntag
In: Carmelo Ardito; Simone Diniz Junqueira Barbosa; Tayana Conte; André Freire; Isabela Gasparini; Philippe Palanque; Raquel Prates (Eds.). Human-Computer Interaction -- INTERACT 2025. IFIP Conference on Human-Computer Interaction (INTERACT-2025), 20th IFIP TC 13 International Conference, Belo Horizonte, Brazil, September 8–12, 2025, Proceedings, Part I, Page 24, ISBN 978-3-032-04999-5, Springer Nature Switzerland, 9/2025.

Abstract

Integrating eye-tracking technologies into Extended Reality (XR) headsets has enabled intuitive, hands-free system interaction, such as gaze-based menu navigation. However, the literature lacks comprehensive comparisons and consensus on how gaze-based menu navigation is best implemented. This paper presents a comparative analysis of gaze-based menu navigation in virtual environments, focusing on two common menu layouts, pie and list menus, combined with three interaction methods: gaze-based dwell, controller-based, and a multimodal approach combining gaze and controller inputs. We conducted a within-subject study with 19 participants, measuring task completion time, error rate, usability, and user preference for each condition. The results indicate that while the pie layout was significantly faster and less error-prone than the list layout, novice users tend to favour list layouts. Furthermore, we found that users preferred the multimodal interaction method despite its longer task completion times and higher error rates compared to controller-based navigation. Based on our findings, we offer design guidelines and recommendations for implementing gaze-based menu systems.
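To illustrate the gaze-based dwell technique mentioned in the abstract, the sketch below shows one common way dwell selection is typically implemented: fixation time on the currently gazed menu item is accumulated each frame, and a selection fires once a threshold is exceeded. This is not the authors' implementation; the class name, the 0.8 s threshold, and the per-frame update interface are all illustrative assumptions.

```python
# Minimal sketch of gaze-based dwell selection (illustrative, not from the paper).
# Assumption: a per-frame update receives the menu item currently hit by the gaze
# ray (or None) and the frame time in seconds.

from typing import Optional

DWELL_TIME = 0.8  # seconds of continuous fixation required to select (assumed value)


class DwellSelector:
    """Accumulates fixation time on the gazed menu item and fires a selection
    once the dwell threshold is reached."""

    def __init__(self, dwell_time: float = DWELL_TIME) -> None:
        self.dwell_time = dwell_time
        self._current_item: Optional[str] = None
        self._elapsed = 0.0

    def update(self, gazed_item: Optional[str], dt: float) -> Optional[str]:
        """Call once per frame. Returns the selected item id when the dwell
        completes, otherwise None."""
        if gazed_item != self._current_item:
            # Gaze moved to a different item (or off the menu): restart the timer.
            self._current_item = gazed_item
            self._elapsed = 0.0
            return None
        if gazed_item is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_time:
            # Reset so the same item is not re-triggered on the next frame.
            self._elapsed = 0.0
            return gazed_item
        return None


# Usage: feed the selector with per-frame gaze hits (hypothetical item ids).
selector = DwellSelector()
frames = [("open", 0.3), ("open", 0.3), ("open", 0.3), (None, 0.3)]
for item, dt in frames:
    selected = selector.update(item, dt)
    if selected:
        print(f"selected: {selected}")  # fires on the third frame (0.9 s >= 0.8 s)
```

In a multimodal variant of the kind compared in the study, the dwell timer would typically be replaced by an explicit confirmation, e.g. the gazed item is highlighted and a controller button press triggers the selection instead of the timeout.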

Projects