User-Centric Active Learning through Immersive Visualization
Rida Saghir; Saleem Ahmad; László Kopácsi; Thiago Gouvea; Daniel Sonntag
In: 2026 IEEE Conference on Virtual Reality and 3D User Interfaces (VR-2026). IEEE, 2026.
Abstract
Understanding data structure is critical for effective human involvement in machine learning, particularly in user-centric active learning, where annotation decisions by humans shape model behaviour. Learned embedding spaces capture this structure, but are typically explored through 2D interfaces. Virtual Reality (VR) offers an opportunity to present these embeddings immersively, supporting richer perception and interaction. We present a VR-based system for user-centric active learning that enables users to explore and annotate samples directly within an immersive embedding space. The system visualizes a continuously updating 3D embedding space derived from an audio classifier and augments it with common active learning cues, including uncertainty, density, diversity, and class coverage, through visual encodings. User annotations trigger iterative retraining, allowing the model and visualization to co-evolve in response to human input.
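The abstract does not specify how the active learning cues are computed; a minimal sketch of two common choices, predictive entropy for uncertainty and k-nearest-neighbour distance for density, is shown below. The function names and the plain-Python representation of embedding points are illustrative assumptions, not the paper's implementation.

```python
import math

def uncertainty(probs):
    """Predictive entropy of a class-probability vector.

    Higher values indicate the classifier is less certain about the
    sample, a standard uncertainty cue in active learning.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

def density(point, embedding, k=3):
    """Inverse mean distance to the k nearest neighbours in the embedding.

    Higher values mean the sample sits in a denser region. Points at
    zero distance (the sample itself, or exact duplicates) are skipped
    for simplicity in this sketch.
    """
    dists = sorted(
        d for d in (math.dist(point, other) for other in embedding) if d > 0
    )
    return 1.0 / (sum(dists[:k]) / k + 1e-9)

# Illustrative use: a tight cluster near the origin plus one outlier.
points = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0), (5.0, 5.0, 5.0)]
cluster_density = density(points[0], points, k=2)
outlier_density = density(points[3], points, k=2)
```

In a system like the one described, such per-sample scores would then be mapped to visual encodings (e.g. colour or size) in the 3D scene, and recomputed after each retraining round.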
