
Publication

Towards a Framework for Whole Body Interaction with Geospatial Data

Florian Daiber; Johannes Schöning; Antonio Krüger
In: David England (Ed.). Whole Body Interaction. Pages 197-207, Human-Computer Interaction Series, ISBN 978-0-85729-433-3, Springer London, 2011.

Abstract

For some 6,000 years, humans have used maps to navigate through space and to solve other spatial tasks. For most of that time, maps were drawn or printed on a piece of paper (or on material such as stone or papyrus) of a certain size. Nowadays maps can be displayed on a wide range of electronic devices, ranging from small-screen mobile devices to highly interactive large multi-touch screens. Thanks to widely available computing power, Geographic Information Systems (GIS) allow a rich set of operations on spatial data. However, most GIS require a high degree of expertise from their users, making them difficult for laypeople to operate. In this work we discuss the possibilities of navigating maps using physical (whole body) gestures to easily perform typical basic spatial tasks within GIS (e.g. pan, zoom and selection operations). We studied multi-modal interaction with large- and mid-scale displays using multi-touch, foot and gaze input. We are interested in understanding how non-expert users interact with such multi-touch surfaces. Therefore, we provide a categorization and a framework of multi-touch hand gestures for interacting with GIS. The combination of multi-touch gestures with a small set of foot gestures for solving geospatial tasks leads to an extended framework for multi-touch and foot input. In an additional step this framework is extended again with eye gaze input.
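To make the basic spatial operations mentioned in the abstract (pan and zoom via multi-touch) concrete, the following is a minimal illustrative sketch of how such gestures are commonly mapped onto a map view using standard browser touch events. The `MapView` interface and its `panBy`/`zoomBy` methods, as well as `attachTouchGestures`, are assumptions for illustration only and are not taken from the publication's framework.

```typescript
// Illustrative sketch: one-finger drag = pan, two-finger pinch = zoom.
// `MapView`, `panBy`, and `zoomBy` are hypothetical interfaces, not APIs from the paper.

interface MapView {
  panBy(dx: number, dy: number): void;   // shift the visible map extent in pixels
  zoomBy(factor: number): void;          // scale the current zoom level multiplicatively
}

function attachTouchGestures(element: HTMLElement, map: MapView): void {
  let lastX = 0, lastY = 0;   // last single-touch position (for panning)
  let lastDist = 0;           // last distance between two touches (for zooming)

  const distance = (a: Touch, b: Touch): number =>
    Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);

  element.addEventListener("touchstart", (e: TouchEvent) => {
    if (e.touches.length === 1) {
      lastX = e.touches[0].clientX;
      lastY = e.touches[0].clientY;
    } else if (e.touches.length === 2) {
      lastDist = distance(e.touches[0], e.touches[1]);
    }
  });

  element.addEventListener("touchmove", (e: TouchEvent) => {
    e.preventDefault();
    if (e.touches.length === 1) {
      // One finger: translate the map by the finger's displacement.
      const t = e.touches[0];
      map.panBy(t.clientX - lastX, t.clientY - lastY);
      lastX = t.clientX;
      lastY = t.clientY;
    } else if (e.touches.length === 2) {
      // Two fingers: zoom by the ratio of the current to the previous pinch distance.
      const d = distance(e.touches[0], e.touches[1]);
      if (lastDist > 0) {
        map.zoomBy(d / lastDist);
      }
      lastDist = d;
    }
  });
}
```

This kind of mapping is the baseline that the paper's framework extends with foot and gaze input; the sketch covers only the touch channel.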
