Listing by author "Stellmach, Sophie"
1 - 4 of 4
- Conference paper: "Digitale Stift- und Papierinteraktion in Virtuellen Umgebungen" (Mensch & Computer 2010: Interaktive Kulturen, 2010) Stellmach, Sophie; Brücher, Thomas; Franke, Ronny; Dachselt, Raimund
  Interaction with digital pens and paper is a familiar and inexpensive medium for working with computer systems. The accustomed handling of pen and paper also supports natural interaction in virtual worlds: navigation and system control in virtual three-dimensional environments can be performed via paper-based palettes. For this purpose, we present several prototypes for these tasks based on the Anoto technology. Simple haptic aids such as guiding elements and cutouts were used so that users do not necessarily have to look at the palette in order to interact with it. A qualitative user study confirmed the benefit of such aids, but also revealed room for improvement in navigating virtual environments via paper-based interfaces.
- Workshop paper: "Gaze-supported Interaction" (Mensch & Computer 2012 – Workshopband: interaktiv informiert – allgegenwärtig und allumfassend!?, 2012) Stellmach, Sophie; Dachselt, Raimund
  Considering the increasing diversity of display arrangements, including wall-sized screens and multi-display setups, eye gaze offers particularly high potential for implicit, seamless, and fast interaction. However, gaze-based interaction is often regarded as error-prone and unnatural, especially when input is restricted to gaze as a single modality. We have therefore developed several interaction techniques that benefit from gaze as an additional, implicit, and fast pointing modality for roughly indicating a user's visual attention, combined with common smartphones for more explicit and precise specifications. In our demos, we showcase two examples of more natural yet effective ways of incorporating a user's gaze as a supporting input modality: (1) gaze-supported pan-and-zoom techniques, using Google Earth as an example, and (2) gaze-supported navigation and target selection in a virtual 3D scene.
- Conference paper: "Investigating Freehand Pan and Zoom" (Mensch & Computer 2012: interaktiv informiert – allgegenwärtig und allumfassend!?, 2012) Stellmach, Sophie; Jüttner, Markus; Nywelt, Christian; Schneider, Jens; Dachselt, Raimund
  Low-cost, flexible tracking systems for hand and body movements are increasingly available. This calls for more thorough investigations of natural and efficient physical interaction styles that take the particular limitations of such systems into account, such as the limited ability to track individual fingers. To contribute to this, we describe an investigation of basic hand gestures for exploring large information spaces. A set of four pan-and-zoom alternatives using two-handed gestural controls was implemented and compared, using Google Earth as an example. We conducted a small-scale formative user study with nine participants to assess users' acceptance of these freehand gestures and their suitability for pan-and-zoom operations. A simple forward and backward hand movement for zooming and a joystick metaphor for panning yielded the best overall results. The seamless integration of continuous panning and zooming in particular was positively highlighted by participants.
- Conference paper: "LAIF: A Logging and Interaction Framework for Gaze-Based Interfaces in Virtual Entertainment Environments" (Mensch & Computer 2010 Entertainment Interfaces Track, 2010) Nacke, Lennart; Stellmach, Sophie; Sasse, Dennis; Niesenhaus, Jörg; Dachselt, Raimund
  Eye tracking is a fascinating technology that is beginning to be used both for evaluating and for interacting in virtual environments. Digital games in particular can benefit from an integrated approach that harnesses eye-tracking technology for both analysis and interaction; such benefits include faster development of innovative games that can be evaluated automatically in an iterative fashion. To this end, we present a framework that enables rapid game development and gameplay analysis within an experimental research environment. The framework is extensible for different kinds of logging (e.g., psychophysiological and in-game behavioral data) and facilitates studies using eye-tracking technology in digital entertainment environments. An experimental study using gaze-only interaction in a digital game is also presented, highlighting the framework's capacity to create and evaluate novel entertainment interfaces.