Show simple item record

dc.contributor.author: Stellmach, Sophie (de_DE)
dc.contributor.author: Dachselt, Raimund (de_DE)
dc.contributor.editor: Reiterer, Harald
dc.contributor.editor: Deussen, Oliver
dc.date.accessioned: 2017-11-22T14:58:48Z
dc.date.available: 2017-11-22T14:58:48Z
dc.date.issued: 2012
dc.identifier.isbn: 978-3-486-71990-1
dc.identifier.uri: http://dl.gi.de/handle/20.500.12116/7723
dc.description.abstract: Considering the increasing diversity of display arrangements, including wall-sized screens and multi-display setups, our eye gaze offers particularly high potential for implicit, seamless, and fast interaction. However, gaze-based interaction is often regarded as error-prone and unnatural, especially when the input is restricted to gaze as a single modality. For this reason, we have developed several interaction techniques that benefit from gaze as an additional, implicit, and fast pointing modality for roughly indicating a user's visual attention, combined with common smartphones for more explicit and precise specifications. In our demos, we showcase two examples of more natural yet effective ways of incorporating a user's gaze as a supporting input modality. The two application scenarios comprise (1) gaze-supported pan-and-zoom techniques using the example of Google Earth and (2) gaze-supported navigation and target selection in a virtual 3D scene. (de_DE)
dc.language.iso: en (de_DE)
dc.publisher: Oldenbourg Verlag
dc.relation.ispartof: Mensch & Computer 2012 – Workshopband: interaktiv informiert – allgegenwärtig und allumfassend!?
dc.subject: gaze input (de_DE)
dc.subject: eye tracking (de_DE)
dc.subject: multimodal interaction (de_DE)
dc.subject: gaze-supported interaction (de_DE)
dc.subject: remote interaction (de_DE)
dc.title: Gaze-supported Interaction (de_DE)
dc.type: sysdemo (de_DE)
dc.pubPlace: München
mci.document.quality: digidoc
mci.reference.pages: 489-492 (de_DE)
mci.conference.sessiontitle: inter|aktion Demosession (de_DE)

