Authors: Stellmach, Sophie; Dachselt, Raimund; Reiterer, Harald; Deussen, Oliver
Date available: 2017-11-22
Date issued: 2012
ISBN: 978-3-486-71990-1
URL: https://dl.gi.de/handle/20.500.12116/7723
Abstract: Considering the increasing diversity of display arrangements, including wall-sized screens and multi-display setups, eye gaze offers particularly high potential for implicit, seamless, and fast interaction. However, gaze-based interaction is often regarded as error-prone and unnatural, especially when input is restricted to gaze as a single modality. For this reason, we have developed several interaction techniques that benefit from gaze as an additional, implicit, and fast pointing modality for roughly indicating a user's visual attention, combined with common smartphones for more explicit and precise specifications. In our demos, we showcase two examples of more natural yet effective ways of incorporating a user's gaze as a supporting input modality. The two application scenarios comprise (1) gaze-supported pan-and-zoom techniques, using the example of Google Earth, and (2) gaze-supported navigation and target selection in a virtual 3D scene.
Language: en
Keywords: gaze input; eye tracking; multimodal interaction; gaze-supported interaction; remote interaction
Title: Gaze-supported Interaction
Type: Text/Workshop Paper
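
Note: The abstract's scenario (1) combines an implicit gaze anchor with explicit smartphone input. The paper itself gives no implementation details, so the following is only a minimal illustrative sketch of that general idea, with hypothetical names (Viewport, zoom_at_gaze) not taken from the source: the gaze point fixes the zoom anchor, while a phone gesture supplies the zoom factor.

    # Minimal sketch (assumption, not the authors' implementation) of
    # gaze-supported zoom: keep the world point under the gaze position
    # fixed on screen while an explicit smartphone gesture sets the rate.
    from dataclasses import dataclass

    @dataclass
    class Viewport:
        cx: float     # world x at viewport center
        cy: float     # world y at viewport center
        scale: float  # world units per screen pixel

    def zoom_at_gaze(view, gaze_px, zoom_factor, width, height):
        """Zoom by zoom_factor (>1 zooms in), anchored at the gaze point."""
        gx, gy = gaze_px
        # World coordinates currently under the gaze point.
        wx = view.cx + (gx - width / 2) * view.scale
        wy = view.cy + (gy - height / 2) * view.scale
        new_scale = view.scale / zoom_factor
        # Recenter so (wx, wy) stays under (gx, gy) after rescaling.
        new_cx = wx - (gx - width / 2) * new_scale
        new_cy = wy - (gy - height / 2) * new_scale
        return Viewport(new_cx, new_cy, new_scale)

    # Example: a pinch on the phone reports a 1.2x zoom-in while the
    # user looks at pixel (600, 200) on an 800x600 display.
    view = Viewport(cx=0.0, cy=0.0, scale=1.0)
    view = zoom_at_gaze(view, gaze_px=(600, 200), zoom_factor=1.2,
                        width=800, height=600)
    print(view)

The design choice this illustrates is the division of labor described in the abstract: gaze gives a fast, rough indication of visual attention (where to zoom), while the handheld device carries the deliberate, precise part of the command (how much to zoom).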