Listing by keyword "interaction"
1 - 7 of 7
- Conference paper: An Active Tangible Device for Multitouch-Display Interaction (Mensch und Computer 2019 - Tagungsband, 2019). Brauer, Christoph; Ariza, Oscar; Steinicke, Frank. In this article, we introduce an interactive tangible input device (TID) for touchscreens. Our approach complements a passive TID design with active microcontroller-driven features focusing on user-experience aspects. The TID provides battery-powered wireless operation, autonomous position sensing, visual and tactile feedback, as well as multiple touch inputs and momentary buttons. The device can be accurately tracked on capacitive touchscreens, enabling novel interaction techniques for content selection and manipulation in 2D or stereoscopic tabletop environments; mid-air interaction is also supported through an IMU and short-to-mid-range distance sensors. Overall, we present a multi-purpose device that can be built from off-the-shelf components and features seamless firmware integration, Unity3D integration, and a 3D-printable body enclosure.
- Conference paper: A Design Space for User Interface Elements using Finger Orientation Input (Mensch und Computer 2021 - Tagungsband, 2021). Vogelsang, Jonas; Kiss, Francisco; Mayer, Sven. Despite touchscreens being used by billions of people every day, today's touch-based interactions are limited in their expressiveness, as they mostly reduce the rich information of the finger to a single 2D point. Researchers have proposed using finger orientation as input to overcome these limitations, adding two extra dimensions: the finger's pitch and yaw angles. While finger orientation has been studied in depth over the last decade, we describe an updated design space. To this end, we present expert interviews combined with a literature review to describe the wide range of finger orientation input opportunities. First, we present a comprehensive set of user interface elements enhanced by finger orientation input, supported by expert interviews. Second, we extract design implications resulting from the additional input parameters. Finally, we introduce a design space for finger orientation input.
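The two extra dimensions mentioned above can be made concrete with a small data-structure sketch: a touch point extended by pitch and yaw, and one possible mapping of yaw to an on-screen rotation. This is an illustrative assumption, not code or an API from the paper; all names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class OrientedTouch:
    """A 2D touch point extended by finger orientation (illustrative)."""
    x: float      # contact position on the screen, in pixels
    y: float
    pitch: float  # finger elevation relative to the surface, in degrees
    yaw: float    # finger rotation around the surface normal, in degrees

def yaw_to_rotation(touch: OrientedTouch) -> float:
    """Example mapping: use yaw to rotate an on-screen object directly,
    leaving the 2D position free for translation."""
    return touch.yaw % 360.0

t = OrientedTouch(x=250.0, y=130.0, pitch=45.0, yaw=370.0)
rotation = yaw_to_rotation(t)  # 10.0
```

Separating orientation from position like this is what lets a single finger control four parameters at once instead of two.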
- muc: long paper (talks): Dynamic Gaussian Force Field Controlled Kalman Filtering For Pointing Interaction (Mensch & Computer 2013: Interaktive Vielfalt, 2013). van de Camp, Florian; Stiefelhagen, Rainer. As human-computer interaction extends from the desk to the whole room, modalities that allow for distant interaction become more important. Distant interaction, however, is inherently inaccurate. Assistive technologies such as force fields, sticky targets, and target expansion have been shown to improve pointing tasks. We present a new variant of force fields modeled with Gaussian distributions, which makes placement, configuration, and overlap handling straightforward. In addition, the force fields are activated dynamically by predicting targets, allowing for natural and fluent movements. Results from a user study show that the dynamic Gaussian fields can reduce the time needed to click a button with a pointing gesture by up to 60%.
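The abstract does not give the authors' formulas, but the core idea of a Gaussian force field can be sketched: each target attracts the pointer with a strength that falls off as a Gaussian of distance, and overlapping fields simply sum. The function, parameter values, and the final nudge step below are illustrative assumptions, not the paper's implementation.

```python
import math

def gaussian_force(pointer, targets, sigma=40.0, strength=8.0):
    """Sum of attraction vectors toward targets, each weighted by a
    Gaussian of the pointer-target distance (illustrative sketch)."""
    fx, fy = 0.0, 0.0
    px, py = pointer
    for tx, ty in targets:
        dx, dy = tx - px, ty - py
        dist = math.hypot(dx, dy)
        if dist == 0.0:
            continue  # already on target; no pull needed
        # Gaussian falloff: strong near a target, fading smoothly with
        # distance; overlapping fields combine by simple addition.
        w = strength * math.exp(-(dist ** 2) / (2 * sigma ** 2))
        fx += w * dx / dist
        fy += w * dy / dist
    return fx, fy

# Nudge a noisy pointer estimate toward the nearby predicted target;
# the distant target contributes almost nothing.
pointer = (100.0, 100.0)
targets = [(120.0, 100.0), (400.0, 300.0)]
fx, fy = gaussian_force(pointer, targets)
adjusted = (pointer[0] + fx, pointer[1] + fy)
```

Because each field is a smooth Gaussian, overlap handling needs no special cases, which is presumably what makes placement and configuration "straightforward" in the authors' phrasing.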
- Conference paper: Exploring Big Data Landscapes with Elastic Displays (Mensch und Computer 2017 - Workshopband, 2017). Kammer, Dietrich; Keck, Mandy; Müller, Mathias; Gründer, Thomas; Groh, Rainer. In this paper, we propose a concept to help data analysts quickly assess parameters and results of clustering algorithms. Presentation and interaction on a flexible display make it possible to grasp how the algorithms work and to focus on the data itself. Two interaction concepts are presented that demonstrate the strengths of elastic displays: a layer concept that allows recognizing differences between various parameter settings of clustering algorithms, and a Zoomable User Interface that encourages in-depth analysis of clusters.
- muc: long paper (talks): Look without Feel - A Basal Gap in the Multi-Touch Prototyping Process (Mensch & Computer 2013: Interaktive Vielfalt, 2013). Freitag, Georg; Wegner, Michael; Tränkner, Michael; Wacker, Markus. Prototyping a user interface is an important workflow step for establishing the look and feel of an application early in development. We discuss a model for this process and show that it is currently heavily skewed toward the look aspect. This could prove to be a problem when designing highly interactive natural user interfaces, which put a stronger emphasis on the feel of an application. To analyze this gap thoroughly, we compare eight current prototyping tools using a multi-touch application scenario. From this evaluation, we derive requirements for a tool better suited to multi-touch prototyping.
- Workshop paper: Shaping Sounds - A Vision for Tangible Music Interaction (2013). Walther, Sebastian; Müller, Mathias; Brade, Marius; Groh, Rainer. This article describes the idea of composing music on an elastic tabletop with tangible objects embedded in its surface. An elastic display offers several physical shapes representing different music samples for interaction. The shapes can be activated and manipulated by the user to arrange the associated samples. The key benefits of this form of interaction are its intuitive use, the ability to playfully explore music, and the expressiveness of the physical representation of sound.
- Conference paper: Towards a Universal Human-Computer Interaction Model for Multimodal Interactions (Mensch und Computer 2021 - Tagungsband, 2021). Faltaous, Sarah; Gruenefeld, Uwe; Schneegass, Stefan. Models in HCI describe and provide insights into how humans use interactive technology. They are used by engineers, designers, and developers to understand and formalize the interaction process. At the same time, novel interaction paradigms arise constantly, introducing new ways in which interactive technology can support humans. In this work, we examine how these paradigms can be described using the classical HCI model introduced by Schomaker in 1995. We extend this model with new relations that provide a better understanding of these paradigms. To this end, we revisit the existing interaction paradigms and try to describe their interaction using this model. The goal of this work is to highlight the need to adapt the models to new interaction paradigms and to spark discussion in the HCI community on this topic.