Listing by author "Lubos, Paul"
1 - 4 of 4
- Conference paper: Edutainment & Engagement at Exhibitions: A Case Study of Gamification in the Historic Hammaburg Model (Mensch und Computer 2016 - Tagungsband, 2016)
  Haesler, Steffen; Obernesser, Karen; Raupp, Tino; Jahnke, Christoph; Stapf, Jonathan; Bräker, Julia; Lubos, Paul; Bruder, Gerd; Steinicke, Frank
  Gamification in the context of interactive exhibitions has enormous potential to attract visitors and improve their engagement, flow, and learning. This paper describes a case study in which we use game-design elements for an interactive and collaborative exploration of a virtual exhibition. The goal is to collaboratively explore the possibilities of a multiplayer game using different user interfaces and input devices in the same environment. The case study was conducted using a virtual 3D model of the “Hammaburg”, a medieval castle from the 9th century. The multiplayer exhibition is designed as a two-player game: one player uses a touch table or another touch input device, while the other player uses an immersive head-mounted display (HMD) combined with a game controller to navigate through the virtual environment (VE). Both players can interactively explore the VE while playing a mini-game together. We performed a user study to evaluate the game concepts. The results suggest that communication between the players, both spoken and technologically supported, is challenging and seems especially difficult for the HMD player. Furthermore, the paper proposes a more detailed exploration of other possible settings, focusing on communication between the players. (A minimal illustrative sketch of such an asymmetric two-player setup follows this listing.)
- Workshop paper: HapRing: A Wearable Haptic Device for 3D Interaction (Mensch und Computer 2015 – Proceedings, 2015)
  Ariza Nunez, Oscar Javier; Lubos, Paul; Steinicke, Frank
  Haptic devices can offer good solutions in terms of usability and accuracy for touch feedback in immersive virtual environments (IVEs). However, there are very few affordable devices for natural interaction in 3D space, and some are not well suited to address common issues of ergonomics and stimulus meaningfulness. In this article, we present a wireless haptic ring (HapRing) for spatial interaction that provides vibro-tactile signals as well as vibration cues on a per-finger basis using a haptic actuator. Other features include inertial measurement, digital input, and support for IR camera-based tracking. (An illustrative sketch of distance-based vibration feedback follows this listing.)
- Conference paper: Natural 3D Interaction Techniques for Locomotion with Modular Robots (Mensch und Computer 2015 – Proceedings, 2015)
  Krupke, Dennis; Lubos, Paul; Bruder, Gerd; Zhang, Jianwei; Steinicke, Frank
  Defining 3D movements of modular robots is a challenging task, which is usually addressed with computationally expensive algorithms that aim to create self-propelling locomotion. So far, only a few user interfaces exist that allow a user to interact naturally with a modular robot in real time. In this paper, we present two approaches as baseline research on 3D user interfaces for intuitive manipulation of the 3D movements of a modular chain-like robot within an iterative design process. A comparative evaluation of the techniques shows that they can provide intuitive human-robot interaction via remote control for real-time guidance of modular robots through difficult terrain and past obstacles. In particular, our results show that steering a robot’s locomotion via rotational hand movements has benefits for challenging locomotion tasks compared to translational hand movements. We discuss the results and present lessons learned for steering user interfaces for modular robots. (An illustrative gait-steering sketch follows this listing.)
- Conference paper: The Interactive Spatial Surface - Blended Interaction on a Stereoscopic Multi-Touch Surface (Mensch & Computer 2014 - Workshopband, 2014)
  Lubos, Paul; Garber, Carina; Hoffert, Anjuly; Reis, Ina; Steinicke, Frank
  Current touch technology provides precise and accurate large multi-touch surfaces, which allow multiple users to collaborate in these two-dimensional setups. Furthermore, recent developments in display technology support visualization of three-dimensional (3D) virtual environments (VEs) in high-fidelity visual detail on these surfaces. Finally, cost-efficient depth cameras such as the Microsoft Kinect support affordable hand tracking, which enables interaction above the surface. In this paper, we describe the interactive SPAtial SurfaCE (iSPACE), a system combining classical multi-touch-based interaction with direct mid-air selection and manipulation of stereoscopically projected virtual objects. By utilizing a large state-of-the-art Ultra HD 3D display and a high-performance touch frame, the system offers high-quality collaborative exploration. (An illustrative sketch of touch/mid-air mode selection follows this listing.)
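The Hammaburg entry describes an asymmetric two-player setup in which a touch player and an HMD player share one virtual environment. The snippet below is a minimal, hypothetical sketch of how such shared state could be kept in sync; the names (SharedWorldState, apply_touch_input, apply_hmd_input) are illustrative assumptions and do not come from the paper.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

# Hypothetical shared state for an asymmetric two-player exhibition game:
# one player interacts via a touch table, the other via an HMD with a controller.

@dataclass
class PlayerState:
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # position in the VE
    marker: Optional[Tuple[float, float, float]] = None     # point marked for the partner

@dataclass
class SharedWorldState:
    touch_player: PlayerState = field(default_factory=PlayerState)
    hmd_player: PlayerState = field(default_factory=PlayerState)

    def apply_touch_input(self, tap_position):
        # The touch player marks a location on the table; the HMD player sees it in the VE.
        self.touch_player.marker = tap_position

    def apply_hmd_input(self, new_position):
        # The HMD player moves through the VE using the game controller.
        self.hmd_player.position = new_position

if __name__ == "__main__":
    world = SharedWorldState()
    world.apply_touch_input((12.5, 0.0, 4.2))  # touch player marks a spot on the map
    world.apply_hmd_input((11.0, 1.7, 3.9))    # HMD player moves toward it
    print(world.touch_player.marker, world.hmd_player.position)
```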
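The HapRing abstract mentions vibro-tactile signals and vibration cues but does not describe how they are driven. A common pattern for 3D selection feedback is to scale vibration intensity with the tracked fingertip's distance to a target; the sketch below illustrates only that assumption and is not HapRing's actual control logic.

```python
import math

def vibration_intensity(finger_pos, target_pos, max_distance=0.15):
    """Map fingertip-to-target distance (meters) to a vibration intensity in [0, 1].

    Illustrative mapping, not taken from the paper: intensity rises linearly
    as the tracked fingertip approaches the target and is 0 beyond max_distance.
    """
    distance = math.dist(finger_pos, target_pos)
    if distance >= max_distance:
        return 0.0
    return 1.0 - distance / max_distance

# Example: fingertip 5 cm away from a virtual button.
print(vibration_intensity((0.00, 0.00, 0.00), (0.05, 0.00, 0.00)))  # ~0.67
```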
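For the modular-robot entry, the abstract compares steering locomotion via rotational versus translational hand movements without detailing the mapping. The sketch below assumes a sinusoidal gait generator, a common approach for chain-like robots, whose amplitude and steering offset are modulated by a tracked hand pose; both the gait model and the parameter mapping are assumptions for illustration.

```python
import math

def joint_angles(t, num_modules, amplitude, phase_offset, steering):
    """Illustrative sinusoidal gait for a chain-like modular robot.

    amplitude and steering could, for example, be derived from hand roll and
    yaw in a rotation-based steering interface (an assumption; the abstract
    does not describe the actual mapping). Returns one target angle (radians)
    per module at time t.
    """
    return [
        amplitude * math.sin(2.0 * math.pi * t + i * phase_offset) + steering
        for i in range(num_modules)
    ]

# Example: 8 modules, gait modulated by a hypothetical tracked hand pose.
hand_roll, hand_yaw = 0.4, 0.1  # radians (assumed input from hand tracking)
print(joint_angles(t=0.25, num_modules=8,
                   amplitude=hand_roll, phase_offset=math.pi / 4,
                   steering=hand_yaw))
```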
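The iSPACE entry combines multi-touch input with mid-air interaction above a stereoscopic surface. One plausible way to blend the two is to pick the interaction mode from the tracked hand's height above the surface; the thresholds and function below are hypothetical and only illustrate that idea.

```python
# Hypothetical mode selection for a system like iSPACE: touch input when the
# tracked hand is at the surface, mid-air selection when it hovers above it.
# Threshold values and names are illustrative assumptions.

TOUCH_THRESHOLD_M = 0.02  # hand closer than 2 cm to the surface counts as touch
MIDAIR_MAX_M = 0.60       # height of the interaction volume above the surface

def interaction_mode(hand_height_above_surface):
    """Return 'touch', 'mid-air', or None for a given hand height (meters)."""
    if hand_height_above_surface < TOUCH_THRESHOLD_M:
        return "touch"
    if hand_height_above_surface <= MIDAIR_MAX_M:
        return "mid-air"
    return None  # hand is outside the tracked interaction volume

for h in (0.01, 0.25, 0.80):
    print(h, interaction_mode(h))
```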