Mensch und Computer 2015
Gemeinsam - Arbeit - Erleben
September 6–9, 2015, in Stuttgart
Mensch und Computer 2015 listing by keyword "3D User Interface"
Results 1–2 of 2
- Workshop paper: 3D User Interfaces for Interactive Annotation of Vascular Structures (Mensch und Computer 2015 – Proceedings, 2015). Saalfeld, Patrick; Glaßer, Sylvia; Preim, Bernhard.
  Many scientific documents convey additional information through annotations, which can also be used in interactive 3D environments. There, labels are usually placed on an image plane and do not exploit the 3D domain to convey, e.g., depth cues. These cues are especially important for spatially complex structures such as the Circle of Willis, the central part of the cerebral vessel system. We present an approach for diegetic annotations, i.e., labels that are part of the 3D world. Given the large space of possible label positions and orientations, we enable the user to interactively create and position labels. For this, we present a concept for a 3D user interface (3D UI) setup in a semi-immersive and a fully immersive environment. Furthermore, we describe a study design for evaluating the different setups.
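The distinction the abstract draws between image-plane labels and diegetic (world-space) labels can be illustrated with a minimal sketch. The perspective-projection model below is a generic illustration of why world-space labels provide a depth cue, not code from the paper:

```python
def on_screen_size(world_size, depth, focal_length=1.0):
    """On-screen extent of a diegetic (world-space) label under a simple
    pinhole perspective projection: it shrinks with depth, which is exactly
    the depth cue a fixed-size image-plane label cannot provide."""
    return world_size * focal_length / depth

# The same 10 cm label appears larger near the camera than far away,
# whereas an image-plane label renders at constant size regardless of depth.
near_size = on_screen_size(0.10, depth=2.0)
far_size = on_screen_size(0.10, depth=8.0)
```

In a real 3D UI the label quad would additionally be billboarded toward the camera, but the size-over-depth relation above is the core of the cue.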
- Conference paper: Natural 3D Interaction Techniques for Locomotion with Modular Robots (Mensch und Computer 2015 – Proceedings, 2015). Krupke, Dennis; Lubos, Paul; Bruder, Gerd; Zhang, Jianwei; Steinicke, Frank.
  Defining 3D movements of modular robots is a challenging task, usually addressed with computationally expensive algorithms that aim to create self-propelling locomotion. So far, only a few user interfaces allow a user to interact naturally with a modular robot in real time. In this paper, we present two approaches for baseline research on 3D user interfaces for intuitive manipulation of the 3D movements of a modular chain-like robot, within an iterative design process. Our comparative evaluation shows that the techniques provide intuitive human-robot interaction via remote control for real-time guidance of modular robots through rough terrain and past obstacles. In particular, the results show that steering a robot's locomotion via rotational hand movements outperforms translational hand movements for challenging locomotion tasks. We discuss the results and present lessons learned for steering user interfaces for modular robots.
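One plausible way a rotational hand gesture could steer a chain robot's locomotion is to map hand roll to the amplitude of a traveling-wave gait. The sketch below is a hypothetical illustration of that idea; the `hand_roll` input and the sinusoidal gait are assumptions, not the authors' implementation:

```python
import math

def gait_joint_angles(t, hand_roll, n_joints=8, max_amp=math.radians(40)):
    """Map a hand roll angle (radians, e.g. from a tracked controller) to the
    amplitude of a traveling sine wave driving a chain robot's joints.
    Rolling the hand further bends the robot more; rolling back straightens it."""
    # Clamp the input to [-90 deg, +90 deg] and scale to the maximum joint amplitude.
    gain = max(-1.0, min(1.0, hand_roll / (math.pi / 2)))
    amp = max_amp * gain
    # Phase-shift successive joints so the wave travels along the robot's body.
    return [amp * math.sin(2.0 * math.pi * t - i * math.pi / 2)
            for i in range(n_joints)]
```

A continuous mapping like this gives the operator direct, real-time control over locomotion intensity, in contrast to precomputed self-propelling gaits.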