Listing of i-com Volume 19 (2020), Issue 2, by title
- Journal article: appRaiseVR – An Evaluation Framework for Immersive Experiences (i-com: Vol. 19, No. 2, 2020). Wienrich, Carolin; Gramlich, Johanna. VR is evolving into an everyday technology. Across all its diverse application areas, it is essential to understand the user's condition to ensure a safe, pleasant, and meaningful VR experience. However, VR experience evaluation is still in its infancy. The present paper takes up this research desideratum by conflating diverse expertise and learnings about experience evaluation in general, and VR experiences in particular, into a systematic evaluation framework (appRaiseVR).
Method. To capture diverse expertise, we conducted two focus groups (bottom-up approach) with experts working in different fields of experience evaluation (e.g., movie experiences, theatre experiences). First, we clustered the results of both focus groups. Then, we conflated those results with the learnings about experience evaluation stemming from the field of user experience into the final framework (top-down approach).
Results. The framework comprises five steps providing high-level guidance through the VR evaluation process. The first three steps support the definition of the experience and the evaluation conditions (setting, level, plausibility). The last two steps guide the selection of an appropriate time course and measurement tools.
Conclusion. appRaiseVR offers high-level guidance for evaluators with different expertise and contexts. Establishing similar evaluation procedures might ultimately contribute to safe, pleasant, and meaningful VR experiences.
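The five steps described above lend themselves to a simple checklist. The following is a minimal, hypothetical Python sketch of such an evaluation plan; the field names and example values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of a five-step appRaiseVR-style evaluation plan.
# Field names and example values are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    setting: str = ""       # Step 1: define the evaluation setting
    level: str = ""         # Step 2: define the level of the experience
    plausibility: str = ""  # Step 3: define the plausibility conditions
    time_course: str = ""   # Step 4: choose when to measure
    measures: list[str] = field(default_factory=list)  # Step 5: choose tools

    def is_complete(self) -> bool:
        """True once every step of the plan has been specified."""
        return all([self.setting, self.level, self.plausibility,
                    self.time_course, self.measures])

plan = EvaluationPlan(
    setting="lab study with a standalone HMD",
    level="single training scenario",
    plausibility="realistic scene, no narrative framing",
    time_course="questionnaire after exposure, physiological data during",
    measures=["presence questionnaire", "heart rate"],
)
assert plan.is_complete()
```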
- Journal article: Communicating Robotic Help Requests: Effects of Eye-Expressions, LED-Lights and Polite Language (i-com: Vol. 19, No. 2, 2020). Westhoven, Martin; Grinten, Tim van der. In this paper, we report results from a web- and video-based study on the perception of a request for help from a robot head. Colored lights, eye-expressions, and the politeness of the language used were varied. We measured effects on expression identification, hedonic user experience, perceived politeness, and help intention. Additionally, sociodemographic data, a ‘face blindness’ questionnaire, and negative attitudes towards robots were collected to control for possible influences on the dependent variables. A total of n = 139 participants were included in the analysis. In this paper, the focus is placed on interaction effects and on the influence of covariates. Significant effects on help intention were found for the interaction of LED lighting and eye-expressions and for the interaction of language and eye-expressions. Expression identification is significantly influenced by the interaction of LED lighting and eye-expressions. Several significant effects of the covariates were found, both directly and in interaction with the independent variables. In particular, negative attitudes towards robots significantly influence help intention and perceived politeness. The results provide information on the effects of different design choices for help-requesting robots.
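Interaction effects like those reported above (e.g., LED lighting × eye-expressions on help intention) are the kind of result a factorial ANOVA yields. As a hedged illustration, here is a statsmodels sketch on made-up data; the column names, factor levels, and model are assumptions for demonstration, not the authors' analysis pipeline.

```python
# Illustrative ANOVA for a factorial design like the one described above.
# Data, column names, and factor levels are made up for demonstration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "led": rng.choice(["off", "colored"], n),
    "eyes": rng.choice(["neutral", "pleading"], n),
    "language": rng.choice(["plain", "polite"], n),
    "help_intention": rng.normal(4.0, 1.0, n),  # e.g., 7-point rating
})

# Model the two interactions the abstract highlights:
model = smf.ols("help_intention ~ C(led) * C(eyes) + C(language) * C(eyes)",
                data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```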
- Journal article: How Can I Grab That?: Solving Issues of Interaction in VR by Choosing Suitable Selection and Manipulation Techniques (i-com: Vol. 19, No. 2, 2020). Weise, Matthias; Zender, Raphael; Lucke, Ulrike. The selection and manipulation of objects in Virtual Reality confront application developers with a substantial challenge, as they need to ensure seamless interaction in three-dimensional space. Assessing the advantages and disadvantages of selection and manipulation techniques in specific scenarios, with regard to usability and user experience, is a mandatory task in finding suitable forms of interaction. In this article, we look at the most common issues arising in the interaction with objects in VR. We present a taxonomy that allows the classification of techniques along multiple dimensions. The issues are then associated with these dimensions. Furthermore, we analyze the results of a study comparing multiple selection techniques and present a tool that allows developers of VR applications to search for appropriate selection and manipulation techniques and to get scenario-dependent suggestions based on the data of the conducted study.
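The search tool described above can be pictured as a filter over a table of techniques classified along taxonomy dimensions. A minimal sketch follows, with invented techniques, dimensions, and ratings standing in for the paper's actual taxonomy and study data.

```python
# Toy lookup over a technique taxonomy. Techniques, dimensions, and
# ratings are placeholders, not the taxonomy or study data from the paper.
TECHNIQUES = {
    "ray-casting":  {"range": "far",  "precision": "low"},
    "virtual hand": {"range": "near", "precision": "high"},
    "go-go":        {"range": "far",  "precision": "medium"},
}

def suggest(**requirements: str) -> list[str]:
    """Return techniques whose classification matches every requirement."""
    return [name for name, dims in TECHNIQUES.items()
            if all(dims.get(k) == v for k, v in requirements.items())]

print(suggest(range="far"))  # -> ['ray-casting', 'go-go']
```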
- Journal article: Investigating the Relationship Between Emotion Recognition Software and Usability Metrics (i-com: Vol. 19, No. 2, 2020). Schmidt, Thomas; Schlindwein, Miriam; Lichtner, Katharina; Wolff, Christian. Due to progress in affective computing, various forms of general-purpose sentiment/emotion recognition software have become available. However, such tools are rarely employed in usability engineering (UE) to measure the emotional state of participants. We investigate whether the application of sentiment/emotion recognition software is beneficial for gathering objective and intuitive data that can predict usability similarly to traditional usability metrics. We present the results of a UE project examining this question for the three modalities text, speech, and face. We performed a large-scale usability test (N = 125) with a counterbalanced within-subject design and two websites of varying usability. We identified a weak but significant correlation between text-based sentiment analysis of the text acquired via thinking aloud and SUS scores, as well as a weak positive correlation between the proportion of neutrality in users' voices and SUS scores. However, for the majority of the output of the emotion recognition software, we could not find any significant results. Emotion metrics could not successfully differentiate between the two websites of varying usability, and regression models, whether unimodal or multimodal, could not predict usability metrics. We discuss reasons for these results and how to continue this research with more sophisticated methods.
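The reported text-sentiment/SUS relationship is a plain correlation test. A minimal sketch of that statistic with placeholder numbers (not the study's data or its sentiment pipeline):

```python
# Correlating per-participant sentiment scores with SUS scores, as a
# stand-in for the analysis described above. Numbers are placeholders.
import numpy as np
from scipy.stats import pearsonr

sus = np.array([72.5, 55.0, 80.0, 47.5, 65.0])          # SUS, 0-100
sentiment = np.array([0.31, -0.10, 0.45, -0.25, 0.12])  # mean text polarity

r, p = pearsonr(sentiment, sus)
print(f"r = {r:.2f}, p = {p:.3f}")
# Weak correlations like those reported need a large N to reach significance.
```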
- Journal article: Mixed Reality based Collaboration for Design Processes (i-com: Vol. 19, No. 2, 2020). Hube, Natalie; Müller, Mathias; Lapczyna, Esther; Wojdziak, Jan. Due to constantly and rapidly growing digitization, the requirements for international cooperation are changing. Tools for collaborative work, such as video telephony, are already an integral part of today's cross-company communication. However, these tools are not sufficient to convey the full physical presence of an employee, or of a product and its components, at another location: representing information in only two dimensions limits communication and loses the concreteness of the object. Thus, we present a novel object-centered approach that comprises Augmented and Virtual Reality technology as well as design suggestions for remote collaboration. Furthermore, we identify current key areas for future research and specify a design space for the use of Augmented and Virtual Reality remote collaboration in the manufacturing process in the automotive industry.
- Journal article: New Digital Realities – Blending our Reality with Virtuality (i-com: Vol. 19, No. 2, 2020). Steinicke, Frank; Wolf, Katrin. New digital reality, a spectrum of technologies and experiences that digitally simulate and extend reality in one way or another across different human senses, has received considerable attention in recent years. In particular, we have witnessed great advances in mixed reality (MR) technologies, such as Virtual Reality (VR) and Augmented Reality (AR), which provide enormous potential for application domains like training, simulation, education, entertainment, health, and sports. Other forms of digitally enhanced reality (XR) also support novel forms of immersion and experience while generating, visualizing, and interacting with digital content, either displayed in fully immersive virtual environments or superimposed onto our view of the real world, and will significantly change the way we work, travel, play, and communicate. Consequently, we face dramatic changes in interactive media creation, access, and perception. In this special issue, we solicit work that addresses novel interaction design, interfaces, and implementations of new digital reality in which our reality is blended with virtuality, with a focus on users' needs, joy, and visions.
- Journal article: The Shared View Paradigm in Asymmetric Virtual Reality Setups (i-com: Vol. 19, No. 2, 2020). Horst, Robin; Klonowski, Fabio; Rau, Linda; Dörner, Ralf. Asymmetric Virtual Reality (VR) applications are a substantial subclass of multi-user VR in which not all participants have the same interaction possibilities with the virtual scene. While one user might be immersed via a VR head-mounted display (HMD), another user might experience the VR through a common desktop PC. In an educational scenario, for example, learners can use immersive VR technology to inform themselves at different exhibits within a virtual scene. Educators can use a desktop PC setup to follow and guide learners through the virtual exhibits while still being able to pay attention to safety aspects in the real world (e.g., preventing learners from bumping into a wall). In such scenarios, educators must ensure that learners have explored the entire scene and have been informed about all virtual exhibits in it. Suitable visualization techniques can support educators and facilitate conducting such VR-enhanced lessons. One common technique is to render the learners' view on the 2D screen available to the educators. We refer to this solution as the shared view paradigm. However, this straightforward visualization involves challenges. For example, educators have no control over the scene, and collaboration in the learning scenario can be tedious. In this paper, we differentiate between two classes of visualizations that can help educators in asymmetric VR setups. First, we investigate five techniques that visualize the view direction or field of view of users (view visualizations) within virtual environments. Second, we propose three techniques that can support educators in understanding which parts of the scene learners have already explored (exploration visualizations). In a user study, we show that our participants preferred a volume-based rendering and a view-in-view overlay solution for view visualizations. Furthermore, we show that our participants tended to use combinations of different view visualizations.
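The exploration visualizations described above presuppose some bookkeeping about what a learner has already seen. A minimal, hypothetical sketch of that bookkeeping follows; the paper's techniques concern how such state is visualized, and all names here are invented for illustration.

```python
# Hypothetical bookkeeping behind an exploration visualization: record
# which exhibits the learner has viewed so unexplored ones can be
# highlighted on the educator's desktop screen. Names are invented.
EXHIBITS = {"engine", "gearbox", "chassis", "cockpit"}
explored: set[str] = set()

def on_exhibit_viewed(exhibit_id: str) -> None:
    """Call when the learner's view ray or gaze hits an exhibit."""
    if exhibit_id in EXHIBITS:
        explored.add(exhibit_id)

on_exhibit_viewed("engine")
on_exhibit_viewed("cockpit")
remaining = sorted(EXHIBITS - explored)
print(f"{len(explored)}/{len(EXHIBITS)} exhibits explored; remaining: {remaining}")
```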