Listing by author "Wittmann, Vera"
- Conference paper: Effects of position of real-time translation on AR glasses (Mensch und Computer 2020 - Tagungsband, 2020). Authors: Rzayev, Rufat; Hartl, Sabrina; Wittmann, Vera; Schwind, Valentin; Henze, Niels.
  Abstract: Augmented reality (AR) provides users with contextually relevant multimedia content by overlaying it on real-world objects. However, overlaying virtual content on real-world objects can cause occlusion. Especially in learning use cases, occlusion might hide real-world information that is important for learning gain. Therefore, it is important to understand how virtual content should be positioned relative to the related real-world information without negatively affecting the learning experience. Thus, we conducted a study with 12 participants using AR glasses to investigate the position of virtual content in a vocabulary learning task. Participants learned foreign words shown in their surroundings while viewing translations on AR glasses as an overlay, to the right of, or below the foreign word. We found that showing virtual translations on top of foreign words significantly decreases comprehension and increases users' task load. Insights from our study inform the design of applications for AR glasses supporting vocabulary learning.
- Conference paper: Influence of Annotation Media on Proof-Reading Tasks (Mensch und Computer 2023 - Tagungsband, 2023). Authors: Schmid, Andreas; Sautmann, Marie; Wittmann, Vera; Kaindl, Florian; Schauhuber, Philipp; Gottschalk, Philipp; Wimmer, Raphael.
  Abstract: Annotating and proof-reading documents are common tasks. Digital annotation tools provide easily searchable annotations and facilitate sharing documents and remote collaboration with others. On the other hand, advantages of paper, such as creative freedom and intuitive use, can get lost when annotating digitally. A large body of research indicates that paper outperforms digital annotation tools in task time, error recall, and task load. However, most research in this field is rather old and does not take into consideration the increased screen resolution and performance, as well as the better input techniques, of modern devices. We present three user studies comparing different annotation media in the context of proof-reading tasks. We found that annotating on paper is still faster and less stressful than with a PC or tablet computer, but the difference is significantly smaller with a state-of-the-art device. We did not find a difference in error recall, but the medium used has a strong influence on how users annotate.
- Conference paper: The Influence of Participants’ Personality on Quantitative and Qualitative Metrics in Usability Testing (Mensch und Computer 2019 - Tagungsband, 2019). Authors: Schmidt, Thomas; Wittmann, Vera; Wolff, Christian.
  Abstract: We present the results of a usability study with 35 participants investigating the influence of personality on various metrics used in usability engineering. We conduct a task-based usability test with a website integrating tasks of varying difficulty and measure performance metrics like task completion rate and time on task. We also use standard questionnaire-based usability metrics like the System Usability Scale (SUS). Furthermore, we gather qualitative data via open-ended questions and count the number of words as well as the mentions of positive and negative aspects. We measure personality using the well-known Big Five model, often referred to as the OCEAN model (openness, conscientiousness, extraversion, agreeableness, neuroticism), and three basic needs (need for influence and power, need for recognition and performance, need for security and tranquility). We analyze the relationship between personality and usability metrics via correlations and regression models. We identify multiple significant results and show that, in our study, personality correlated with some of the usability metrics we inspected. Extraversion and the need for influence and power show the most and strongest correlations. Furthermore, we also show that regression models based on personality traits can explain up to 37% of the variance in usability metrics. The results have implications for the improvement of the selection process of usability test participants as well as for the interpretation of test results. We discuss these implications and give an outlook on further research in this area.
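The correlation and regression analysis described in the last abstract above can be illustrated with a minimal sketch. The following Python snippet is not the authors' analysis script; the file name, column names, and the choice of SUS as the target metric are assumptions made purely for illustration. It shows how per-participant personality scores could be correlated with a usability metric and how the explained variance (R²) of a regression model could be estimated.

```python
# Illustrative sketch only (not the study's actual analysis code).
# Assumes a hypothetical CSV with one row per participant containing
# Big Five traits, three basic-need scores, and usability metrics.
import pandas as pd
from scipy import stats
from sklearn.linear_model import LinearRegression

df = pd.read_csv("usability_study.csv")  # hypothetical file name

traits = ["openness", "conscientiousness", "extraversion", "agreeableness",
          "neuroticism", "need_influence", "need_recognition", "need_security"]
metric = "sus_score"  # e.g., System Usability Scale score per participant

# Pearson correlation of each personality score with the chosen usability metric.
for trait in traits:
    r, p = stats.pearsonr(df[trait], df[metric])
    print(f"{trait}: r = {r:.2f}, p = {p:.3f}")

# Linear regression with all personality scores as predictors; R^2 indicates
# how much variance in the usability metric the traits explain together.
model = LinearRegression().fit(df[traits], df[metric])
print(f"R^2 = {model.score(df[traits], df[metric]):.2f}")
```

In the study above, regression models of this general kind explained up to 37% of the variance in some usability metrics.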