Listing by subject "Usability testing"
1 - 2 of 2
- Barcamp: AK Barrierefreiheit | Barrierefreie UX 2: Methodik (Mensch und Computer 2024 - Usability Professionals, 2024) González Mellídez, Beatriz; Zimmermann, Gottfried; Ableitner, Tobias; Koch, Sebastian; Vaupel, Oliver; Bursy, Miriam; Wilkens, Leevke; Baumann, Lukas; Heitplatz, Vanessa; Kramschneider, Luisa; Dirks, Susanne; Maikowski, Nico; Sammet, Marcus; Bittenbinder, Sven; Clüver, Claudius; Müller, Claudia; Farnetani, Stefan
  Short presentations with discussion on accessibility in UX, focusing on methodology: accessibility features, UX, and accessibility strategies. The Austauschforum Digitale Barrierefreiheit aims to offer a platform for exchange. 1. Improving digital participation with Android accessibility features. 2. Wizard-of-Oz usability tests for accessibility: introduction and practical trial of a simple method for finding barriers in design mockups and wireframes. 3. Inclusive technology development: evaluating the System Usability Scale in Easy Language (Leichte Sprache) for inclusive usability testing (see the scoring sketch after this entry). 4. Accessible organisational charts for public authorities: HTML instead of PDF. 5. Onboarding for accessible collaboration at work: a safe space for experience and negotiation that fosters the inclusion of new employees. 6. Strategic advantages through maturity models.
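The System Usability Scale mentioned in point 3 is scored on a fixed 0-100 scheme; below is a minimal Python sketch of that standard scoring, assuming the Leichte-Sprache variant keeps the usual ten items with 1-5 Likert responses (the actual adaptation evaluated at the Barcamp may differ).

```python
def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale score (0-100) from ten Likert
    responses (1 = strongly disagree ... 5 = strongly agree).
    Odd-numbered items are positively worded, even-numbered negatively."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a fairly positive response pattern
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```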
- Journal article: Automated interpretation of eye–hand coordination in mobile eye tracking recordings (KI - Künstliche Intelligenz: Vol. 31, No. 4, 2017) Mussgnug, Moritz; Singer, Daniel; Lohmeyer, Quentin; Meboldt, Mirko
  Mobile eye tracking is beneficial for analysing human–machine interactions with tangible products, as it tracks eye movements reliably in natural environments and offers insights into human behaviour and the associated cognitive processes. However, current methods require manual screening of the video footage, which is time-consuming and subjective. This work aims to automatically detect cognitively demanding phases in mobile eye tracking recordings. The presented approach combines the user's perception (gaze) and action (hand) to isolate demanding interactions based on a multi-modal feature-level fusion. It was validated in a usability study of a 3D printer with 40 participants by comparing the usability problems found against a thorough manual analysis: the new approach detected 17 of 19 problems while reducing the time for manual analysis by 63%. Adding hand information enriches the insights into human behaviour beyond eye tracking alone. AI could significantly advance the approach by improving hand tracking through region-proposal CNNs, by detecting the parts of a product and mapping the demanding interactions to those parts, or even by fully automated end-to-end detection of demanding interactions via deep learning. This could set the basis for machines providing real-time assistance to their users in cases where they are struggling.
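The abstract does not specify the features or classifier behind the fusion, so the following is a hypothetical Python sketch of one plausible feature-level fusion: per time window it combines a gaze feature (tight spatial dispersion suggests intense visual focus) with a hand feature (near-zero movement suggests hesitation) and flags windows where both point to a demanding interaction. The function name, features, and thresholds are illustrative assumptions, not the authors' method.

```python
import numpy as np

def flag_demanding_windows(gaze_xy: np.ndarray, hand_xy: np.ndarray,
                           fps: int = 30, window_s: float = 2.0,
                           gaze_max_disp: float = 15.0,
                           hand_max_speed: float = 20.0) -> list[bool]:
    """Hypothetical feature-level fusion: per window, fuse a gaze feature
    (spatial dispersion in px) with a hand feature (mean speed in px/s)
    and flag windows where focused gaze coincides with a hesitating hand.
    gaze_xy and hand_xy are (n_frames, 2) arrays of pixel coordinates."""
    win = int(fps * window_s)
    flags = []
    for start in range(0, min(len(gaze_xy), len(hand_xy)) - win + 1, win):
        g = gaze_xy[start:start + win]
        h = hand_xy[start:start + win]
        gaze_dispersion = g.std(axis=0).mean()  # px: spread of gaze points
        hand_speed = np.linalg.norm(np.diff(h, axis=0), axis=1).mean() * fps  # px/s
        flags.append(gaze_dispersion < gaze_max_disp and
                     hand_speed < hand_max_speed)
    return flags
```

In practice the fixed thresholds would be calibrated per study, or the fused per-window feature vector would be fed to a trained classifier instead of a rule.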