Listing by keyword "visual feedback"
1 - 3 of 3
- Conference paper: IllumiPaper: Printed Displays for Novel Digital Pen-and-Paper User Interfaces (Mensch und Computer 2017 - Workshopband, 2017). Klamka, Konstantin; Dachselt, Raimund. In this paper, we demonstrate IllumiPaper: a system that combines new forms of paper-integrated feedback with digital pen-and-paper solutions to address the lack of dynamic visual feedback in digital pen applications. We aim to visually support typical paper-related tasks by providing active visual feedback that is directly integrated into standard paper, thereby eliminating the problem of visual inconsistencies between physically written content and associated digital information. In contrast to our prior work, we present an extended version of our augmented paper sheets that are entirely printed with emerging conductive and electroluminescent thin-film technologies. These advanced fabrication methods enable paper illuminations with a high degree of quality and integration. As a major contribution, we want to demonstrate the feasibility of IllumiPaper's systematic feedback repertoire for real-world applications and share hands-on experiences with the IllumiPaper research platform, consisting of a paper controller, a digital pen, and printed, illuminated, digitally controlled papers, with the German HCI community.
- Conference paper: Problem-Specific Visual Feedback in Discrete Modelling (Proceedings of DELFI 2024, 2024). Herwig, Maurice; Hundeshagen, Norbert; Hundhausen, John; Kablowski, Stefan; Lange, Martin. Discrete modelling as the basis of problem solving is an essential skill for computer scientists, but the correct use of formal languages such as propositional logic for such purposes remains a major challenge for undergraduate students. The DiMo tool provides support for the acquisition of formal modelling competencies using propositional logic. We extend the tool by generic capabilities to generate problem-specific feedback for students. This allows them to visualise the result of their modelling attempts in terms of the modelled problem at hand, thus helping students to initiate corresponding learning cycles. (A minimal illustrative sketch of such problem-specific feedback follows after this listing.)
- Conference paper: What Does My Classifier Learn? A Visual Approach to Understanding Natural Language Text Classifiers (Software Engineering und Software Management 2018, 2018). Winkler, Jonas Paul; Vogelsang, Andreas. Neural networks have been used to solve various tasks such as image recognition, text classification, and machine translation, and have achieved exceptional results in many of these tasks. However, understanding the inner workings of neural networks and explaining why a certain output is produced are not trivial tasks. Especially when dealing with text classification problems, an approach to explaining network decisions may greatly increase the acceptance of neural-network-supported tools. In this paper, we present an approach to visualize the reasons why a classification outcome is produced by convolutional neural networks, by tracing back the decisions made by the network. The approach is applied to various text classification problems, including our own requirements-engineering-related classification problem. We argue that by providing these explanations in neural-network-supported tools, users will use such tools with more confidence and may also allow the tool to perform certain tasks automatically. (An illustrative trace-back sketch follows after this listing.)
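To make the idea behind the DiMo entry above more concrete, here is a minimal, hypothetical Python sketch (not DiMo's actual code or API) of problem-specific feedback: a satisfying assignment of a student's propositional model of graph colouring is reported in terms of the problem itself (node colours and violated edges) rather than as raw variable values. The variable encoding, the example graph, and all names are illustrative assumptions.

```python
# Hypothetical illustration of problem-specific feedback (not DiMo itself).
# Variables x[(v, c)] mean "node v gets colour c"; the assignment below stands
# in for the output of a SAT solver run on the student's formula.
assignment = {
    ("A", "red"): True,  ("A", "blue"): False,
    ("B", "red"): False, ("B", "blue"): True,
    ("C", "red"): True,  ("C", "blue"): False,
}
edges = [("A", "B"), ("B", "C"), ("A", "C")]

def colour_of(node):
    """Return the colour assigned to a node, or None if no colour variable is true."""
    chosen = [c for (v, c), val in assignment.items() if v == node and val]
    return chosen[0] if len(chosen) == 1 else None

def feedback(edges):
    """Translate the raw variable assignment into feedback phrased in problem terms."""
    lines = []
    for node in sorted({v for e in edges for v in e}):
        lines.append(f"node {node}: colour {colour_of(node)}")
    for u, v in edges:
        if colour_of(u) is not None and colour_of(u) == colour_of(v):
            lines.append(f"conflict: edge ({u}, {v}) has both endpoints {colour_of(u)}")
    return "\n".join(lines)

print(feedback(edges))
# node A: colour red
# node B: colour blue
# node C: colour red
# conflict: edge (A, C) has both endpoints red
```

The point of the sketch is the translation step: instead of telling the student that x_{A,red} and x_{C,red} are both true, the feedback names the problem-level conflict they correspond to.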
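The classifier-visualization entry above describes tracing a CNN text classifier's decision back to the input. The following toy NumPy sketch is an assumption-laden illustration of that general idea, not the authors' method: with max-pooling over a 1D convolution, each filter's winning n-gram can be identified from the pooling position, and its contribution to the predicted class scored via the output weights. All weights, the example sentence, and the tiny architecture are made up for illustration.

```python
import numpy as np

# Toy sentence as word embeddings: 6 words, embedding dim 4 (random stand-ins).
rng = np.random.default_rng(0)
words = ["the", "system", "shall", "log", "all", "errors"]
E = rng.normal(size=(len(words), 4))

# Two convolutional filters over bigrams and a linear output layer,
# also random stand-ins for trained weights.
filters = rng.normal(size=(2, 2, 4))      # (n_filters, window, emb_dim)
W_out = rng.normal(size=(2, 2))           # (n_classes, n_filters)

# Forward pass: convolution -> ReLU -> global max pooling -> class scores.
window = filters.shape[1]
conv = np.array([
    [np.sum(filters[f] * E[i:i + window]) for i in range(len(words) - window + 1)]
    for f in range(filters.shape[0])
])                                        # (n_filters, n_positions)
act = np.maximum(conv, 0.0)
pooled = act.max(axis=1)                  # (n_filters,)
argmax = act.argmax(axis=1)               # position of the winning n-gram per filter
scores = W_out @ pooled
pred = int(scores.argmax())

# Trace-back: each filter's contribution to the predicted class is its pooled
# activation times the corresponding output weight; the "responsible" n-gram is
# the one at the max-pooling position.
for f in range(filters.shape[0]):
    ngram = " ".join(words[argmax[f]:argmax[f] + window])
    contribution = W_out[pred, f] * pooled[f]
    print(f"filter {f}: n-gram '{ngram}' contributes {contribution:+.3f} to class {pred}")
```

This only sketches one plausible way to attribute a CNN text classification to input n-grams; the paper's own visualization approach may differ in detail.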