Title: MediaBrain: Annotating Videos based on Brain-Computer Interaction
Authors: Sahami Shirazi, Alireza; Funk, Markus; Pfleiderer, Florian; Glück, Hendrik; Schmidt, Albrecht; Reiterer, Harald; Deussen, Oliver
Date issued: 2012
Date available: 2017-11-22
ISBN: 978-3-486-71879-9
URI: https://dl.gi.de/handle/20.500.12116/7794
Language: en
Keywords: brain; video; annotation; BCI; implicit/explicit interaction
Type: Text/Conference Paper

Abstract: Adding notes to time segments on a video timeline makes it easier to search, find, and play back important segments of the video. Various approaches have been explored to annotate videos (semi-)automatically in order to summarize them. In this research we investigate the feasibility of implicitly annotating videos based on brain signals retrieved from a Brain-Computer Interface (BCI) headset. The signals provided by the BCI can reveal different information such as brain activity, facial expressions, or the level of the user's excitement. This information correlates with the scenes the users watch in a video; thus, it can be used for annotating a video and automatically generating a summary. To achieve this goal, an annotation tool called MediaBrain was developed and a user study was conducted. The results reveal that it is possible to annotate a video and select a set of highlights based on the excitement information.