Listing by Author "Sahami Shirazi, Alireza"
1 - 3 of 3
- Conference Paper (MuC: Long Paper, Talks)
  Automatic Classification of Mobile Phone's Contacts (Mensch & Computer 2013: Interaktive Vielfalt, 2013)
  Sahami Shirazi, Alireza; Le, Huy Viet; Henze, Niels; Schmidt, Albrecht
  Current smartphones have virtually unlimited space to store contact information. Users typically have dozens or even hundreds of contacts in their address book. This number of contacts can make it difficult to find a particular contact in the linear list provided by current phones. Grouping contacts eases the retrieval of particular contacts and also enables sharing content with specific groups. Previous work, however, shows that users are not willing to manually categorize their contacts. In this paper we investigate the automatic classification of contacts in phones' contact lists using the user's communication history. Potential contact groups were determined in an online survey with 82 participants. We collected the call and SMS communication history from 20 additional participants. Using the collected data, we trained a machine-learning algorithm that correctly classified 59.2% of the contacts. In a pilot study in which we asked participants to review the results of the classifier, we found that 73.6% of the reviewed contacts were considered correctly classified. We provide directions to further improve the performance and argue that the current results already help to ease the manual classification of mobile phone contacts.
- Conference Paper
  MediaBrain: Annotating Videos based on Brain-Computer Interaction (Mensch & Computer 2012: interaktiv informiert – allgegenwärtig und allumfassend!?, 2012)
  Sahami Shirazi, Alireza; Funk, Markus; Pfleiderer, Florian; Glück, Hendrik; Schmidt, Albrecht
  Adding notes to time segments on a video timeline makes it easier to search, find, and play back important segments of the video. Various approaches have been explored to annotate videos (semi-)automatically in order to summarize them. In this research we investigate the feasibility of implicitly annotating videos based on brain signals retrieved from a Brain-Computer Interface (BCI) headset. The signals provided by the BCI can reveal different information such as brain activity, facial expressions, or the level of the user's excitement. This information correlates with the scenes the users watch in a video. Thus, it can be used for annotating a video and automatically generating a summary. To achieve this goal, we developed an annotation tool called MediaBrain and conducted a user study. The results reveal that it is possible to annotate a video and select a set of highlights based on the excitement information.
- Conference Paper
  TaxiMedia: An interactive context-aware entertainment and advertising system (Informatik 2009 – Im Focus das Leben, 2009)
  Alt, Florian; Sahami Shirazi, Alireza; Pfeiffer, Max; Holleis, Paul; Schmidt, Albrecht