Visual Versus Verbal Location Information on the iPhone
dc.contributor.author | Juhasz, Orsolya Emoke | |
dc.contributor.author | Tenbrink, Thora | |
dc.contributor.author | Grüter, Barbara | |
dc.date.accessioned | 2018-01-08T09:15:48Z | |
dc.date.available | 2018-01-08T09:15:48Z | |
dc.date.issued | 2012 | |
dc.description.abstract | Mobile games are becoming increasingly embedded in our everyday lives. In this industry, particular types of spatial information are often conveyed predominantly by visual means, while verbal and other sensory feedback (e.g., vibration) provide additional or different information. Since this may present an obstacle for some users in some contexts, exploring other ways of conveying equivalent location information may facilitate the development of successful and engaging future mobile games. This paper focuses on how the same location information, given either visually or verbally, affects player performance within a mobile game. We present an explorative study using a simple, location-based game on the iPhone, testing users’ reactions to the two types of spatial information. The results, which reflect a high degree of individual variation but no negative effects on performance, are discussed with a view to opening up the space of possibilities for future designs. | |
dc.identifier.pissn | 1610-1987 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/11284 | |
dc.publisher | Springer | |
dc.relation.ispartof | KI - Künstliche Intelligenz: Vol. 26, No. 2 | |
dc.relation.ispartofseries | KI - Künstliche Intelligenz | |
dc.subject | Location information | |
dc.subject | Mobile games | |
dc.title | Visual Versus Verbal Location Information on the iPhone | |
dc.type | Text/Journal Article | |
gi.citation.endPage | 186 | |
gi.citation.startPage | 183 |