Towards pervasive eye tracking
dc.contributor.author | Kasneci, Enkelejda | |
dc.date.accessioned | 2018-01-23T13:04:31Z | |
dc.date.available | 2018-01-23T13:04:31Z | |
dc.date.issued | 2017 | |
dc.description.abstract | The human gaze provides paramount cues for communication and interaction. Following this insight, gaze-based interfaces have been proposed for human-computer interaction (HCI) since the early 90s, with some believing that such interfaces will revolutionize the way we interact with our devices. Since then, gaze-based HCI in stationary scenarios (e.g., desktop computing) has matured rapidly, and the production costs of mainstream eye trackers have steadily decreased. Consequently, a variety of new applications have emerged with the ambitious goal of applying eye tracking to dynamic, real-world HCI tasks and scenarios. This article gives an overview of the research conducted by the Perception Engineering Group at the University of Tübingen. | en |
dc.identifier.pissn | 1611-2776 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/14932 | |
dc.language.iso | en | |
dc.publisher | De Gruyter | |
dc.relation.ispartof | it - Information Technology: Vol. 59, No. 5 | |
dc.subject | Eye tracking | |
dc.subject | eye movements | |
dc.subject | scanpath | |
dc.subject | pupil detection | |
dc.subject | autonomous driving | |
dc.title | Towards pervasive eye tracking | en |
dc.type | Text/Journal Article | |
gi.citation.endPage | 257 | |
gi.citation.publisherPlace | Berlin | |
gi.citation.startPage | 253 | |
gi.conference.sessiontitle | Self-Portrayals of GI Junior Fellows |