Perception for Everyday Human Robot Interaction

dc.contributor.author: Worch, Jan-Hendrik
dc.contributor.author: Bálint-Benczédi, Ferenc
dc.contributor.author: Beetz, Michael
dc.date.accessioned: 2018-01-08T09:22:59Z
dc.date.available: 2018-01-08T09:22:59Z
dc.date.issued: 2016
dc.description.abstract: The ability to build robotic agents that can perform everyday tasks depends heavily on understanding how humans perform them. To achieve close-to-human understanding of a task and generate a formal representation of it, it is important to reason jointly about the human actions and the objects being acted on. We present a robotic perception framework for perceiving actions performed by a human in a household environment that can be used to answer questions such as “which object did the human act on?” or “which actions did the human perform?”. To do this, we extend the RoboSherlock framework with the capability of detecting humans and objects at the same time, while simultaneously reasoning about the possible actions being performed.
dc.identifier.pissn: 1610-1987
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/11507
dc.publisher: Springer
dc.relation.ispartof: KI - Künstliche Intelligenz: Vol. 30, No. 1
dc.relation.ispartofseries: KI - Künstliche Intelligenz
dc.title: Perception for Everyday Human Robot Interaction
dc.type: Text/Journal Article
gi.citation.endPage: 27
gi.citation.startPage: 21