Authors: Caputo, Manuel; Denker, Klaus; Dums, Benjamin; Umlauf, Georg; Reiterer, Harald; Deussen, Oliver
Date available: 2017-11-22
Date issued: 2012
ISBN: 978-3-486-71879-9
URI: https://dl.gi.de/handle/20.500.12116/7785
Abstract: With the advent of various video game consoles and tablet devices, gesture recognition has become a popular way to control computer systems. Touch screens, for example, allow intuitive control of small 2D user interfaces with finger gestures. For interactive manipulation of 3D objects in a large 3D projection environment, a similarly intuitive 3D interaction method is required. In this paper, we present a dynamic 3D hand and arm gesture recognition system using commodity hardware. The input data is captured with low-cost depth sensors (e.g. Microsoft Kinect) and HD color sensors (e.g. Logitech C910). Our method combines dynamic hand and arm gesture recognition based on the depth sensor with static hand gesture recognition based on the HD color sensor.
Language: en
Keywords: 3D hand gestures; sensor fusion; commodity hardware; 3D interaction
Title: 3D Hand Gesture Recognition Based on Sensor Fusion of Commodity Hardware
Type: Text/Conference Paper
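
Note: The abstract describes combining a depth-based dynamic gesture recognizer with a color-based static hand-pose recognizer. The following is only a minimal Python sketch of that decision-level fusion idea; the class names, confidence threshold, and fusion rule are illustrative assumptions, not the authors' actual method.

    # Sketch of fusing a dynamic gesture (from depth data) with a static
    # hand pose (from the HD color image). All labels and the fusion rule
    # are hypothetical placeholders for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Recognition:
        label: str         # e.g. "swipe_left" (dynamic) or "open_hand" (static)
        confidence: float   # recognizer confidence in [0, 1]

    def fuse(dynamic: Recognition, static: Recognition,
             threshold: float = 0.5) -> str:
        """Combine both recognizer outputs into a single command string."""
        if dynamic.confidence < threshold or static.confidence < threshold:
            return "no_gesture"
        # Treat the static pose as a modifier of the dynamic gesture,
        # e.g. an open hand while swiping could select a rotation command.
        return f"{static.label}+{dynamic.label}"

    if __name__ == "__main__":
        print(fuse(Recognition("swipe_left", 0.8), Recognition("open_hand", 0.9)))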