Title: Robust Real-Time 3D Time-of-Flight Based Gesture Navigation
Authors: Penne, Jochen; Soutschek, Stefan; Schaller, Christian; Hornegger, Joachim
Editors: Lucke, Ulrike; Kindsmüller, Martin Christof; Fischer, Stefan; Herczeg, Michael; Seehusen, Silke
Year: 2008
ISBN: 978-3-8325-2007-6
Handle: https://dl.gi.de/handle/20.500.12116/7055
Language: en
Type: Text/Conference Paper

Abstract: Contactless human-machine interfaces (HMIs) are an important issue in various applications where haptic interaction with an input device is not possible or not appropriate. Newly developed Time-of-Flight (ToF) cameras provide 3D information of the observed scene in real time at constant lateral resolutions of thousands of pixels; additionally, a gray-value image of the observed scene is available. Our work comprises three major contributions: first, the robust, real-time-capable segmentation of the hand by incorporating 3D and gray-value information; second, the reliable classification of the performed static gesture using robust features; third, the design of an HMI which uses the classified gesture as well as the 3D position of the hand to enable complex and convenient user interactions. The benefit of using a ToF camera is that the 3D information is simply not available from classical 2D camera systems; thus, only with ToF cameras can the three spatial degrees of freedom available for non-haptic interaction be fully exploited. Currently, classification rates of 98.2% (user-dependent) and 94.3% (user-independent) are achieved for 6 gestures. Tests with untrained persons yielded good to very good acceptance of the HMI.
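
The first contribution, hand segmentation from combined depth and gray-value (amplitude) data, can be illustrated with a minimal sketch. The function below, its thresholds, and the nearest-object assumption are hypothetical illustrations and are not taken from the paper.

```python
import numpy as np

def segment_hand(depth, amplitude, depth_band=0.15, amp_min=0.1):
    """Illustrative hand segmentation on ToF data (hypothetical, not the authors' method).

    depth     : 2D array of per-pixel distances in metres
    amplitude : 2D array of per-pixel gray values (signal amplitude), normalised to [0, 1]
    Assumes the hand is the object closest to the camera; returns a boolean mask.
    """
    # Suppress unreliable pixels: low amplitude usually implies noisy depth measurements.
    valid = amplitude >= amp_min
    if not np.any(valid):
        return np.zeros_like(depth, dtype=bool)

    # Use the nearest valid distance as a proxy for the hand's depth ...
    nearest = depth[valid].min()
    # ... and keep everything within a fixed band behind it.
    return valid & (depth <= nearest + depth_band)

# Example on synthetic data: a "hand" at ~0.5 m in front of a 1.5 m background.
depth = np.full((4, 4), 1.5)
depth[1:3, 1:3] = 0.5
amplitude = np.full((4, 4), 0.8)
print(segment_hand(depth, amplitude).astype(int))
```

A classifier operating on robust features of the segmented region, plus the mask's 3D centroid as a pointer position, would then cover the remaining two contributions sketched in the abstract.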