Abstract
Communicative hand gestures play important roles in face-to-face conversation. The use of these gestures varies across individuals; even when two speakers narrate the same story, they do not always use the same hand gesture (movement, position, and motion trajectory) to describe the same scene. In this paper, we propose a framework for classifying communicative gestures in small-group interactions. Instead of observing hand positions and hand motion trajectories, we focus on how many times the hands are held during a gesture and how long a speaker sustains a hand stroke. In addition, to model communicative gesture patterns, we use the nonverbal features of the participants addressed by the gesturing speaker. Using pattern recognition techniques, we extract features of the gesture phases defined by Kendon (2004) and of the nonverbal patterns co-occurring with gestures, i.e., the utterance, head gesture, and head direction of each participant. In the experiments, we collect eight group narrative interaction datasets to evaluate classification performance. The experimental results show that gesture phase features and the nonverbal features of other participants improve the performance of discriminating communicative gestures used in narrative speech from other gestures by 4% to 16%.
Information
Book title
2013 International Conference on Multimodal Interaction (ICMI 2013)
Pages
303-310
Date of issue
2013/12/09
Date of presentation
2013/12/09
Location
Sydney, Australia
DOI
10.1145/2522848.2522898
Citation
Shogo Okada, Mayumi Bono, Katsuya Takanashi, Yasuyuki Sumi, Katsumi Nitta. Context based conversational hand gesture classification in narrative interaction, 2013 International Conference on Multimodal Interaction (ICMI 2013), pp.303-310, 2013.