Timing of discrete musical air-gestures
- Motion-sensing technologies enable musical interfaces in which a performer controls sound by moving their body "in the air," without manipulating or contacting a physical object. These interfaces work well when the movement and resulting sound are smooth and continuous, but it has proven difficult to design a system that triggers discrete sounds with a precision that feels natural to the performer and allows them to play complex rhythmic music. This thesis presents a study of "air-drumming" gestures. Participants performed drumming-like gestures in time to simple recorded rhythmic drum sounds. Their movements were recorded and then examined for aspects of the movement that correspond to the timing of the drum sounds. The goal of this research is to understand what we do with our bodies when we gesture in the air to trigger a sound, in the hope that the results can be used to design more accurate and responsive air-gesture-controlled instruments. It is assumed that when we gesture in the air to generate a sound, we do something with our bodies to create a sensation of a moment in time, and that we intend the sound to occur at this moment. Eight movement features are extracted from the movement data, and it is hypothesized that one or more of these features correspond to this bodily sensation. Four of the features are based on tracking the location of the hand and wrist. Hits are moments where the hand or wrist suddenly changes direction at the end of a strike gesture, and a novel algorithm for detecting hits is presented. Acceleration peaks are peaks in the magnitude of the acceleration of the hand or wrist as it suddenly decelerates at the end of the gesture. The remaining four features are based on the angles of the elbow and wrist joints: extrema in joint angle are detected, as are sudden peaks in the angular acceleration of the joint.
The timing of these movement events is analyzed with respect to the timing of the drum sounds, and the analysis also compares how the timing of movement events changes as the speed at which notes are repeated changes. In general, it is found that features based on acceleration peaks occur earlier, closer to the audio event, and with less noise, and that these features vary less with note speed. It is also shown that timing differences between hand and wrist features can be used to study individual differences and to group performers by movement style.
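To make the acceleration-peak feature concrete, the following is a minimal sketch of how such peaks might be detected from tracked 3D position data: differentiate position twice, take the magnitude of the acceleration vector, and find local maxima above a threshold. This is an illustrative reconstruction, not the thesis's actual algorithm; the function name, sample rate, and threshold value are assumptions.

```python
import numpy as np

def acceleration_peaks(positions, fs, threshold):
    """Return sample indices of local maxima in acceleration magnitude.

    positions: (N, 3) array of tracked hand or wrist positions (metres)
    fs:        sample rate in Hz
    threshold: minimum |acceleration| (m/s^2) to count as a peak
    (Hypothetical sketch, not the algorithm from the thesis.)
    """
    dt = 1.0 / fs
    vel = np.gradient(positions, dt, axis=0)   # numerical velocity, m/s
    acc = np.gradient(vel, dt, axis=0)         # numerical acceleration, m/s^2
    mag = np.linalg.norm(acc, axis=1)          # magnitude of acceleration vector
    # Local maxima above threshold (>= on the left so flat-topped peaks
    # are reported exactly once).
    return [i for i in range(1, len(mag) - 1)
            if mag[i] > threshold
            and mag[i] >= mag[i - 1]
            and mag[i] > mag[i + 1]]
```

For example, a hand oscillating sinusoidally along one axis produces an acceleration-magnitude peak at each extreme of the motion, where the hand decelerates and reverses, which is the kind of event this feature is meant to capture.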
|Type of resource: electronic; electronic resource; remote
|1 online resource.
|Dahl, Luke Samuel
|Stanford University, Department of Music.
|Statement of responsibility: Luke Samuel Dahl.
|Submitted to the Department of Music.
|Thesis (Ph.D.)--Stanford University, 2014.
- © 2014 by Luke Samuel Dahl
- This work is licensed under a Creative Commons Attribution Non Commercial 3.0 Unported license (CC BY-NC).