In live music performance with traditional acoustic instruments, there is a direct one-to-one connection between instrument and performer that promotes bodily expression. However, live music mediated by computer processes, such as those involving Ableton Live, is generally controlled remotely through peripheral devices such as the mouse and MIDI controllers. These controls are not only disconnected and unintuitive but also governed by minute mechanisms such as buttons, knobs and sliders, which demand micro-movements and fail to involve the performer's whole body. As a result, the performer can be forced to exaggerate their movements in order to communicate the performance to an audience. Furthermore, the introduction of the laptop to the stage presents issues that affect the energy between audience and performer: the laptop has been found to distract the performer and to negatively affect how the audience perceives the authenticity of a performance.
The research of GraspAbleton explores a solution to the above issues via tangible computing. In a departure from the button/knob/slider paradigm, the system instead makes playful use of everyday objects as the basis for control over Ableton Live. These objects can be felt, grasped and spatially relocated anywhere around the performance area in a way that is far more bodily liberating for the performer and visually stimulating for a watching audience. The system also eliminates the need to reference a screen: the position and orientation of the objects themselves provide the only visual feedback the performer requires in order to gauge the state of the system. GraspAbleton was developed with the cooperation of both experienced Ableton Live users and people who had never used the program before, meaning that it is not just musically performable but also fun and engaging to interact with.
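The text does not specify how object pose is translated into Ableton Live parameters, but one common approach for systems like this is to map the tracked position and orientation of each object onto MIDI continuous-controller (CC) values. The sketch below illustrates that idea under stated assumptions: the function name, CC numbers and parameter assignments are hypothetical, and the object's coordinates are assumed to be normalised to the performance surface.

```python
def object_to_cc(x, y, angle, x_cc=20, y_cc=21, angle_cc=22):
    """Hypothetical mapping from a tracked object's pose to MIDI CC pairs.

    x, y   -- normalised position on the performance surface, in [0, 1]
    angle  -- orientation in degrees; wraps at 360
    Returns a list of (cc_number, cc_value) pairs with values in 0-127,
    which a MIDI layer could then send to Ableton Live.
    """
    def clamp(v):
        # Scale a 0-1 value to the 7-bit MIDI CC range.
        return max(0, min(127, int(round(v * 127))))

    return [
        (x_cc, clamp(x)),                      # e.g. track volume
        (y_cc, clamp(y)),                      # e.g. filter cutoff
        (angle_cc, clamp((angle % 360) / 360.0)),  # e.g. send level
    ]


# An object at the centre of the surface, far edge, rotated half a turn:
print(object_to_cc(0.5, 1.0, 180))
```

Because the object's pose is both the input and the display, no screen is needed: glancing at where an object sits and how it is turned is enough to read back the state that this mapping encodes.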