Touch Music Tables
-
lemur
year: 2004
author(s): JazzMutant
website: http://www.jazzmutant.com/lemur_overview.php
The Lemur is a modular touch-panel controller designed for real-time audio and multimedia applications. Its technology combines multitouch sensing with a visual display, and it comes with an extensible library of user-interface objects such as faders, switches, pads, keyboards and strings. A software editor (Windows, Mac OS X, Linux) lets users easily build their own interfaces by dragging and dropping objects onto the control area; the JazzMutant editor also provides advanced tools for mapping and setting up objects and control parameters.
-
Composition on the Table
year: 1998
author(s): Toshio Iwai
affiliation: Artist, Japan
publication: Composition on the Table [PDF]
website: http://www.ntticc.or.jp/Archive/1999/+-/Works/conposition_e.html
Four white tables carry various user interfaces such as switches, dials, turntables and sliding boards that a player can touch. Projectors suspended from the ceiling project computer-generated images onto the tables and interfaces. The projected images change in real time as if they were physically attached to the interfaces when players operate them, and sounds are produced in relation to the movement of the images.
-
Stereotronic Multi-Synth Orchestra
year: 2009
author(s): Robert Lewis, Zach Archer
affiliation: Fashionbuddha
website: http://www.fashionbuddha.com/
The Stereotronic Multi-Synth Orchestra is a new way to visualize music. The basic sequencer layout consists of concentric rings, where freely placed notes are triggered by radial sweeps. Each sound event can additionally be edited with an internal 16-step sequencer. Thanks to its multi-touch design, the application encourages collaborative musical creation. The application was built with the XNA framework, but it also reads TUIO data via UDP, which means it can work with either the MS Surface or DIY tables without any modification.
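TUIO data is carried as OSC messages over UDP, by convention on port 3333, so any OSC library can receive the same cursor stream the application consumes. A minimal sketch of such a listener, assuming the third-party python-osc package (the application itself is built on XNA, not Python):

```python
# Minimal TUIO /tuio/2Dcur listener sketch (assumes the python-osc package;
# an illustration of the protocol, not part of the original application).
from pythonosc import dispatcher, osc_server

def on_2dcur(address, *args):
    # /tuio/2Dcur messages carry "alive", "set" and "fseq" commands;
    # a "set" command updates one cursor: session id, x, y, velocity, acceleration.
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]
        print(f"cursor {session_id}: x={x:.3f} y={y:.3f}")

disp = dispatcher.Dispatcher()
disp.map("/tuio/2Dcur", on_2dcur)

# TUIO trackers (e.g. reacTIVision, CCV) send to UDP port 3333 by default.
server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), disp)
server.serve_forever()
```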
-
jam-o-drum
year: 2000
author(s): Tina Blaine,
Tim Perkis
affiliation: Carnegie Mellon University,
Entertainment Technology Center
publication: Jam-O-Drum Interactive Music System: A Study in Interaction Design [PDF]
website: http://www.jamodrum.net/
By combining velocity-sensitive input devices and computer-graphics imagery into an integrated tabletop surface, six to twelve simultaneous players can take part in a collaborative approach to musical improvisation. The Jam-O-Drum was designed to support face-to-face audio and visual collaboration: players strike drum pads embedded in the surface to create rhythmic music and effect visual changes together, with the community drum circle serving as the metaphor that guides the form and content of the interaction design. The Jam-O-Drum is a permanent exhibit in the heart of Sound Lab at the EMP.
-
Soundplane
year: 2008
author(s): Randall Jones
publication: Intimate Control for Physical Modeling Synthesis [PDF]
website: https://madronalabs.com/soundplane
The Soundplane work presents a new, low-cost sensor design capable of generating a 2D force signal, a new implementation of the 2D digital waveguide mesh, and two experimental computer music instruments that combine these components using different metaphors. Multidimensional connections between sensors and a physical model are found to facilitate a high degree of control intimacy, and to reproduce as emergent behavior some important phenomena associated with acoustic instruments.
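For a rectilinear topology with equal port impedances, the 2D digital waveguide mesh is equivalent to a simple finite-difference update on a grid of junctions. A minimal NumPy sketch of that update, given as an illustration rather than the paper's implementation:

```python
# Rectilinear 2D waveguide-mesh sketch in its equivalent finite-difference
# form; a lossless mesh with fixed boundaries, not the Soundplane's code.
import numpy as np

N = 32                        # junctions per side
p_prev = np.zeros((N, N))     # pressure at time n-1
p_curr = np.zeros((N, N))     # pressure at time n
p_curr[N // 2, N // 2] = 1.0  # simple impulse excitation at the centre

listen = []
for _ in range(500):
    p_next = np.zeros((N, N))
    # Each interior junction averages its four neighbours and subtracts
    # its own value one sample earlier.
    p_next[1:-1, 1:-1] = 0.5 * (p_curr[:-2, 1:-1] + p_curr[2:, 1:-1] +
                                p_curr[1:-1, :-2] + p_curr[1:-1, 2:]) \
                         - p_prev[1:-1, 1:-1]
    p_prev, p_curr = p_curr, p_next
    listen.append(p_curr[N // 4, N // 4])  # pick up the signal at one junction
```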
-
akustisch
year: 2008
author(s): Balz Rittmeyer
affiliation: University Of Art And Design Zurich (ZHdK)
website: http://akustisch.digitalaspekte.ch/
'akustisch' is a new approach to using a multitouch interface for the production, control and manipulation of digital sound. Designed as a synthesizer without a traditional graphical user interface, akustisch employs complex hand gestures on a multitouch surface to control and create music. This approach allows all ten fingers to be used for sound manipulation, whereby an algorithm interprets their movements on the screen as gestural input. As soon as a gesture is recognized, a specific sound output is generated during its execution.
-
sound storm
year: 2009
author(s): Christian Bannister
affiliation: Subcycle Labs
website: http://subcycle.org/
A vertically mounted multi-touch surface with several musical and audiovisual applications, including sound visualization, a breakbeat performance engine, sonic navigation, sound-storm visualization, a time machine, interactive sound, time-stretch tabla and more.
-
FTIR Musical Application
year: 2006
author(s): Philip Davidson,
Jefferson Han
affiliation: New York University,
Media Research Lab
publication: Synthesis and Control on Large Scale Multi-Touch Sensing Displays [PDF]
website: http://mrl.nyu.edu/~jhan/ftirtouch/
A musical interface for a large-scale, high-resolution, multi-touch display surface based on frustrated total internal reflection. Multi-touch sensing permits the user full bi-manual operation as well as chording gestures, offering the potential for great input expression. Such devices also inherently accommodate multiple users, which makes them especially useful for larger interaction scenarios such as interactive tables.
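In an FTIR surface, finger contact frustrates the total internal reflection inside the acrylic, and contact points show up to an infrared camera as bright blobs, so touch tracking amounts to per-frame blob detection. A rough sketch of that camera-side step using OpenCV 4 (an illustration only, not the authors' pipeline):

```python
# Typical camera-side blob detection for an FTIR surface;
# a sketch with OpenCV 4, not the authors' tracking code.
import cv2

cap = cv2.VideoCapture(0)          # IR camera behind the surface (assumed index 0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Finger contacts appear as bright spots against a dark background.
    _, mask = cv2.threshold(blur, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 20:          # ignore tiny noise blobs
            x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"touch at ({x:.1f}, {y:.1f})")
```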
-
AudioTouch
year: 2008
author(s): Seth Sandler
affiliation: ICAM, University of California, San Diego
website: http://ssandler.wordpress.com/audiotouch/
AudioTouch is an interactive multi-touch interface for computer music exploration and collaboration. It features audio-based applications that make use of multiple users and simultaneous touch inputs on a single multi-touch screen. A natural user interface, in which users interact through gestures and movements while directly manipulating musical objects, is a key aspect of the design; the goal is to interact with the (specifically music-based) technology in a natural way. AudioTouch consists of four main musical applications: MultiKey, MusicalSquares, Audioshape sequencer and Musical Wong.
-
realsound
year: 2003
author(s): Radboud Mens,
Aldje van Meer
affiliation: Artists, Netherlands
website: http://www.xs4all.nl/~meermens/realsound/
An interactive installation combining image and sound through a tangible interface. By operating 24 buttons on the table top, visitors can create an audio-visual composition of their own design. Each button has its own unique sound that is linked to an image.
-
MUSICtable
year: 2005
author(s): Ian Stavness,
Jennifer Gluck
affiliation: HCT Laboratory, University of British Columbia
publication: The MUSICtable: A Map-Based Ubiquitous System for Social Interaction with a Digital Music Collection [PDF]
website: http://hct.ece.ubc.ca/research/musictable/
A ubiquitous system that uses spatial visualization to support exploration and social interaction with a large music collection. The interface is based on the interaction semantic of influence, which allows users to affect and control the mood of the music being played without the need to select a set of specific songs.