This article was downloaded by: [Ondokuz Mayis Universitesine]
On: 06 November 2014, At: 12:13
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Journal of New Music Research
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/nnmr20

Collaborative Musical Experiences for Novices
Tina Blaine & Sidney Fels
Published online: 09 Aug 2010.

To cite this article: Tina Blaine & Sidney Fels (2003) Collaborative Musical Experiences for Novices, Journal of New Music Research, 32:4, 411-428
To link to this article: http://dx.doi.org/10.1076/jnmr.32.4.411.18850

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions

Collaborative Musical Experiences for Novices




Abstract

We explore the context and design of collaborative musical experiences for novices. We first argue that musical expression with multi-person instruments is a form of communication between the players. We illustrate that design for musical collaboration facilitates exploration of sound space with low entry-level skill. In contrast to the western post-Renaissance focus on musical expression through virtuosity, collaborative musical experiences enable the media of sound and music to enhance the communication opportunities and intimacy between players. The main factor common to most of the interfaces discussed herein is that musical control is highly restricted, which makes it possible for novices to easily learn and participate in the collective experience. This happens at the expense of providing an upward path to virtuosity with the interface. Balancing this tradeoff is a key concern for designers. We look closely at many contemporary collaborative interface designs, with each exploring a different way to achieve successful musical experiences for novice players.

1. Introduction

Since the beginning of time, music has been used for the purposes of ritual, spiritual fulfillment, celebratory events, and perhaps most importantly, as a cohesive force in communities. The ability to express oneself through music is also a form of social interplay and communication. Today, with the rapid development of new sensor technologies and inexpensive computing hardware and software, the potential to include people with little or no musical training in the act of making music has never been greater. We explore many researchers' efforts to exploit this potential for creating collaborative musical experiences for novices.

The underlying premise of most collaborative instrument design is that, with various design constraints, playing music can be made accessible to non-musicians. Multiplayer instruments explore the communication and expression between people, with musical control being the medium. We argue that the collective musical experience of collaborative interfaces should be examined in this context.

In this article, we focus on the severe limitations imposed by most designers of collaborative interfaces on the musical range and possible gestures associated with sound, so that novices can easily participate in the collective sound space. In collaborative experiences, these constraints minimize the potential sense of exclusion. If a player feels excluded due to a perceived lack of skills, she does not have a positive experience. This is often the case with traditional musical instruments that require significant practice to play well. Essentially, low-level accessibility is necessary for people to participate and interact with the instruments and each other. Finally, most of the interfaces we consider are intended for public exhibition, where people casually "walk up and play." This restricts the amount of time that a designer can expect someone to spend learning an interface and necessitates highly constrained interfaces that are conducive to musically accessible flow-through experiences.

2. Context

The emergence of electronic instruments, and most notably the computer, has led to the creation of new interfaces and sounds never before possible. In addition, the computer can be used to create arbitrary mappings between gesture and sound, thereby providing the possibility of computer-supported sound and directed musical interaction. Consequently, a wave of new types of collaborative interfaces and group experiences has emerged for collaborative music making. Some early explorations in this direction are found in works such as Mikrophonie I and II (Stockhausen, 1963/1964), Laboratorium (Globokar, 1973), and groups such as the League of Automatic Music Composers (Bischoff, Gold, & Horton, 1978) and the Hub, experimental computer network bands (Gresham-Lancaster, 1998). In Mikrophonie I, expert percussionists play a collective instrumented tam-tam that has controls for augmenting the sound, intended for two sets of three players. In Mikrophonie II, a four-person chorus, along with music from an organ, is fed simultaneously through a ring modulator. The League of Automatic Music Composers was the first computer network trio. The Hub connected six electronic musicians in a computer network that made each player's activity accessible to the other players, enabling the group to engage in collective musical improvisation. The tradition of collaborative instruments for expert play and performance, where a second player modulates, either mechanically or electronically, the live performance of another expert player, continues in works such as Contacts Turbulants, using the "catch and throw" metaphor (Wessel & Wright, 2002), and the performance using the Photosonic instrument (Dudon & Arfib, 2002).

Accepted: 30 April, 2003

Correspondence: Tina Blaine, Carnegie Mellon University, Entertainment Technology Center, Pittsburgh, PA 15123, USA. Tel.: +1 412 2682980; E-mail: [email protected]

Tina Blaine (Carnegie Mellon University, Entertainment Technology Center, Pittsburgh, PA, USA) and Sidney Fels (Dept. of Electrical & Computer Engineering, University of British Columbia, Vancouver, BC, Canada)

Journal of New Music Research, 2003, Vol. 32, No. 4, pp. 411–428. 0929-8215/03/3204-411$16.00 © Swets & Zeitlinger

Perhaps as a reflection of the dominant Western view that music should be played only by musicians, most work in the last thirty years using new interfaces for musical expression is primarily oriented toward virtuosic experiences and performances. With virtuoso-style instruments, the designer can reasonably expect the player to invest a significant amount of time learning the idiosyncrasies of the instrument. Thus, as discussed in Fels, Gadd, and Mulder (2002), the task of keeping the adopter of the instrument engaged involves designing for increasing complexity and expressivity as the player becomes expert. In the extreme, maintaining the satisfying state of flow requires a lifelong path of learning new complexities for enhanced expression (Csikszentmihalyi, 1990). In general, this direction is not viable for collaborative interfaces for novices. Instead, making the interfaces easy to learn and play takes highest priority. Ultimately, the issue is not the development of expert-level performance, but the ability for players to increase their skills over a short period of time.

2.1 Collaborative musical experience

. . . achieving control over experience requires a drastic change in attitude about what is important and what is not. (Csikszentmihalyi, 1990)

Dissecting the role of music in relation to people's experiences playing collaborative interfaces requires a shift in perspective. By attributing less importance to traditional music metrics based on melody, more emphasis can be placed on metrics that involve the players' experience. When designing collaborative musical experiences for first-time players in public places, the amount of time necessary to learn an interface must be minimized, coupled with achieving a balance between virtuosity and simplicity (D'Arcangelo, 2001).

Music becomes the context within which the experience designer creates a communal environment. Creating an environment that contributes to improved participation, spontaneity and cooperation among all group members is one of the basic principles of teamwork (Cirigliano & Villaverde, 1966). Thus, participation in music making provides each player with a sense of belonging and access to a new community. Particularly in public installation settings, players participate collectively in the creation of an overall composition, with or without a high level of proficiency.

Providing entertaining musical experiences that promote participation and engagement generally serves to enhance communication between players. However, like many elements that make for a qualitative group experience, evaluating "fun" is completely subjective and therefore difficult to quantify. The state of "flow" (Csikszentmihalyi, 1990) between people is another of these elements. While it is usually obvious when people are having fun, assessing the metrics of fun is an ambiguous task. LeBlanc's taxonomy of game pleasures promotes emergent complexity as a possible source of fun. Such emergence creates opportunities for exploration by providing more features for players to discover and more challenges to meet (Kreimeier, 2000). Other relevant gaming principles LeBlanc attributes to the "source of fun" include expression, sensation and fellowship between players (Costikyan, 2002).

Within a group, members may have different musical abilities. In the realm of spontaneous flow-through collaborative musical environments, previous musical knowledge does not necessarily give an expert player an advantage, as the interface may be novel to all participants. Leaders may also emerge to help facilitate the group's play experience in an effort to minimize musical chaos. This form of "distributed leadership" gives all participants the opportunity to develop their potential to improve the group's capabilities (Cirigliano & Villaverde, 1966). Mixed abilities can cause difficulties if one player dominates the activity at the expense of excluding others who are less proficient. Careful design can mitigate this by ensuring that balanced collective participation is required, so that all members of the group are engaged in the activity and having fun.

In a general sense, co-located, collaborative musical experiences may be considered a form of single display groupware (SDG) (Stewart, Bederson, & Druin, 1999) where the display is predominantly audio based. As indicated by Stewart et al., these interfaces support creativity, learning and instruction. Most collaborative musical interfaces for novices provide for all these activities; however, their support is constrained by the accessibility of the interface. Essentially, four forms of complex group task behaviors can be sustained within collaborative musical experiences: choosing, generating, executing and negotiating (McGrath & Hollingshead, 1994). Particularly for novices, some of the tasks are more important than others, and thus the design of the interface must take that priority into account. For example, determining the sounds that each player creates requires a negotiation task to occur at the onset of the musical experience. Providing support for turn-taking protocols may also be necessary. Once established, players can create and explore the generation of new sounds in parallel. Highly skilled players need more sophisticated methods to perform the tasks, which fit within the context of understanding group behavior, whereby all the members are either producing, supporting or contributing to the well-being of the group (McGrath & Hollingshead, 1994).
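A turn-taking protocol of the kind mentioned above can be made concrete with a short sketch. The following Python is purely illustrative; the class and its parameters are our own invention, not part of any system surveyed here. It implements a round-robin gate that only sounds the active player's note events and passes the turn after a fixed number of accepted events.

```python
# Hypothetical sketch: a round-robin turn-taking gate for a shared
# instrument. Only the active player's note events pass through;
# the turn advances after a fixed number of accepted events.

class TurnTaker:
    def __init__(self, players, notes_per_turn=4):
        self.players = list(players)           # ordered player ids
        self.notes_per_turn = notes_per_turn
        self.active = 0                        # index of the current player
        self.count = 0                         # notes accepted this turn

    def accept(self, player, note):
        """Return the note if it is this player's turn, else None."""
        if player != self.players[self.active]:
            return None                        # gesture is ignored, not an error
        self.count += 1
        if self.count >= self.notes_per_turn:  # pass the turn along
            self.count = 0
            self.active = (self.active + 1) % len(self.players)
        return note

gate = TurnTaker(["ann", "bo", "cy"], notes_per_turn=2)
print(gate.accept("ann", 60))   # ann's turn -> 60
print(gate.accept("bo", 62))    # not bo's turn -> None
print(gate.accept("ann", 64))   # ann's second note; the turn passes to bo
print(gate.accept("bo", 62))    # now accepted -> 62
```

Silently ignoring out-of-turn gestures, rather than raising an error, matches the forgiving, walk-up-and-play behavior these interfaces aim for.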

2.2 Player capacity and aptitude

Collaboration between players producing sound and music can broadly be classified within two general design criteria: capacity and aptitude. Capacity addresses the number of players the interface will accommodate. Aptitude considers the skill of the target demographic as novice or expert players. In Table 1, at one end of the capacity scale, the interface is designed for one person but affords interaction with multiple players. An example of this is a piano played as a duet. At the other end of the spectrum are interfaces designed specifically for more than one player. Within both categories, the collaborative interface may be played using either a single interface (i.e., a piano) or a separate instrument for each player, as in a jazz ensemble. With regard to players' level of expertise, we see collaborative interfaces intended for either novices or experts. This distinction is not meant to imply that a musical experience designed for novices will not support expert performance and vice-versa. Rather, the intent of the design prioritizes ease-of-use and easy access.

The distinction between multiple players each having a separate interface, comprising an ensemble, versus multiple players sharing one interface, significantly affects the overall sonic output. In an ensemble, players perform on their own instruments, and each instrument creates individual timbres that contribute to the combined group sound. With a collective interface, players generally contribute to produce a unified sound from the same sound source. Essentially, the combination of the players' activities contributes to the global mix in both cases, but the methods and opportunities for expert play vary accordingly. Our focus in this article is the first column of Table 1, that is, collaborative interfaces designed for novice experience.
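The capacity and aptitude criteria above can be captured in a small data model. This sketch is only illustrative; the field names are our own shorthand, and the example entries are drawn from Table 1.

```python
# Illustrative data model for the design criteria in Table 1.
# Field names are our own shorthand, not terminology from the article.
from dataclasses import dataclass

@dataclass(frozen=True)
class CollabDesign:
    name: str
    players: str        # capacity: "single" or "multiple" players
    interfaces: str     # "single" shared interface or "multiple" interfaces
    aptitude: str       # target demographic: "novice" or "virtuoso"

EXAMPLES = [
    CollabDesign("Iamascope",        "single",   "single",   "novice"),
    CollabDesign("Musical Trinkets", "single",   "multiple", "novice"),
    CollabDesign("Beatbugs",         "multiple", "single",   "novice"),
    CollabDesign("Mikrophonie II",   "multiple", "multiple", "virtuoso"),
]

# The article's focus: collaborative interfaces designed for novices.
novice_focus = [d.name for d in EXAMPLES if d.aptitude == "novice"]
print(novice_focus)   # -> ['Iamascope', 'Musical Trinkets', 'Beatbugs']
```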

2.3 Complexity v. expressivity: What are the tradeoffs?

“Interactive instruments embody all of the nuance, power, andpotential of deterministic instruments, but the way they functionallows for anyone, from the most skilled and musically talentedperformers to the most unskilled members of the large public,to participate in a musical process.” (Chadabe, 2002)

Historically, the field of musical controllers has advanced primarily through the creation of highly complex single-player instruments developed for experts, as opposed to multiplayer interfaces and environments designed for novices. Perry Cook, a pioneering instrument builder, is an avid proponent of creating "playful" interfaces that avoid the look and feel of traditional instruments (Cook, 2001). Developing musical interfaces using familiar objects that ordinarily serve another purpose, or inventing entirely new instruments, can change the level of musical expectation by redefining "expert" and "novice" interplay as the basis for engagement. Designers of collaborative devices that are easy to control but have limited expressive capabilities are challenged not only to conceive of opportunities for musical exploration, but also to cultivate meaningful social interactions and experiences for the players. The control functionality of the interface or underlying technology should be comprehensible and transparent. It should also inspire interaction between the players, rather than focus on the interaction between the player and the interface. In a collaborative musical environment, it becomes even more imperative that the technology serve primarily as a catalyst for social interaction, rather than as the focus of the experience (Robson, 2001).

Table 1. Examples of Design Criteria for Collaborative Musical Experiences (rows: Capacity; columns: Aptitude).

Single player, single interface. Novice: Electronic Bullroarer, Iamascope. Virtuoso: Duet on piano.
Single player, multiple interfaces. Novice: Musical Trinkets. Virtuoso: Jazz ensembles.
Multiple players, single interface. Novice: Beatbugs, Squeezables, Audio Grove, Sound Mapping, Speaking Orbs, Jamodrum. Virtuoso: Mikrophonie I, Tooka.
Multiple players, multiple interfaces. Novice: Augmented Groove, Brain Opera, Drum Circle. Virtuoso: Mikrophonie II.

Conversely, interfaces that have extended expressive capabilities tend to be more difficult to control and cater more to the expert player. As with traditional instruments, the more difficult an interface is to control, the more intimidating it will be for entry-level players. For designers of most musical interfaces, the overriding challenge is to strike a balance of multimodal interaction using discrete and continuous controls (Tanaka & Knapp, 2002; Verplank, 2001), and generally to limit, rather than increase, the number of features and opportunities for creativity (Cook, 2001). Finding the balance between virtuosity and simplicity provides fertile ground for new collaborative experiences. The flexibility of the computer provides designers with a wealth of input/output options, new sound-making techniques and an infinite variety of mappings that have opened the door to completely new ways of creating multiplayer interfaces for novices.

2.4 Design practice

Natural mapping behaviors evolve from the creation of a direct relationship between gesture and musical intent. Truly intuitive mapping further takes into account socio-cultural sensibilities, as well as metaphors that motivate physical and sonic analogies. Players' perception of control in collaborative musical environments can be increased by creating pre-determined musical events, subject to players manipulating complex parameters of sound through gestures such as stretching or squeezing (Weinberg & Gan, 2001). The illusion of control can also be enhanced with supplemental effects such as lighting and visual imagery to create a highly responsive system based on player input. While the use of pre-composed musical events or sequences severely limits certain aspects of an individual's creative control, it has the benefit of creating more cohesive sound spaces in multiplayer environments. With these mappings, players are not responsible for playing specific notes, scales or harmonies, which helps to minimize chaotic musical interaction.
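The kind of constrained mapping described above, where a continuous gesture shapes pre-determined musical material instead of selecting raw pitches, can be sketched as follows. This is a hypothetical mapping of our own, not one taken from Squeezables or any other system in this article.

```python
# Hypothetical sketch of a constrained gesture-to-music mapping:
# a continuous "squeeze" value (0.0 to 1.0) never selects raw pitches;
# it only picks a step inside a fixed pentatonic scale and scales a
# filter parameter, so no player input can fall outside the sound space.

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, MIDI note numbers

def map_gesture(squeeze: float) -> dict:
    squeeze = min(1.0, max(0.0, squeeze))          # clamp sensor noise
    step = int(squeeze * (len(PENTATONIC) - 1))    # quantize to a scale step
    return {
        "note": PENTATONIC[step],                  # always in-scale
        "cutoff": 200 + squeeze * 4000,            # continuous timbre control
    }

print(map_gesture(0.0))   # lowest scale tone, closed filter
print(map_gesture(1.0))   # highest scale tone, open filter
```

Because every reachable note lies in one scale, several players mapped this way cannot produce "wrong" notes relative to each other, which is exactly the chaos-minimizing property the paragraph above describes.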

Design issues regarding the input interface, input-to-output mapping and the output interface are of the utmost relevance. The design of these interfaces and mappings is the topic of much research.1 Thus, the type of collaborative interface depends on a number of dimensions, including range, sensor, directed interaction, and pathway to expert performance. Good design practice for these interfaces, whether cooperative or not, overlaps with issues regarding human-computer interaction (Orio, Schnell, & Wanderley, 2001). Such issues include usability, ease of learning, and functionality, to name a few. Of course, as these interfaces are intended to provide an expressive experience, additional factors must be considered as well. While important in their own right, in this article HCI design factors are discussed in specific relation to their effects on the success of the collaborative musical experience.

3. Contemporary directions for collaborative interfaces

Table 2 lists a number of collaborative interfaces for musical experiences. Each system is unique and offers more subtlety of design than the specific focus of this article. We have attempted to provide a wide range of different systems, though certainly not all-inclusive, to address some of the issues encountered in configuring multiplayer collaborative interfaces and experiences. When considering the design of these interfaces, the most significant context elements are: Focus, Location, Media, Scalability and Player Interaction. Other constraints and design parameters considered in the creation of meaningful experience within a musical medium include Musical Range, Physical Interface/Sensor, Directed Interaction, Learning Curve, Pathway to Expert Performance and Level of Physicality between Players (and Instrument). These are summarized in Table 2 and discussed in the following subsections. Generally, the designers of the collaborative musical experiences for novices discussed herein have made similar choices regarding the context and constraints of development, with interesting variations in elements of system design. We describe eleven systems in more detail in Section 4.

3.1 Focus

The sonic output of most musical interactions is focused toward an audience. However, with the collaborative interfaces listed in Table 2, the sound and music are primarily intended to enhance the social interaction between players. This may or may not be very interesting for audiences to listen to, as it is sometimes difficult to ascertain subtle interactions between players on an unfamiliar interface. In contrast, in keeping with the traditions of an orchestra or an ensemble in live performance, an audience's appreciation of the effort players make to create music can enhance musical expression. To achieve this, some mechanism must be in place for audiences to understand the relationship between players' control and the music produced. As discussed in Fels et al. (2002), this is called increasing the audience's transparency. Generally, with novice participants, the sound generated within collaborative musical environments is intended for the players.

3.2 Location

Many collaborative interfaces for musical expression are created as installations for public exhibition. In these instances, people usually gather around an instrument at a specific location and play together. Because they are co-located, players can see each other's gestures and more readily understand the relationship between each player's actions and the sounds produced. However, if the sounds are not easily attributable to specific actions or devices, then players must find other ways to communicate. As will be discussed in Section 4, Beatbugs (Weinberg, Aimi, & Jennings, 2002) (Fig. 4), Musical Trinkets (Paradiso, Hsiao, & Benbasat, 2001) (Fig. 8), and SoundMapping (Mott & Sosnin, 1997) (Fig. 9) all work around this issue in a variety of ways.

1 (Paradiso, 1997), the Organized Sound special issue on mappings, and the New Interfaces for Musical Expression (NIME) proceedings all address these design issues.

Table 2. Dimensions of Collaborative Interface Design. For each system: Media; Scale (number of players); Player Interaction; Musical Range/Notes; Physical Interface/Sensor; Directed Interaction; Learning Curve; Path to Expert Performance; Level of Physicality between Players (and Interface).

Audio Grove (Moeller, 1997). Media: Sound, Light, Device. Scale: 1–30. Interaction: Players touch 5.5 m rods mounted on a large platform. Musical range: Players control DSP. Sensor: Touch, capacitive sensing. Directed interaction: Medium. Learning curve: Fast. Path to expert: No. Physicality: High.

Augmented Groove (Poupyrev et al., 2001). Media: Sound, Image, Device. Scale: 1–3. Interaction: Players move disks detected by camera. Musical range: Players control DSP (filters, effects and samples) over pre-composed loops. Sensor: Camera, head-mount display, glyph disks. Directed interaction: Medium–High, with facilitator present to explain mapping. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

Beatbugs (Weinberg et al., 2002). Media: Sound, Device. Scale: 1–8. Interaction: Players hit embedded piezos and manipulate bend sensors. Musical range: Players control rhythmic input and DSP (filters, oscillators, modulators, noise generators, etc.), rhythmic ornamentation. Sensor: Infrared, bend sensors, piezos. Directed interaction: High; relies upon training workshops, distributed leadership. Learning curve: Slow. Path to expert: Possibly. Physicality: High.

Bullroarer (Robson, 2001). Media: Sound, Device. Scale: 1–3. Interaction: Players spin digital aerophone. Musical range: Players manipulate drones, tempo, rhythmic density and sequences of notes. Sensor: Slider potentiometer. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Composition on the Table (Iwai, 1998). Media: Image, Sound, Light, Device. Scale: 2–6. Interaction: Button presses allow players to change the direction of arrows associated with each node to create loops and rhythms. Musical range: Players control rhythm and loop constructs via MIDI; pitch is pre-determined. Sensor: Buttons. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: Med.

Crackle Stage, Crackle Tea Party (Waisvicz & Balde, 1976). Media: Sound, Device. Scale: 2–4. Interaction: Players combine sound-producing circuits with kitchen implements, liquid and finger contact. Musical range: Limited; analogue electronic sounds. Sensor: Touch, conductivity with foil condensors. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Currents of Creativity (D'Arcangelo, 2001). Media: Image, Sound, Device. Scale: 1–6. Interaction: Players launch visual collages with music corresponding to graphic images. Musical range: Pre-composed loops. Sensor: Computer kiosk (lightbox). Directed interaction: High. Learning curve: Fast. Path to expert: No. Physicality: Med.

Iamascope (Fels & Mase, 1999). Media: Image, Sound. Scale: 1–3. Interaction: Player's movement is detected by video camera and mapped to musical zones. Musical range: Players' gestures trigger harmonic musical sound sets. Sensor: Camera. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Interactive Dance Club (Ulyate & Bianciardi, 2001). Media: Image, Sound, Device. Scale: 10–100's. Interaction: Players break IR and light beams, hit/step on pads, etc. Musical range: Players trigger pre-composed musical phrases, chords, percussion loops and effects. Sensor: Varied; infrared sensors, taptiles, proximity sensors, etc. Directed interaction: Medium; overall experience facilitated by Experience Jockey (EJ). Learning curve: Fast. Path to expert: No. Physicality: High.

Jamodrum (Blaine & Perkis, 2000). Media: Image, Sound. Scale: 1–12. Interaction: Players hit drum pads. Musical range: MIDI-triggered samples and percussive sounds. Sensor: Drum pads. Directed interaction: Medium–High; virtual facilitator elicits call-and-response behavior. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

Jamoworld (Blaine & Forlines, 2002). Media: Image, Sound. Scale: 1–4. Interaction: Players turn disks and hit drum pads. Musical range: Pre-composed musical segments plus real-time MIDI events. Sensor: Drum pads and turn-table disks with optical encoders. Directed interaction: Medium–High; relies on distributed leadership. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

MidiBall (Jacobson et al., 1993).8 Media: Sound, Image, Device. Scale: 1–20,000. Interaction: Audience members hit ball at random. Musical range: MIDI-triggered samples. Sensor: RF. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Music Navigatrics (Pardue & Paradiso, 2002). Media: Sound, Device. Scale: 1–5. Interaction: Players wear tagged objects (rings) in proximity to reader. Musical range: Arpeggiation, tonal manipulation, sequence control, plus objects to record, overdub, playback and control tempo. Sensor: Swept frequency tag reader. Directed interaction: Med–High; facilitator helps explain mapping to players. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

Musical Trinkets (Paradiso et al., 2001). Media: Sound, Device. Scale: 1–5. Interaction: Players wear tagged objects (rings) in proximity to reader. Musical range: Trinkets launch MIDI notes and effects (pitch shift, vibrato, etc.). Sensor: Swept frequency tag reader. Directed interaction: Med–High; facilitator helps explain mapping to players. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

Piano Cubes (Robson, 2001). Media: Sound, Device. Scale: 1–2. Interaction: Players tilt jar interface. Musical range: Players control tempo and pitch of arpeggios. Sensor: Tilt sensors, accelerometer. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Resonance of 4 (Iwai, 1994). Media: Image, Sound, Device. Scale: 1–4. Interaction: Players use mouse to select sounds. Musical range: Limited. Sensor: Buttons, mouse. Directed interaction: Medium. Learning curve: Fast. Path to expert: No. Physicality: Med.

Rhythm Tree (Paradiso, 1999).9 Media: Sound, Lights, Device. Scale: 1–50. Interaction: Players hit custom urethane drum pads. Musical range: MIDI-triggered vocal samples, percussive sounds and word fragments. Sensor: Drum pads. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Sound Mapping (Mott & Sosnin, 1997). Media: Sound, Device. Scale: 1–4. Interaction: Players roll and tilt suitcases in zones where music is mapped to players' motion and location. Musical range: Players control timbre, pitch and rhythm. Sensor: GPS, tilt, accelerometers. Directed interaction: High; facilitator helps explain mapping to players. Learning curve: Med–Fast. Path to expert: No. Physicality: High.

Speaking Orbs (Ask, 2001). Media: Sound, Device. Scale: 1–8. Interaction: Players wave hands over orbs, interrupting light beams. Musical range: MIDI-triggered samples. Sensor: Photo-resistors. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

Squeezables (Weinberg & Gan, 2001). Media: Sound, Device. Scale: 1–3. Interaction: Players squeeze, twist and pull foam balls to control musical output. Musical range: Players control and manipulate timbre, melody, pitch and filters. Sensor: DSP, FSRs, potentiometers, variable resistors. Directed interaction: High; needs supervision in novice setting. Learning curve: Fast. Path to expert: No. Physicality: High.

Stretch (Robson, 2001). Media: Sound, Device. Scale: 1–4. Interaction: Players press latex rubber membrane. Musical range: Players control pitch, volume, timbre and filters. Sensor: Potentiometer with latex membrane. Directed interaction: Low. Learning curve: Fast. Path to expert: No. Physicality: High.

8 MidiBall was created in 1991.
9 Rhythm Tree was created in 1996.

With the growth of the Internet, a new genre of networked collaborative interfaces has appeared. In these systems, players communicate over a network from non-specific locations, from virtually anywhere in the world. Although this is an evolving and exciting area, it is beyond the scope of this article and will be left for future research.2

3.3 Media

A number of collaborative interfaces combine audiovisual elements as a way of enhancing communication and creating more meaningful experiences. In Table 2, we have indicated the use of sound and image as elements of the interaction. We list these media in order of importance in the design process and as focus elements of system playability. The use of visual imagery can facilitate the collaborative experience by reinforcing the responsiveness of the system to players' actions. However, visual imagery can also distract players from seeing other players' actions, or from attending to aural elements, or both. Some of the systems that include visual imagery as the primary medium are Jamoworld (Blaine & Forlines, 2002), Jamodrum (Blaine & Perkis, 2000) (Fig. 7), Iamascope (Fels & Mase, 1999) (Fig. 6) and Currents of Creativity (D'Arcangelo, 2001).

3.4 Scalability

By their very nature, collaborative interfaces are designed for a minimum of two players. However, the number of players greatly influences the types of interfaces and music that are appropriate. An interface built for two people is generally quite different from one built for tens, hundreds or thousands of players. When considering scale, factors such as turn-taking protocols and gesture-sound correspondences shift as the number of players increases. For example, it does not make sense to expect turn-taking protocols to emerge in an interface with three hundred drum pad inputs distributed through a large area, as embedded in the RhythmTree structure (Paradiso, 1999). Directly refuting this notion is the MidiBall (Jacobson, Blaine, & Pacheco, 1999) (Fig. 1) interface, where only a few people are physically able to hit the ball at one time, even if hundreds or thousands of people are present.

3.5 Player interaction

Each player in a collaborative instrument has his or her own set of controls. Although the control devices may be identical or different for each player, the underlying method of interaction is quite often the same. If all players have identical interfaces and are able to identify their sonic contributions, then it becomes more obvious what other players are doing to cause a response from the system. Once this has been achieved, each player should (theoretically) understand the other players' gestures and potentially have a stronger sense of connection with them, as well as an appreciation of their skills. This can lead to a more relaxed environment and more spontaneous group behaviors, since new players can immediately learn from each other. For our examples, we have focused mainly on interfaces that are identical for each player.

3.6 Musical range/Notes

The most common technique used to provide an easily learned interface is to limit the range of notes/sounds that any action creates. This is consistently achieved by restricting the players' opportunities for extended musical exploration while keeping the sounds euphonic. For example, providing players with short musical phrases, percussion loops, or melodies that are constrained by key, tempo or rhythm are proven methods of designing a limited range of elements that can still be satisfying and fun to play. Several experiences discussed herein, such as Augmented Groove (Poupyrev, Berry, Billinghurst, Nakao, Baldwin, & Kurumisawa, 2001) (Fig. 3), Musical Trinkets (Fig. 8), Composition on the Table (Iwai, 1998) (Fig. 5), and Squeezables (Weinberg & Gan, 2001) (Fig. 10), approach limiting the potential for chaotic musical interaction between novice players by adding control over effect algorithms of pre-composed or algorithmically generated music. For most collaborative musical experiences we discuss, the range is limited, although several systems include real-time control over digital signal processing (DSP) parameters.

Fig. 1. D'CuCKOO MidiBall.

2 For more information about interconnected musical networks, see Weinberg (2002).
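The constraint techniques discussed in Section 3.6 — restricting pitches to a key and quantizing timing to a tempo grid — can be sketched as follows. This is hypothetical illustration code, not taken from any of the systems discussed; the scale and grid resolution are arbitrary choices.

```python
# Hypothetical sketch of two common novice-friendly constraints:
# snapping pitches to a scale and quantizing onsets to a tempo grid.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # pitch classes of C major

def snap_to_scale(midi_note, scale=C_MAJOR):
    """Move a MIDI note to the nearest pitch class in the scale."""
    octave, pc = divmod(midi_note, 12)
    nearest = min(scale, key=lambda s: min(abs(s - pc), 12 - abs(s - pc)))
    return octave * 12 + nearest

def quantize_onset(time_s, bpm=120, subdivision=4):
    """Snap an onset time (seconds) to the nearest grid point,
    e.g. sixteenth notes at the given tempo."""
    grid = 60.0 / bpm / subdivision
    return round(time_s / grid) * grid

print(snap_to_scale(61))     # C#4 snaps to C4 -> 60
print(quantize_onset(0.26))  # -> 0.25 at 120 BPM sixteenths
```

However a player flails, every note lands in key and on the grid — which is exactly the trade-off the text describes between immediate euphony and expressive range.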

3.7 Physical interface/sensor

Designers of collaborative interfaces can choose from an extensive selection of sensors, software and signal processing options. Joysticks, ultrasound, infrared, accelerometers, potentiometers, force-sensitive resistors, piezos, magnetic tags, and many more sensor technologies are available to those interested in converting voltage data into MIDI or routing through other sound synthesis systems, such as Max/MSP™,3 SuperCollider,4 and Open Sound World.5 Measuring changes in motion, light, gravity, pressure, velocity, skin conductivity, or muscle tension are just a few of the ways that a player's gestural input can be turned into musical output. The ways in which a physical interface and sensors are integrated are of primary importance, as they provide the affordances (Norman, 1990) that suggest what a player should do. Thus, the type of physical interface/sensors used is critical to the success of the system. The "squeezing" and "waving" gestures in Squeezables (Weinberg & Gan, 2001) (Fig. 10) and Speaking Orbs (Ask, 2001) (Fig. 2), respectively, are noteworthy examples where the affordances of each interface provide clear causal feedback and suggest obvious methods of interaction.
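As a rough sketch of the voltage-to-MIDI conversion described above: the message format below is standard MIDI, but the 10-bit sensor range and the choice of controller number are invented for the example.

```python
# Hypothetical sketch: scale a raw sensor reading (e.g. a
# force-sensitive resistor read through a 10-bit ADC) into a
# 3-byte MIDI control-change message. Ranges are illustrative only.

def sensor_to_cc(raw, raw_min=0, raw_max=1023, channel=0, controller=74):
    """Map a raw ADC value to a MIDI control-change message."""
    raw = max(raw_min, min(raw, raw_max))                 # clamp to range
    value = (raw - raw_min) * 127 // (raw_max - raw_min)  # scale to 0..127
    status = 0xB0 | (channel & 0x0F)                      # CC status byte
    return bytes([status, controller & 0x7F, value])

msg = sensor_to_cc(512)
print(msg.hex())  # 'b04a3f' -> CC 74, value 63, channel 1
```

The same clamp-and-scale pattern applies whether the target is MIDI or an OSC-style message into Max/MSP, SuperCollider or Open Sound World.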

3.8 Directed interaction

Group dynamics and social interplay for novices are often achieved by directing the players' interaction. Augmented Groove (Fig. 3), Beatbugs (Fig. 4), Musical Trinkets (Fig. 8) and SoundMapping (Fig. 9) are experiences that initially provide a knowledgeable person to assist the players. Another effective method for constraining the musical space is accomplished through distributed leadership and turn-taking behaviors. Beatbugs (Fig. 4) integrates different play modes with session leaders who "pass" rhythmic motifs amongst the group to enable real-time manipulation and response to sonic events. The Jamodrum (Fig. 7) software elicits a "call and response" behavior as a means of orchestrating the players' experience and allowing opportunities for individuals to hear their contributions to the overall mix.

3.9 Learning curve

A key component in developing communication skill with a new device is the amount of time it takes to learn to use it. Most of the collaborative interfaces discussed herein are used in installation settings where the expected time of play is on the order of five to ten minutes, with the exception of Beatbugs and Squeezables. Particularly in museum environments with large numbers of visitors, it must be immediately obvious how an interface works. On the other hand, musical expression usually requires a significant investment of time in order to master an instrument before subtlety can be achieved. Therefore, instruments designed for more virtuosic forms of expression are generally not learned quickly, and most collaborative interfaces offer more rewarding musical experiences with practice. The evaluation of any collaborative instrument and its context requires an evaluation of the tradeoff between speed of learning and musical constraints. Thus, even if an interface is designed for a fast learning curve, it still may not be immediate enough to engage a novice, potentially leading to disappointment or failure of the musical experience.

3.10 Pathway to expert performance

Ideally, a collaborative musical instrument would be initially easy to learn. With the introduction of different modalities such as "novice," "intermediate," and "expert" modes to provide incremental levels of expression, a player can then continue to refine his or her range of musical expression. Traditional acoustic musical instruments each have different entry levels in order to become musically adept. However, they all share the capacity to provide subtle forms of musical expression as players develop their skills. With collaborative interfaces, supporting a pathway to expert performance is difficult because the ease of learning is often realized by restricting the range of musical possibilities available to the player through computer-mediation. Nevertheless, it is exactly this broader range of musical possibilities that is necessary for expressive expert performance. In our examples, all of the devices have limited potential for experts except possibly Beatbugs. Interestingly, the musical space of the Iamascope does not provide a pathway to expert performance but the image space does.

Fig. 2. Speaking Orbs (Ask, 2001).

3 Max/MSP is a trademark of Cycling '74, 379A Clementina Street, San Francisco, CA 94103 USA. Available: http://www.cycling74.com/
4 Available: http://www.audiosynth.com
5 Available: http://www.cnmat.Berkeley.EDU/OSW

3.11 Level of physicality between players (and instrument)

The availability of new sensors and computer interfaces for building novel musical controllers allows the creation of instruments that can involve virtually every part of the human body, including brain waves, muscle activations (Tanaka & Knapp, 2002) and tongue movements (Vogt, McCaig, Ali, & Fels, 2002). Many collaborative instruments encourage various levels of movement, gesture, touch, and physical interactions such as dancing with strangers in highly customized environments. These design strategies lay the foundation for developing intimate personal connections with other players and their instruments over relatively short periods of time, which can also lead to a sense of community. Most often, it is not the interface itself that makes for an engaging, satisfying experience, but the group ambience and development of synergistic relationships between players that leads to positive communal experiences. Most examples discussed herein provide a high level of physicality, with the exception of Currents of Creativity, Composition on the Table, and Resonance of 4 (Iwai, 1994), which rely on more discrete interactions prompted by touchscreens, mice, faders and buttons.

4. Collaborative interfaces

In this section, we examine eleven collaborative interfaces, listed alphabetically, in more detail. With each of these design approaches, the more people that are involved in playing individual notes, the faster the music tends to become chaotic without the introduction of directed interaction or orchestration. Some of these interfaces address this issue by not giving players direct control of individual notes, providing control only over effect algorithms and digital signal processing parameters of the music such as pitch shifting, volume data, filters, and so forth. As the scalability of an instrument and the subsequent density of the soundscape increase, it becomes more difficult for individual players (or observers) to fully understand the interaction. This is primarily due to the difficulty of detecting the players' impact on the system and identifying a clear causal relationship between specific actions triggering audio and/or visual feedback. Accordingly, a variable learning curve often emerges, partially dependent on the number of other players, but also upon their willingness to interact and help each other. People's physicality (level of physical presence and engagement), non-self-conscious behavior and active participation are also influenced by the playfulness, tangibility and novelty of these devices. As a result, the reciprocal environment appears to offset the disadvantages underlying the upward scalability factor of these instruments. Ultimately, the excitement generated by engaging with groups of people playing together infuses the public space with a sense of community.

4.1 Augmented Groove (Poupyrev et al., 2001)

Augmented Groove (Fig. 3) relies upon human gestures to control the modulation and real-time mixes of music by moving disks made from vinyl LP records in the interaction space. Twisting, shaking, tilting and moving the disks up and down all produce different musical and graphical effects on loops of pre-composed music. Each player wears an augmented reality (AR) headset that has a camera and video display, allowing animations to be overlaid on a video image of the real disks they play with. Each disk or "glyph" has unique markings detected by the head-mounted camera. This interface was developed to allow people to DJ interactively without using dials, sliders and LPs. In essence, this approach "levels the playing field" between expert DJs and non-musicians, since everyone is a first-time player. The motions of the players' records control filters, effects and samples dynamically mixed in and out of the groove. Within this collaborative DJ environment, players are required to become quite physical with each other by collectively arranging their disks within the camera's range and physically maneuvering within that space to affect the overall sonic composition. While the act of moving a disk is relatively simple, depending on the type of effects applied to the sound and the number of other players, it is not always obvious to participants what their individual contribution is to the overall mix. The AR gear allows players to see the control amplitude of each record overlaid on the real disk but not the musical quality, that is, what each record controls. Within the AR environment, players can see each other, though focus is usually on the playing space where only players' hands and the disks are seen. Also, players' eyes are covered by the AR gear, which inhibits facial-gesture-based interaction. In spite of the musical ambiguity, players dance around enjoying the unique experience of being a virtual DJ and being able to manipulate sound through their actions both individually and as a group.

4.2 Beatbugs (Weinberg et al., 2002)

The Beatbugs are hand-held, locally networked percussive toys that allow up to eight players to create collaborative compositions by sharing and modifying rhythmic motifs.


Players enter rhythmic patterns by hitting an embedded piezo sensor and manipulate pitch, timbre, and rhythmic elements by bending two antennae-like sensors (Fig. 4). Random turn-taking behaviors are integrated into two of the three interaction designs developed for this interdependent series of instruments. The "Drum Circle" mode requires a session leader to conduct and facilitate the process of creating, embellishing and layering a series of interlocking polyrhythmic patterns for each participant. The "snake" interaction design adds another layer of complexity as players exchange rhythmic patterns and also modify DSP parameters of the music.

Week-long workshops are held in various cities to teach children how to play with the Beatbug devices, ultimately leading to a final performance. Children are initially taught in a free-play improvisational mode where they have more individual control of timing their rhythmic patterns. In performance, Beatbug compositions are quantized in order to maintain an overall musical cohesiveness. The young performers physically arrange themselves onstage in snake-like configurations. Once the leader has established a rhythmic motif and hits their Beatbug to "send" the rhythm, the computer automatically "passes" the rhythmic pattern to another player at random. The receiving player becomes the new "leader" and has the option to further manipulate the motif via the bend sensor antennae, or may choose to "keep" the transformed motif and create a new rhythmic pattern to transmit within the group.
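The random "pass" mechanic described above can be sketched as a simple protocol. This is a hypothetical reconstruction for illustration, not the actual Beatbugs implementation; player names and the motif representation are invented.

```python
# Hypothetical sketch of the "snake" passing protocol: a leader's
# motif goes to a randomly chosen other player, who becomes the new
# leader and may transform the motif before it is heard or re-sent.
import random

def pass_motif(motif, players, leader, modify=None, rng=random):
    """Send the motif to a random player other than the current
    leader; optionally transform it (e.g. via bend-sensor input)."""
    others = [p for p in players if p != leader]
    new_leader = rng.choice(others)
    new_motif = modify(motif) if modify else motif
    return new_leader, new_motif

players = ["bug1", "bug2", "bug3", "bug4"]
leader, motif = "bug1", [1, 0, 1, 1, 0, 1, 0, 0]  # onset pattern
leader, motif = pass_motif(motif, players, leader)
```

The exclusion of the current leader from the draw is what guarantees the turn actually changes hands, producing the spontaneity the text attributes to random transmission.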

This method of real-time musical composition and transformation, coupled with the random transmission between players, introduces an aspect of spontaneity and surprise into the turn-taking actions. Visual feedback to augment the players' rhythmic exchanges is reinforced by a flashing ring of LEDs. From the audience's perspective, it is unclear whether the types of gestures that players initiate and the subsequent sounds provide a clear causal relationship. It could be argued that the subtleties of these musical interactions appear to be better suited for the participants than for an audience. It is also difficult to determine how much control each player has over her individual input and what parameters she is able to manipulate, even though each Beatbug has its own speaker to provide spatialized sound for the players. Ultimately, this leaves the observer with an enjoyable but somewhat ambiguous perspective regarding the collaborative performance. At this time, the Beatbugs are not robust enough to be used in an uncontrolled setting and require an intensive training period. The designers believe that in spite of the pre-composed algorithmic generation of events, there is a path toward mastery of these instruments by learning to refine player control of the bend sensors (Weinberg, personal communication, Nov. 2002).

Fig. 3. Augmented Groove (Poupyrev et al., 2001). Left image shows player, right shows how to play with system.

Fig. 4. Beatbugs (Weinberg et al., 2002).

4.3 Bullroarer (Robson, 2001)

Inspired by one of the first known instruments, the bullroarer, an electronic aerophone was invented to emulate the physical and sonic properties of this analog device in the digital realm. The focus of this interactive musical exploration was to create a simple interface for non-musicians. In the design of a traditional bullroarer, a player spins a hollow object at the end of a rope to create sound, so an analogous series of electronic controllers was developed to allow people to "spin" together. Experimenting with a variety of ways to affect the collective sound space led to manipulating drone sounds, playing back varied tempos and mixing sequences of notes. The speed of a person spinning produces other sonic responses, including the layering of effects, such as echo, to create timbral changes to the music. Another approach let each of three bullroarers' whirling speeds control the rhythmic density and playback of synchronized rhythm and bass tracks. All of these design methods severely limit the players' control over specific notes. Nevertheless, the physically engaging and simple action of spinning with up to three other players, and a collective impact on the overall soundscape, give players the illusion of enough control to sustain interest and playability. Robson also created a collaborative interface called Stretch (Robson, 2001). This interface encouraged people to hit latex wall panels with sensors mounted underneath to create a variety of sounds. Observing multiple players with the Bullroarer and Stretch devices led Robson to the following conclusion: "Unlike musicians who will approach the objects as musical instruments that demand being mastered, the naïve users respond better to the playful qualities of the objects."

4.4 Composition on the Table (Iwai, 1998–1999)

Composition on the Table uses a large horizontal projection surface with a light grid on the display, as shown in Figure 5. At each node of the grid, a button allows the player to change the direction of the arrow associated with the node. A colored ball of light moves along the lines of the grid and then follows a path decided by the direction of the arrow at each node it encounters. When a node is hit, a specific MIDI note plays. There are four lights moving at different speeds. Thus, with careful selection of the direction of the arrows, complex loops and rhythms can be created. The table is large enough for multiple people to change the patterns. Working with multiple players allows very complex sounds that cannot be made as effectively by a single person, since there are too many lights active at the same time.
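The node-and-arrow mechanic lends itself to a compact simulation. The following is a hypothetical model that assumes a square grid with wrap-around edges and an arbitrary note mapping; the installation's actual rules may differ in detail.

```python
# Hypothetical model of Composition on the Table: a light travels
# across a grid; at each node the arrow redirects it and a MIDI
# note assigned to that node fires.

DIRS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def step(pos, arrows, size):
    """Advance the light one node; return the new position and the
    note triggered there (node index used as a stand-in pitch)."""
    x, y = pos
    dx, dy = DIRS[arrows[pos]]
    nx, ny = (x + dx) % size, (y + dy) % size  # wrap at the edges
    note = 60 + ny * size + nx                 # illustrative mapping
    return (nx, ny), note

arrows = {(x, y): "right" for x in range(4) for y in range(4)}
arrows[(3, 0)] = "down"   # players repoint arrows to build loops
arrows[(3, 1)] = "left"
pos = (0, 0)
for _ in range(6):
    pos, note = step(pos, arrows, 4)  # light settles into a 2-node loop
```

Because the light's speed is constant, repointing arrows turns spatial patterns directly into rhythmic loops, which is the compositional core of the piece.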

Composition on the Table reveals many characteristics that contribute to a highly successful collaborative instrument and experience. The interface is very simple: buttons that have the affordance to be pressed. Also, the arrow under each button provides clear feedback about the state of the node, so players can anticipate the sounds created. The interface is fun and game-like, as players are challenged to keep the lights moving to make interesting sounds. The piece severely limits the notes that players can play with. Essentially, pitches are fixed and it is up to the players to make the arrows point in the directions they want to get sound. Effectively, players are controlling rhythm and creating looping constructs. Players do not have to be constantly interacting with the piece, as the music is a function of the autonomous movement of the lights. This lack of physicality is in contrast to many of the other examples we discuss.

The music from the work is not intended for an audience. Rather, participants play with the lights to indirectly affect the underlying musical process. The sound and lights motivate players to work the buttons together to get the lights to move in interesting ways. Also, because the process continues at a predictable speed, players can anticipate upcoming sounds from the lights, making it more fun. Composition on the Table is an excellent example of how a very simple interface with restricted musical output creates a successful collaborative interface. The underlying musical process is easy to understand for all players, providing rapid entry for novices. The sound output is aesthetically appealing as players compose complex rhythms from the lights moving at different speeds. Finally, though there is only a limited upward path for expert performance, the system sustains rewarding group interplay.

4.5 Iamascope (Fels & Mase, 1999)

The Iamascope is an interactive, electronic kaleidoscope that combines computer, video, graphics and audio technology for participants to create striking imagery and sound (Fig. 6). In the installation, the players take the place of a colorful piece of floating glass inside a kaleidoscope, and simultaneously view a kaleidoscopic image of themselves on a large screen in real time. The Iamascope uses a single video camera as input at the base of the projection screen. Anything or anyone in front of the camera is captured and turned into a large kaleidoscopic image using multiple reflections of a small extract of the video image. By applying image processing to the multicolored visuals, participants' body movements directly control music in parallel with changes to the image. The image processing uses simple intensity differences over time, calculated in real-time. The intensity differences are grouped into ten active zones in the interaction space in front of the projection screen that activate sound when the player moves. Movement in each zone is mapped to a sound like a ten-string guitar; players' gestures above a certain threshold cause a single note to play, similar to stroking a string. However, unlike a guitar, the computer changes the chords periodically according to pre-specified melodies.

Fig. 5. Composition on the Table (Iwai, 1998–1999).

Having the computer select the notes of correspondence within each zone ensures that melodic soundsets are created. Thus, novices can very easily create harmonic musical patterns. This works well in an installation setting where people spend approximately five to thirty minutes, but rarely focus on improving their musical output. Quite often, players become adept at controlling the complex imagery in a short time. The visual fascination that results tends to render the musical aspects of the Iamascope more as an accompaniment than a focal point of the experience.
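The Iamascope's zone-based motion mapping might be approximated as follows. This is a hypothetical sketch: the ten-zone split and the idea of a triggering threshold follow the description above, but the frame sizes and threshold value are invented, and the actual image-processing pipeline is more involved.

```python
# Hypothetical sketch: sum per-zone intensity differences between
# consecutive video frames; when a zone's motion energy crosses a
# threshold, that zone's "string" is plucked (a note from the
# current computer-selected chord).
import numpy as np

def zone_triggers(prev, curr, zones=10, threshold=12.0):
    """Split the frame into vertical zones and report which zones
    show enough motion (mean absolute intensity change) to trigger."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    bands = np.array_split(diff, zones, axis=1)  # one band per zone
    energy = [band.mean() for band in bands]
    return [i for i, e in enumerate(energy) if e > threshold]

prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[:, 0:16] = 200              # simulated motion in the leftmost zone
print(zone_triggers(prev, curr)) # -> [0]
```

Frame differencing of this kind needs no body tracking, which is one reason the mapping is robust for multiple simultaneous players.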

Because the music responds virtually to each player's gestures, it is sometimes difficult for them to realize that they are actually making the music. Occasionally, players think that prerecorded music is started and stopped by their movements, but that actual control of the note pitches is not possible. Because the aesthetics of control through the use of sound is not that well supported, the musical output does not appear to promote the collaborative aspect of the experience. In contrast, the visual imagery provides a very strong control aesthetic and flow develops quickly; essentially, the Iamascope is a video-based "fun mirror" with some interesting musical accompaniment. Nevertheless, the elements combine for all the participants to quickly learn to play together, promoting a dynamic, collective encounter.

The interaction space in front of the projection screen is large enough for two to four people. When used collaboratively, players tend equally to either stand side-by-side or one in front of the other. The kaleidoscopic array shows the images created by all the players, resulting in everyone's attention being directed to the projection screen. People generally experience the image in two modes; either they see the whole image as a Gestalt image due to the symmetry, or they focus in on the small piece of the kaleidoscope that corresponds to the image taken by the video camera and ignore the rest of the reflected whole. Players can easily switch attention to see what the other players are doing, while at the same time perceiving the overall effect. This capacity strongly supports a collaborative experience for everyone, although the focus tends to be more visual than aural. Complete strangers have been observed dancing together in the Iamascope.

While the Iamascope is intended for novices, video-based interfaces have seen use in both novice- and expert-oriented systems. For experts, the work of Palindrome Inter-media Performance Group6 uses a video-based system to track performers to control sound. Likewise, Very Nervous System (Rokeby, 2002) also maps video imagery to sound, though it is not usable as a collaborative interface since it depends upon a single person in the image to work properly.

4.6 Jamodrum (Blaine & Perkis, 2000) and Jamoworld (Blaine & Forlines, 2002)

A custom table known as the Jamodrum is a seven-foot diameter circular projection surface with embedded drum triggers that people gather around and play as a shared instrument (Fig. 7). Computer graphics are projected onto the Jamodrum's tabletop interface to conduct a digital drum circle with pulsing imagery and sound. Rhythmic patterns supported by flashing arrow indicators point to combinations of 3, 6, or 12 subsets of players around the table's perimeter to virtually conduct a call and response rhythmic experience that prompts players to take turns with others. Designing a "Simon Says" approach within a directed compositional form is intended to help players discover the alternating play and listen modes. This call and response turn-taking mode was also introduced as a way to provide enough structure for novices and still offer improvisational opportunities for expert players. Although a rhythmic pattern is "suggested," players are not restricted by a predetermined cycle of events within the directed compositional form. Particularly for first-time players, the Jamodrum affords opportunities for positive musical experiences with others that help guide the group toward rhythmic entrainment and synchronous play.

Fig. 6. Iamascope (Fels & Mase, 1999).

Fig. 7. Jam-O-Drum (Blaine & Perkis, 2000). Photo credit: Elaine Thompson.

6 Palindrome Inter-Media Performance Group: http://www.palindrome.de.
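The rotating call-and-response cueing described above can be sketched as a simple scheduling loop. This is hypothetical; the seat numbering and contiguous grouping are invented for illustration, and the actual Jamodrum software drives the arrow graphics rather than printing groups.

```python
# Hypothetical sketch of call-and-response cueing: twelve seats
# around the table are prompted in rotating subsets of 3, 6, or 12,
# so each group alternates between "play" and "listen" turns.
from itertools import cycle

def cue_groups(seats=12, group_size=3):
    """Yield successive groups of seat indices to receive the
    flashing-arrow 'play now' prompt, rotating around the table."""
    groups = [list(range(i, i + group_size))
              for i in range(0, seats, group_size)]
    return cycle(groups)  # endless rotation

cues = cue_groups(12, 3)
print(next(cues))  # -> [0, 1, 2]
print(next(cues))  # -> [3, 4, 5]
```

Cycling through disjoint subsets guarantees that at any moment most of the table is listening, which is what makes individual contributions audible.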

The potential for chaos arises mainly as a function of the number of people playing at one time. The more players are present, the more difficult it becomes to coordinate enthusiastic drumming with the identification of unique sounds while listening to others gathered around the instrument as well. A contributing factor to people's focus, or lack thereof, is the sound pollution caused by close proximity to other exhibits in a museum setting. In these instances, the resultant group behavior shows that not being able to hear the call and response patterns causes players to hit pads harder, and induces detachment in relation to the imagery. As a result, Jamodrum participants often spontaneously create and orchestrate their own drum circles, completely ignoring the intended interaction. Regardless of virtual and/or human mediation, with scalability up to twelve players, some of these jams are more rhythmically in sync than others. Observations of novice musical behavior on the Jamodrum led researchers to the following suggestions for promising future development in facilitating group interaction in a public environment (Blaine & Forlines, 2002):

• Introduce more game-like musical interaction;
• Create goal-oriented or directed activities that would encourage more communication and social interaction between players;
• Explore an orchestrated approach to the music and visuals in order to avoid chaotic interaction;
• Integrate controllers/input devices that might disassociate players' expectations regarding responsiveness;
• Make the active areas/input devices on the table more discrete;
• Design interactions with a direct relationship to players' actions.

To further explore these areas of research, a next-generation Jamodrum was developed. Providing players with both a MIDI drum pad and a turntable as inputs enables additional control of projections and sound via the movement of the disks, in addition to making the four player stations more distinct and conducive to team interplay. The immersive aspects of the musical gaming environment are accentuated by surround sound and 3D computer graphics projected onto the walls of the exhibit space, thereby creating a "Jamoworld."

In an attempt to design a coherent musical structure to support the "CircleMaze" experience, the selection of sampled loops and musical backing tracks during each level of game-play is determined by the collective positioning of the players' disks. Players may be unaware that they are in control of subtle changes in the collage of musical accompaniment, as the visual focus of the game itself has a significant impact on aural perception. Auxiliary samples triggered by hitting the drum pads provide players with more immediate audio feedback and a clearer identification of their musical contribution. Players are generally more attuned to their sonic contributions from the drum pads and not as aware of their impact on the musical accompaniment after the backing tracks are engaged. Essentially, the path to musical proficiency with this combination of continuous and discrete control methods is severely limited in order to create goal-oriented and directed musical game-like activities that enhance communication between the players. Ultimately, restricting the players' control over musical elements is enacted to achieve a more sociable environment.

4.7 MidiBall (Jacobson, Blaine, & Pacheco, 1993)

Not many interfaces are capable of supporting musical interaction between thousands of people in one physical space, but D'CuCKOO's MidiBall was an experiment in audience interaction on a massive scale (Fig. 1). Also built on the model of turn-taking, the 1.5 m diameter MidiBall bounced between audience members in venues with up to 20,000 people as a way of providing a shared collaborative interface. As people hit the MidiBall, their actions triggered sampled sounds and real-time computer graphics that became part of an interactive concert experience. By tracking radio frequencies and converting the signals into MIDI note on/note off data, this very simple action of bouncing a giant ball from person to person created an instant bond between audience members and the band playing live onstage. Clearly, the MidiBall was designed to be accessible to as many people as possible without the expectation of developing a highly expressive mode of expert play. The appeal for D'CuCKOO's audience was the novelty of the interface and the feeling of community realized through the affordance of spontaneous participation in a collective audiovisual experience.

4.8 Musical Trinkets (Paradiso, Hsiao, & Benbasat, 2001)

Musical Trinkets allows players to combine up to sixteen small toy-like objects to control music and sound (Fig. 8). Players hold or wear the trinkets and collectively maneuver them over a shared horizontal projection tabletop surface in order to mix their sounds. Each trinket’s identification, proximity, and orientation relative to the surface control various aspects of the underlying musical process. Some of the controls assigned to each device include which sound (timbre), amplitude, timing and notes are heard. The parameters of each device are measured through an embedded identification tag, using a receiver sensor on the table. This method of tag tracking is similar to that of a shoplifting security system. The magnetic tags load a reader coil that emits an FM chirp spanning one decade of frequency (40–400 kHz), which determines the unique properties of each tag. The wearability and portability of the trinkets enhance the appealing, toy-like aesthetic that naturally provides a playful and engaging collaborative experience. However, having many people, each playing different trinkets mapped to a variety of musical elements, can make the sound space chaotic and confusing. This is because players are sometimes unaware of which audio elements they control and unsure of what sounds other players are triggering, although practice can help. Using objects that visually suggest what sound is affected, or integrating turn-taking behaviors, may assist in creating a more complementary musical environment.
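The swept-frequency tag reading can be modeled in a few lines: the reader chirps across the decade of frequencies and identifies each tag by the resonance at which it loads the coil. The tag names, resonant frequencies, and matching tolerance below are invented for illustration; the real system measures continuous proximity and orientation as well.

```python
# Illustrative model of swept-frequency tag identification: during a
# 40-400 kHz chirp, each passive tag draws energy near its resonant
# frequency, so a peak in coil loading identifies the tag.
# TAG_RESONANCES and the tolerance are hypothetical values.

TAG_RESONANCES = {"star_ring": 55_000, "cube": 120_000, "fish": 310_000}

def identify_tags(responding_freqs, tolerance_hz=2_000):
    """Match resonance peaks observed during the sweep to known tags."""
    found = []
    for name, resonance in TAG_RESONANCES.items():
        if any(abs(f - resonance) <= tolerance_hz for f in responding_freqs):
            found.append(name)
    return found
```

Because every tag resonates at a distinct frequency, a single reader coil can distinguish many simultaneously present trinkets in one sweep.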

The interesting complexity of Musical Trinkets comes from being able to easily combine different sounds by simply adding a new device to the mix. Another playful aspect is that some of the trinkets are fashioned as rings that can be worn on players’ fingers. Thus, a player can wear up to five trinkets on each hand and play by twiddling and drumming his fingers. This is an excellent affordance, so people know what to do gesturally to make sound. However, configuring the musical mapping so that the sound is satisfying is a challenge, since the control of each object is highly correlated. Further, feedback from the trinkets depends on sound and proprioception, also reducing the range of expression possible. In keeping with our observations about collaborative interfaces, the inventors note, “After living with our Musical Trinkets environment for several months, it has become obvious to us this mapping stays at too basic a level. It hints at possibilities for virtuosic performance, but despite the variety of objects available, it often stays in too simple a sonic space” (Paradiso et al., 2001). The next phase of this research, called Musical Navigatrics, addresses more complex musical interaction using a similar interface (Pardue & Paradiso, 2002).

4.9 Rhythm Tree – Brain Opera (Paradiso, 1999)

The Rhythm Trees are percussive sculptures composed of three hundred translucent rubber drumpads. In a variation on the turn-taking behavior, players trigger vocal samples, percussive sounds and word fragments by hitting a pressure-sensitive piezo-electric strip implanted inside each drumpad. This results in a kaleidoscopic array of sound, light and imagery in response to each player’s drumming. Not surprisingly, with the ability to respond rhythmically to hundreds of players, the collective musical output of the Rhythm Trees is a bit difficult to interpret as well. Although groups of up to 15 musicians have had rewarding jams on the Rhythm Tree, exuberant real-time drumming with novices more often than not results in chaos. As evidenced with the Jam-O-Drum, drumming itself tends to encourage hard and fast playing as a means of distinguishing an individual’s sonic output. In the case of the Rhythm Tree, this is even more difficult because the polyurethane pads have protrusions that hurt the hands when hit too forcefully. While the urethane pads are robust enough to withstand abusive playing, over time this design will either induce softer gestures or cause players to move on. It is also important to bear in mind that the engaging aspect of this sonic sculpture is not musical performance for an audience. Instead, the Rhythm Trees are intended to provide opportunities for large-scale, improvisational interaction between participants. Although the experiences tend to be primarily cacophonous, people very much enjoy spontaneously beating the Rhythm Tree and engaging in the communal experience together.

4.10 SoundMapping (Mott & Sosnin, 1997)

SoundMapping is a mobile soundscape produced by people’s interactions with each other and the architecture of their daily environment, using four sonically retrofitted suitcases as described by Bean (1999) (Fig. 9). The inventors collaborated on this project using GPS, gyroscopes and custom-built odometers to sense the movement of participants. The intent was to evoke sound characteristics that directly correspond to the participants’ physical orientation by algorithmically mapping sound linked to a particular region (i.e., a water fountain evokes “watery” sounds). In an effort to strike a balance between tonal and musical freedom, the timbres, pitches and rhythms of each suitcase’s voice are programmed to depend upon the amount of interaction between participants. Participants invariably become performers by spontaneously “playing” their suitcases with other group members, generating complex musical compositions based on the sum of each individual’s performance. Overall sound is produced in response to people’s movement and location,

Fig. 8. Musical Trinkets (Paradiso et al., 2001).


without pre-composed sequences. This creates the illusion that the music evolves from the physical exploration of space, rather than from a passive reception of pre-programmed events. Ultimately, the greatest entertainment value of this experience is not derived from determining which sounds each player is responsible for, or even from the extreme physical interaction required in order to play together to create a collaborative musical composition. SoundMapping focuses entirely on exploring the sonic capabilities of the instrument through active movement, engaging with other players and interacting with an audience of curious onlookers, using musical suitcases as a medium for communication.
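The region-to-sound mapping described above can be sketched as a lookup from a sensed position to a sample family. The region table, coordinates, and sample names below are invented for illustration; the actual system combined GPS, gyroscope, and odometer data.

```python
# Hedged sketch of region-based sound assignment, in the spirit of
# "a water fountain evokes watery sounds". Regions and sample names
# are hypothetical; coordinates stand in for processed GPS readings.

REGIONS = [
    # (x_min, y_min, x_max, y_max, sample_family)
    (0, 0, 10, 10, "watery"),      # e.g. near the fountain
    (10, 0, 20, 10, "metallic"),   # e.g. along a steel footbridge
]

def sample_for_position(x, y, default="ambient"):
    """Return the sample family for the region containing (x, y)."""
    for x0, y0, x1, y1, sample in REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return sample
    return default
```

Walking from one region to another changes the palette, so the music appears to emerge from physical exploration rather than from a fixed score.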

4.11 Squeezables (Weinberg & Gan, 2001)

In an effort to create interdependent multiplayer musical interactions, the Squeezables take an “organic” approach to controlling sound by sensing multiple axes of synchronous and continuous gestures (Weinberg & Gan, 2001) (Fig. 10). As the name implies, Squeezables are malleable instruments designed to provide new ways to physically shape and manipulate musical sounds and textures by measuring the exertion of force and pressure on flexible objects. Initially intended for two-handed collaboration between multiple players, force-sensitive resistors embedded in foam balls are mapped compositionally to combine one lead melody with five complementary musical elements that change parameters, such as timbre and pitch, interdependently. In a later iteration of the Squeezables, a set of six retractable gel balls attached to a platform allows three players to apply filters and timbral changes to pre-composed music via squeezing, twisting and pulling motions. Pressure sensors implanted in the balls, in combination with variable resistors mounted under the tabletop, are used to detect player input. Collectively, the six balls provide up to twelve simultaneous channels of player input through their continuous pulling and squeezing gestures. This sensor data is converted into MIDI signals and subsequently routed through a Max/MSP application to apply

digital signal processing, such as frequency modulation, filters, resonance and low-frequency oscillators, to the interdependent algorithms assigned to each ball. Some other approaches to overall musical control for novices include rephrasing melodies, manipulating the timbre of solo instruments, fading voices in and out, and so forth. “. . . the instrument can offer expressive and intuitive musical experiences without requiring a long learning process, virtuosic performance skills, or an analytical knowledge of music theory” (Weinberg & Gan, 2001).
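The sensor-to-MIDI step in this pipeline amounts to scaling a continuous pressure reading into a 7-bit control-change message that a Max/MSP patch can route to a DSP parameter. This is an illustrative sketch of that general technique, not the Squeezables’ actual code; the controller number and the cutoff mapping are invented.

```python
# Sketch of converting a normalized squeeze pressure into a MIDI
# control change (status 0xB0 | channel), e.g. driving a filter cutoff
# in a downstream Max/MSP patch. The CC number (74) is arbitrary here.

def pressure_to_cc(pressure, channel=0, cc_number=74):
    """Scale pressure in [0.0, 1.0] to a 3-byte control-change message."""
    value = max(0, min(127, int(pressure * 127)))  # clamp to 7-bit range
    return bytes([0xB0 | channel, cc_number, value])
```

With six balls and two sensed gestures each, twelve such controller streams run simultaneously, which is why the interdependent mappings matter: the patch, not the raw data, decides how the streams combine.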

The emergence of tangible interfaces brings with it the capacity to “sculpt” physical computing technology. Coupling these interfaces with computer-generated music provides a particularly conducive environment for high levels of physical interaction between players. Logistically, however, the Squeezables cannot be made robust enough to withstand intense manipulation in a public space without a controlled environment, including supervision. To date, this has been undertaken only in performance situations. Because the Squeezables are not wireless, the cabling could easily be dismantled or disrupted during play and must be redesigned if used for public exhibition. In spite of these physical frailties, the strong interdependency between players and spatial instruments affords fun, tactile opportunities to explore and navigate through different aspects of music.

5. Summary

The development of collaborative musical interfaces and experiences is just beginning to emerge as a field of research, and this article is an attempt to begin a discourse regarding multiplayer systems. Since the authors are not aware of all systems under development, this body of work is not all-inclusive.7

Fig. 9. SoundMapping (Mott & Sosnin, 1997).

Fig. 10. Squeezables (Weinberg & Gan, 2001).

7 Due to space and time considerations, we have left out discussion of many excellent systems.


From the range of collaborative musical interfaces explored herein, it seems clear that the overriding similarity in developing for novices is that the overall experience takes precedence over the generation of music itself. Music and sound are still important aspects of the experience, but the ability to control individual notes, harmonies and melodies is not necessarily the most important factor to a non-musical person in determining whether or not an interface is engaging. The opportunities for social interaction, communication, and connection with other participants are of paramount importance to the players’ comfort with the interface. Ultimately, this will lead to a sense of community, even with strangers, in a public setting. While the affordances of the sensors and interface should be transparent to the players, understanding their individual impact on the system is critical. This can be achieved through the use of music, lights, images, sound effects, or a wide range of other possibilities. Essentially, anything that supports feedback of the players’ intentions will serve to reinforce the perception of a highly responsive system.

Providing novices with easily accessible music-making experiences is more important than having a complex interface with built-in, upward capability for virtuosic expression. The doctrine of Principles for Designing Computer Music Controllers aptly states human and artistic principles: “Instant music, subtlety later,” and “Make a piece, not an instrument or controller” (Cook, 2001). This ideology holds particularly true for collaborative musical instruments. The counter-argument proposed by Wessel and Wright (2002) is that a low entry fee should have no ceiling on virtuosity. They posit that “. . . many of the simple-to-use computer interfaces proposed for musical control seem, after even a brief period of use, to have a toy-like character and do not invite continued musical evolution” (Wessel & Wright, 2002). While this is fundamentally true for expert musicians, the main opposition to this viewpoint regarding novice interplay is that the demographic for most multiplayer instruments is non-musicians, and therefore the same principles do not necessarily apply. Naturally, expert musicians are more concerned with expressive capabilities and mastery of their instruments, but novices do not have any understanding of what this means. Furthermore, it is unlikely that first-time players have expectations of becoming expert players on any musical instrument, so why should that expectation be inherent in the design of collaborative musical instruments or experiences?

There are perhaps a number of reasons why we see so many collaborative interfaces catering to the novice player at this point in time. First, novel computer-based music controllers for expert play have been difficult to achieve, even though the potential for truly innovative instruments exists. Second, the use of the computer as a means of mapping gesture to sound provides the opportunity to interpose intermediate support in the music generation process. This is particularly helpful for first-time players, who are also the target demographic for most multi-person systems in public environments. Finally, affordable computing and the widespread availability of sensor technologies have been driving forces behind the development of highly interactive, collaborative environments that integrate sound and imagery.

Music is a powerful expressive medium. It can be used to express deep emotional affect in the hands of a master. Music can also be a mechanism for bonding new communities, including complete strangers who may not even speak the same language. Musical interface designers are only just beginning to understand and explore the affordances music brings to enhance this collaborative design space. We see a bright future for interfaces that focus on communication through sound. We anticipate that as our appreciation and awareness of musical interface design improves, a host of exciting instruments and experiences will be created for professionals and novices to collectively share in the power of making music.

Acknowledgements

We appreciate the many constructive comments from the reviewers that helped focus this article. The authors would also like to thank members of the Human Communication Technologies (HCT) laboratory at the University of British Columbia and Carnegie Mellon University’s Entertainment Technology Center for their support. We also express gratitude to Joe Paradiso and Sile O’Modhrain, the special editors, for providing encouragement and feedback during the writing process. Funding for this work comes from the ATR Media Integration & Communications (MIC) research laboratory in Japan and the Natural Sciences and Engineering Research Council (NSERC) in Canada.

References

Ask, E. (2001). Speaking orbs – Interactive multi-participant sound sculpture. Demonstration at the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Bean (1999). Sound Around Town. Electronic Musician Magazine, June, 134–140.

Bischoff, J., Gold, R., & Horton, J. (1978). Microcomputer network music. Computer Music Journal, 2, 24–29. (Reprinted in C. Roads and J. Strawn (Eds.), 1985. Foundations of Computer Music. Cambridge, Massachusetts: MIT Press.)

Blaine, T., & Perkis, T. (2000). The Jam-O-Drum interactive music system: A study in interaction design. DIS2000 Conference Proceedings, New York, NY, August, 165–173.

Blaine, T., & Forlines, C. (2002). JAM-O-WORLD: Evolution of the Jam-O-Drum into the Jam-O-Whirl gaming interface. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 17–22.


Chadabe, J. (2002). The limitations of mapping as a structural descriptive in electronic instruments. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, Keynote-2-i-v.

Cook, P. (2001). Principles for designing computer music controllers. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Costikyan, G. (2002). I have no words & I must design: Toward a critical vocabulary for games. In Proceedings of Computer Games and Digital Cultures, Tampere, Finland, June 6–8, 9–33.

Csikszentmihalyi, M. (1990). Flow: The Psychology of Optimal Experience. New York: Harper Perennial.

D’Arcangelo, G. (2001). Creating contexts of creativity: Musical composition with modular components. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Dudon, J., & Arfib, D. (2002). Photosonic disk performance. Concert performance at the 2002 International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26.

Fels, S., & Vogt, F. (2002). Tooka: Exploration of two-person instruments. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 116–121.

Fels, S., & Mase, K. (1999). Iamascope: A graphical musical instrument. Computers and Graphics, 2, 277–286.

Fels, S., Gadd, A., & Mulder, A. (2002). Mapping transparency through metaphor: Towards more expressive musical instruments. Organized Sound, (in press).

Globokar, V. (1973). Laboratorium: for 10 Instruments. Peters (sound recording).

Gresham-Lancaster, S. (1998). The aesthetics and history of the hub: The effects of changing technology on network computer music. Leonardo Music Journal, 8, 39–44.

Iwai, T. (1998). Composition on the table. Exhibition at Millennium Dome 2000, London, UK.

Iwai, T. (1994). Resonance of 4 [On-line]. Exhibition, 1994. Available: http://www.iamas.ac.jp/~iwai/artworks/resonance.html, accessed on Oct 18, 2002.

Jacobson, L., Blaine, T., & Pacheco, C. (1993). Time for technojuju. New Media Magazine, January, 18.

Kreimeier, B. (2000). Formal Abstract Design Tools [On-line]. Gamasutra website. Available: http://www.gamasutra.com/features/20000413/kreimeier_02.htm.

Machover, T. (1996). Brain opera. In Memesis: The Future of Evolution. Ars Electronica Editions, Linz, Austria.

McGrath, J.E., & Hollingshead, A.B. (1994). Groups interacting with technology. Thousand Oaks, CA: SAGE Publications.

Möller, C. (1997). Audio Grove [On-line]. Exhibition: Spiral Art Center, Tokyo, May. Available: http://users.design.ucla.edu/projects/arc/cm/cm/staticE/page8.html, accessed on Nov. 7, 2002.

Mott, I., & Sosnin, J. (1997). Sound Mapping: An assertion of place [On-line]. Interface’97. Available: http://www.reverberant.com/SM/paper.htm, accessed on Oct 18, 2002.

Norman, D. (1990). The design of everyday things. New York: Currency/Doubleday, 79–80.

Orio, N., Schnell, N., & Wanderley, M. (2001). Input devices for musical expression: Borrowing tools from HCI. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Paradiso, J., Hsiao, K., & Benbasat, A. (2001). Tangible music interfaces using passive magnetic tags. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Paradiso, J. (1997). Electronic music interfaces: New ways to play. IEEE Spectrum Magazine, 34, 18–30.

Paradiso, J. (1999). The Brain Opera technology: New instruments and gestural sensors for musical interaction and performance. Journal of New Music Research, 28, 130–149.

Pardue, L., & Paradiso, J. (2002). Musical Navigatrics: New musical interactions with passive magnetic tags. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 168–176.

Poupyrev, I., Berry, R., Billinghurst, M., Kato, H., Nakao, K., Baldwin, L., & Kurumisawa, J. (2001). Augmented reality interface for electronic music performance. In Proceedings of the 9th International Conference on Human-Computer Interaction (HCI International 2001), August, New Orleans, LA, USA, 805–808.

Robson, D. (2001). PLAY!: Sound toys for the non-musical. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Rokeby, D. (2002). Very Nervous System [On-line]. Installation, 1991. Available: http://www.interlog.com/~drokeby/vns.html, accessed on Nov. 8, 2002.

Stewart, J., Bederson, B., & Druin, A. (1999). Single display groupware: A model for co-present collaboration. In Proceedings of the Special Interest Group on Computer Human Interaction (SIGCHI’99), Pittsburgh, PA, USA, 286–293.

Stockhausen, K. (1964, 1965). Mikrophonie 1–2. Music of our Time Series, CBS Records (sound recording, LP).

Tanaka, A., & Knapp, R.B. (2002). Multimodal interaction in music using the electromyogram and relative position sensing. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 43–48.

Ulyate, R., & Bianciardi, D. (2001). The interactive dance club: Avoiding chaos in a multi-participant environment. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Verplank, B. (2001). A course on controllers. In Proceedings of the 1st Workshop on New Interfaces for Musical Expression (NIME01), ACM Special Interest Group on Computer-Human Interfaces, Seattle, USA, Apr. 1–2. [On-line]. Available: http://www.nime.org.

Vogt, F., McCaig, G., Ali, A., & Fels, S. (2002). Tongue “n” Groove. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 60–64.

Waisvicz, M., & Balde, F. (1976). CrackleStage, Crackle Tea-Party [On-line]. Available: http://www.xs4all.nl/~mwais/CrackleChain.htm and http://www.xs4all.nl/~mwais/Crackle%20family.htm, accessed on Nov. 8, 2002.

Weinberg, G., Aimi, R., & Jennings, K. (2002). The Beatbug network: A rhythmic system for interdependent group collaboration. In Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24–26, 107–111.

Weinberg, G., & Gan, S. (2001). The Squeezables: Toward an expressive and interdependent multiplayer musical instrument. Computer Music Journal, 25, 37–45.

Weinberg, G. (2002). The aesthetics, history, and future prospects of interdependent music networks. International Computer Music Conference (ICMC 2002), Göteborg, Sweden, September 16–21, 349–356.

Wessel, D., & Wright, M. (2002). Problems and prospects for intimate musical control of computers. Computer Music Journal, 26, 11–22.
