
Agents in Live Coding Improvisation

Ushini Attanayake

A report submitted for the course COMP4560 Advanced Computing Research Project

Supervised by: Ben Swift
The Australian National University

May 2019

© Ushini Attanayake 2019


Except where otherwise indicated, this report is my own original work.

Ushini Attanayake
31 May 2019


Acknowledgments

This thesis would not have been possible without the people who assisted me along the way. I wish to extend my thanks to the following people:

• To my supervisor Ben Swift, thank you (times three) for your mentorship, advice and encouragement. Most importantly, thank you for shaping the way I think about problems in the field of computer music and, more broadly, human-computer interaction (HCI).

• To Henry Gardner and Kieran Browne, who took the time to discuss my ideas in the early stages of the project.

• To Charles Martin, who helped me navigate through the computer music literature.

• To the Creativity/Code/Culture Research Group at the Australian National University, who offered me a platform to discuss my ideas and gave me valuable feedback.

• To the participants who volunteered their time to participate in this research.

• To Weifa Liang, who organised and conducted weekly meetings to teach the necessary research methods for this coursework.

• To my parents, who unconditionally support everything I choose to pursue.



Abstract

When a musician improvises, they increase the creativity in their music by being attentive to changes in their environment. With the hope of reaching new levels of creativity, the field of human-computer musical interaction introduces computational agents into the musician's environment to interact with the musician and stimulate their creative process. In doing so, we run the risk of drawing creative control away from the musician. We also introduce the possibility of the musician generating novel music under the constraints of the agent. I aim to investigate how best to distribute control over the audio output of a performance between the musician and the agent in order to maximise creative stimulation. To assist me, I have developed a computational tool which offers an agent to be used in the Extempore live coding environment. The agent has two versions: one offers a high level of control to the musician, the other offers a high level of control to the agent. Both versions of the agent were used in live coding performances and the amount of creative stimulation experienced by the performers was evaluated. Four out of six participants found the agent which gave them less control more engaging to interact with. I believe this was because the participants were required to think inventively under the constraints of the assertive agent, which led to more moments of creativity.


Contents

Acknowledgments

Abstract

1 Introduction
  1.1 Improvising with Agents
  1.2 Control and Creativity
  1.3 Thesis Overview

2 Background
  2.1 Contributing Factors to Creativity
  2.2 Human-Agent Musical Interaction
  2.3 Algorithmic Composition
  2.4 Pattern-Based Live Coding

3 Design and Implementation
  3.1 The Computational Tool
  3.2 Patterns
  3.3 The Agent
  3.4 Suggestive vs. Assertive

4 Experimental Methodology
  4.1 Overview
  4.2 Experiment Design
  4.3 Experiment Environment
  4.4 Questions

5 Results
  5.1 Overview
  5.2 Approach to Analysis
  5.3 Evaluation

6 Conclusion and Future Work
  6.1 Conclusion
  6.2 Future Work


Bibliography

Appendix
  Appendix A: Project Description
  Appendix B: Study Contract
  Appendix C: Software Description
  Appendix D: README


List of Figures

1.1 High Level System

3.1 System
3.2 Suggestive Agent: Interaction Timeline
3.3 Assertive Agent: Interactive Timeline
3.4 Pattern Example
3.5 3 State Markov Chain
3.6 Agent Modifying Pattern
3.7 Pattern Tree
3.8 Inversions
3.9 Thin Function
3.10 Repeat Section



Chapter 1

Introduction

A musician begins an improvisational set. They are free to manipulate their instrument to translate ideas as they are conceptualized. Now suppose an intelligent computational agent is introduced to this scenario, to encourage the musician to explore new ideas. The agent is able to manipulate the instrument just as the musician is able to. How much control should the musician have over the agent's output in order to optimally stimulate creativity? Is the musician able to generate more novel ideas when they are in control of the agent's output, or when they are highly constrained by the agent?

The inherent spontaneity and openness to interaction in improvisation makes it a more creative activity than its calculated counterpart, composition. External elements, such as other musicians or the audience, can affect the musical landscape in which the musician operates. For example, a pianist might hear a phrase in a trumpet player's solo and echo it on the piano. The way a musician chooses to react to changes in their environment can influence the level of inventiveness during their performance. Jordanous and Keller [Jordanous and Keller, 2012] found this to be the case when they investigated which elements of an improvisational performance make it an environment that facilitates creativity: the musician's ability to interact with their surroundings emerged as a primary attribute encouraging creativity.

1.1 Improvising with Agents

There are numerous examples in the computer-music literature where computational tools are used to drive changes in this musical landscape, and intelligent agents are hardly strangers in this corpus of work. Computational tools can be made accessible and can be far more efficient at computing musical possibilities than humans. This opens exciting prospects for the development of novel forms of music in co-creative human-computer systems.

An agent's level of musical intelligence is a contributing factor to its success at stimulating creativity. Similarly, the role of the agent in the performance, whether it occasionally suggests changes to the musician or provides an accompaniment, can also


influence creative stimulation. Both of these factors have been explored in depth in past literature. However, there is little work exploring how the distribution of control between the agent and the musician stimulates creativity. Even a highly sophisticated agent is of little use to an improvising musician if the musician is given too much or too little freedom. Striking the right balance of control between agent and musician over the musical output is extremely important, not only in the context of music, but in the broader context of interactive agents in the arts. As intelligent agents become more prevalent and sophisticated in the arts, it becomes increasingly important to find ways in which they can be used to enhance human creativity.

1.2 Control and Creativity

This project aims to investigate how the level of control a musician has over an agent's musical output affects the amount of creative stimulation the musician experiences during an improvisational set. The specific context of improvisation considered here is live coding: a practice where musicians write code to generate music in real-time. In order to do this, I have developed a computational tool to be used alongside the Extempore [Sorensen and Gardner, 2017] live coding environment. The tool provides two versions of the same agent with which the live coder can interact. The 'suggestive' version merely suggests changes to the coder in the form of high-level code structures called patterns, while the 'assertive' version directly executes changes to the audio output. My goal for this investigation is to compare how well each version of the agent stimulates creativity during an improvisational set. Both versions of the agent were used in improvisational sets by a group of participants who were later interviewed. These sessions were transcribed and a grounded-theory-based approach was applied to evaluate the results. I have made the following contributions to this study:

• Designed and implemented a Markov Chain which modifies Extempore patterns.

• Modified the Extempore VS Code extension to include the handling of TCP connections to the agent and implemented the functions which distinguish how a user perceives changes from the two different versions of the agent.

1.3 Thesis Overview

To situate this research in the wider context of human and agent co-creativity, I will begin by introducing some background in the areas of creativity, human-agent musical interaction, algorithmic composition and live coding. This will contextualise the design decisions I made as we go on to discuss the implementation details of the computational tool. The implementation chapter moves through increasing degrees of complexity, starting with a high-level overview of the tool and finishing with a comparison of the suggestive and assertive versions of the agent. In the final sections


of this thesis, I will outline the experimental methodology and present my findings. The report will conclude with proposed extensions to the project in a 'future work' section.


[Figure 1.1 diagram: within the Extempore environment, the live coder writes pattern code in the text editor, the agent reads and modifies that code, and evaluated patterns are sent to the Extempore compiler.]

Figure 1.1: High Level System


Chapter 2

Background

2.1 Contributing Factors to Creativity

It is evident, from an inspection of the literature around creativity in improvisation, that there is a lack of consensus on a set of factors which best contribute to creativity. On one side, Biasutti and Frezza [Biasutti and Frezza, 2009] propose anticipation, emotive communication, flow, feedback and use of repertoire, while on the other side Johnson-Laird [Johnson-Laird, 2002] claims improvisation should be 'Novel for the individual, Optionally novel for society, Nondeterministic, dependent on Criteria or constraints, and based on Existing elements' in order to be creative. Internal factors are proposed, like the cognitive functions identified by Boden [Boden, 2004], while Csikszentmihalyi [Csikszentmihalyi, 1998] proposes external factors like interactive environments. Jordanous and Keller [Jordanous and Keller, 2012] attempted an empirical comparison of these factors by finding the most common words which appear in this literature using NLP techniques and conducting a user survey. It was found that the ability to communicate and interact was one of three primary factors which stimulate creativity, the other two being skill and emotional engagement. Though a musician's skill can help them enter flow states more readily, their level of skill will not change much in the moment of an improvisation. Since interaction was identified as a contributing factor, a natural question which follows is: how much interaction is required to optimise creative stimulation? This is the broadest form of the question my research is trying to investigate. Since the objective of this investigation is to compare the degree to which the assertive and suggestive interactive agents stimulate creativity, it is important to develop a yardstick for creative stimulation. Kleinmintz et al. [Kleinmintz et al., 2014] utilised divergent thinking as a measure of creativity, specifically creative ideation. Divergent thinking is exhibited by a musician's ability to explore or generate multiple ideas, while convergent thinking involves focussing on the development of a single idea. This measure of creativity will be used in later chapters where the results are evaluated.


2.2 Human-Agent Musical Interaction

When the field of improvisational creativity whispered possibilities that external interaction stimulates creativity, the field of human-agent musical interaction shouted back with an uproar of interactive musical agents. As mentioned in the previous chapter, these interactions can be classified into roles an agent can play in a human-computer co-creative system. Lubart [Lubart, 2005] provides a comprehensive set of roles for the agent. These are:

1. 'The Nanny': who takes on mundane tasks such as presenting and saving information.

2. 'The Pen Pal': a facilitator of information flow between musicians in an ensemble or between the musician and the audience.

3. 'The Coach': who suggests techniques to stimulate creativity.

4. 'The Colleague': who is capable of being creative and generating music alongside the musician.

Thom's Band Out of Box (BoB) [Thom, 2003] is an example of an agent who plays the role of a colleague. BoB trades 4 bars of solos with the musician and has been designed to learn the style of the musician it is collaborating with. In this system, the musician has very little control over the agent's output. The only control they have over the agent is the audio output they generate, which the agent learns from. In contrast to the single role the agents in BoB embody, Lewis' Voyager [Lewis, 2000] encompasses several roles. It is a multi-agent "virtual improvising orchestra" which can analyse a musician's performance in real-time and plays alongside the musician, both responding to them and generating its own independent ideas. This is yet another example of a listening agent which gives the musician little control. Lewis stresses the importance of asking "where does the musician's creativity lie?" rather than "can a machine be creative?" in his paper. However, we know from Jordanous and Keller that interactions between agent and musician stimulate creativity. We also know that the agent is likely driving the interaction since the musician has little control. This makes it unclear whether a dominating agent, in terms of control in an interaction, is able to stimulate creativity more than an agent with less control in the interaction. Rowe's Cypher [Rowe, 1992] gives the human a little more control over the output than Voyager. Cypher has a listening component and a playing component. Users interact with Cypher by feeding streams of MIDI data into the listening component of the system. The listening component then classifies features it hears and sends this information to the playing component. The user is additionally able to configure how the playing component of the system reacts to the messages it receives from the listener. The musician's ability to constrain the way the player interprets messages from the listener gives the musician more control in the interaction than the agent.


Is it possible that this dynamic results in more creative stimulation than what was seen with Voyager? A study done by Haught [Haught, 2015] suggests that increasing constraints on a creative activity will lead to more creative stimulation. In his study, he had a group of participants construct a creative sentence about some entity. For some participants, a noun was used to describe the entity; for the other participants, a line drawing was given of that same entity. Haught found that those who were given line drawings constructed more creative sentences. Haught argues that the vague noun representation has the effect of softly constraining the participants' ideas while the detailed line drawing has the effect of highly constraining their ideas. He concludes that increasing the constraints on a creative task improves the creative output.

This idea of constraint relates closely to the idea of control. The greater the number of constraints put on an improvisational set, the less control the musician has over the piece. In the examples of agents discussed to this stage, how the musician's level of control over the agent's output affects creative stimulation is not investigated. Martin et al. [Martin et al., 2016] are among the first to look at how different levels of control affect creative stimulation in improvisation. They developed a system for ensemble improvisation where members of the ensemble generate music by interacting with iPad touch screens. The system also includes a networked button and an intelligent agent, both of which are capable of generating changes to the GUI, which in turn changes the mapping from the interactions to the audio output. The difference is that the changes from the networked button are triggered by the ensemble members while the agent triggers changes autonomously. They evaluate the system in 4 different configurations: without the button or the agent, with the button, with the agent, and with the agent and the button. It was found that the mixed configuration with both the button and the agent led to the best musical output and enjoyment by the users. The configuration with just the button was rated the second most enjoyable; however, the configuration with the agent led to longer sessions, which could imply higher engagement on the musicians' part. Martin et al. have created a system which leaves the majority of the music making to the musicians, with the external interference from the button and the agent merely driving changes in the user interface. However, it is difficult to conclude whether the results can be directly attributed to the different levels of control the button and the agent give the user. This is because it is not clear whether the same methods are used to generate the changes triggered by the button presses and the agent. This justifies the importance of conducting an investigation on the effects of control on creativity while keeping the method used for generating a change consistent across all control settings. Brown, Gifford and Volts [Brown et al., 2017] also approach the idea of developing a creatively stimulating partnership between human and computational agent in the context of duets. The software they developed, CIM, is an activity-based model for interaction. This model uses input from a human performer as a basis to form its own output in a duet. Though musicians found the act of interacting with CIM highly engaging, the only way they could control the output of


the agent is through their own musical output. This is an indirect form of control, and Brown et al. acknowledge that including more direct forms of control for the musician, such as user-controlled parameters for the agent's output, could make the partnership more stimulating.

2.3 Algorithmic Composition

For the agents discussed in this chapter, the methods they employ to generate music come from the field of algorithmic composition. The types of models in algorithmic composition can be broken down into 3 categories, though there are instances of hybrids between categories. The first is the rule-based model. The output generated by these models is deterministic and is defined using a formal grammar. An example of such a model is an automaton. Rule-based models are not appropriate for stimulating creativity since the changes generated by the model can eventually become predictable. Then there are stochastic models. These models are probabilistic and therefore non-deterministic. An example of a stochastic model is a Markov Chain. The probabilistic nature of the model makes it a good contender for improvisational performances. The downside with stochastic models is that it can be difficult to construct a probabilistic representation for the rules of music theory. How do you define the probability of a key change? This is where the third model comes in: artificial intelligence (AI). Most of the agents discussed in this chapter are AI models. These models are able to learn a representation of music theory, so they develop a probabilistic understanding of music. Ames [Ames, 1989] looks back through the history of Markov Chains to find properties of the model which have proven to be useful in applications. Ames identifies that though Markov Chains are an outdated and relatively unsophisticated model in algorithmic composition, their simplicity and therefore efficiency make them a desirable model for computer music applications in real-time composition. A notable example of Markov Chains being used in composition is a series of compositions by Iannis Xenakis [Xenakis, 1971] in 1959, where he used several Markov Chains to dictate successions of large-scale events in his composition. Though AI models are capable of generating state-of-the-art results in the musicality of their output, designing and training an AI model on an esoteric dataset of Extempore code would be outside the scope of this project. Therefore, I have opted for the simpler model of a Markov Chain, which allowed me to focus on comparing the effects of the suggestive and assertive control settings of the agent.

2.4 Pattern-Based Live Coding

The agent I have developed interprets and modifies Extempore code. Specifically, the agent interprets a code structure available in Extempore called a pattern. Patterns are efficient, high-level constructs which allow users to implement a looped component


into their music with one line of code. There are a couple of other live coding languages which offer compact high-level structures like patterns. One example is ixi lang [ixi software, 2015]. This language is built on top of SuperCollider [McCartney, 1996], allowing musicians to make use of samples and built-in synths in SuperCollider with the convenience of a simple, high-level code structure. Gibber [Roberts, 2012] is another example of a browser-based live coding environment which uses high-level abstraction, though not as concise as Extempore patterns. High-level support in live coding languages is very important since there is a trade-off between fine-grained control and efficient evaluation in any live coding performance. It is particularly useful for people who are new to live coding, since verbose parts of the language can pose a greater learning curve. The benefits of having an agent which interprets and modifies short, high-level functions are twofold. The simple structure of a pattern in Extempore makes the task of extracting features from the pattern much more straightforward than if the agent were to use a raw audio signal. It also makes live coding more accessible to those without a strong background in coding. Agents have been introduced to live coding environments in the past. Collins [Collins, 2010] introduces a machine-listening agent to the SuperCollider language. However, this work focuses on machine-listening, which can inhibit the exploration of the effects of control on creativity. Referring back to Brown et al.'s work, machine-listening too could limit creative stimulation due to the indirect form of control the musician will have over the output.


Chapter 3

Design and Implementation

The computational tool I developed is an agent which will be used alongside the Extempore live coding environment. The aim was to develop an agent which is able to interact with a live coding musician during a performance and have different degrees of invasiveness in its interactions. When introducing a computational agent into any environment, we must identify the way the agent will interact with its environment and which changes in the environment invoke a reaction from the agent. To do this, we open this section with a high-level description of the computational tool and how it behaves in the live coding environment. Then we will go on to discuss the design and implementation of the tool in greater detail. To facilitate the primary goal of this investigation, it is also important that the agent interacts with its environment with different levels of invasiveness. The more invasive an agent's actions are, the less control the live coder will have over it. Giving the agent the ability to make indirect and direct changes to the audio output allowed me to investigate how much creative stimulation was generated. The details of this are covered in the closing section of this chapter, where we compare the suggestive and assertive versions of the agent.

3.1 The Computational Tool

There are two high-level components to the computational tool I have developed. These are the agent and the text editor extension. The computational tool operates within the Extempore live coding environment, which is comprised of three components: the Extempore compiler, the live coder's text editor and the live coder. With the help of a diagram, I will describe the functions of each of these components at a high level and outline the way they interact with each other. We will first look at the Extempore environment. The live coder writes code in the form of Extempore patterns in their text editor. When the live coder wishes to execute a pattern and incorporate it into the audio output, the pattern is sent to the Extempore compiler via the Extempore text editor extension. The Extempore compiler then compiles the pattern and incorporates it into the audio output.


Figure 3.1: System

The live coder can also request changes to a pattern from the agent. They first configure the agent to the version they wish to interact with, assertive or suggestive. When the live coder requests a change to a pattern from the computational tool, the text editor extension component of the tool reads the pattern associated with the request and sends it to the agent component of the computational tool. The agent modifies this pattern and returns a modified version of the pattern to the text editor extension component of the computational tool. Depending on the version of the agent the computational tool is configured to, the text editor extension will either replace the old pattern with the new pattern in the live coder's editor, in the case of the suggestive agent, or send the pattern to the Extempore compiler to directly execute the change, in the case of the assertive agent.

Having covered the high-level components of the computational tool, I will now discuss in detail how the agent generates a modification to a pattern. The structure of patterns was instrumental in influencing the design of the agent component of the computational tool, so I will lead into the design and implementation details with a closer look at what a pattern is.


Figure 3.2: Suggestive Agent: Interaction Timeline

Figure 3.3: Assertive Agent: Interactive Timeline


3.2 Patterns

An Extempore pattern is comprised of several parts. We will examine its structure by dissecting an example of a pattern. The first section of a pattern specifies the name of the pattern, timing parameters, which instrument to use and the volume parameter. The second section of the pattern specifies a list of MIDI pitches. In the example below, the pitches represent a descending F triad 'C A F C'. Evaluating this pattern in Extempore will trigger a loop which plays through the entire list of MIDI pitches in each iteration. The first timing parameter in the pattern, 4, specifies that the entire list of pitches should be played over 4 beats; this means the pitches are played on beats 1, 2, 3 and 4 respectively.

Figure 3.4: Pattern Example

While the example above explicitly specifies which MIDI pitches to play, this can also be done implicitly by replacing the 'list' section of the pattern with any function which returns a list of MIDI pitches.

Reversed

(:> A 4 0 (play syn1 @1 60 dur) (reverse '(60 57 53 48)))

Random Rotate

(:> A 4 0 (play syn1 @1 80 dur) (rotate '(60 57 53 48) (random '(-3 -1))))

The first example with the reverse function is equivalent to the pattern (:> A 4 0 (play syn1 @1 60 dur) '(48 53 57 60)), and the second example will rotate the list by a random degree between -3 and -1 in each iteration of the loop. Adding nested lists within the main list of MIDI pitches changes the rhythm the notes are played in. In the example below, the entire list is played over 4 beats, but pitch 60 is played on beat 1 and 57 is played on beat 1.5, while pitches 53, 48 and 50 are played on beats 2, 3 and 4 respectively.

Nested Lists

(:> A 4 0 (play syn1 @1 60 dur) '((60 57) 53 48 50))
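To make this subdivision rule concrete, the following Python sketch (illustrative only; the helper name and list representation are my own, not part of Extempore or the agent) computes the onset beat and duration of each pitch in a possibly nested list played evenly over a given number of beats.

def onsets(elements, start=1.0, span=4.0):
    # Return (pitch, onset_beat, duration) tuples for a possibly nested
    # list of pitches played evenly over `span` beats, starting at `start`.
    step = span / len(elements)
    out = []
    for i, element in enumerate(elements):
        beat = start + i * step
        if isinstance(element, list):       # nested list: subdivide its slot
            out.extend(onsets(element, beat, step))
        else:                               # single pitch (or '_' rest)
            out.append((element, beat, step))
    return out

# The nested-list example above, '((60 57) 53 48 50), over 4 beats:
print(onsets([[60, 57], 53, 48, 50]))
# [(60, 1.0, 0.5), (57, 1.5, 0.5), (53, 2.0, 1.0), (48, 3.0, 1.0), (50, 4.0, 1.0)]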


Patterns also allow live coders to input chords using '#' followed by a list of notes to be played simultaneously. Here the pitches 60, 57 and 53 will be played simultaneously on the first beat, while pitches 48, 50 and 49 will be played on beats 2, 3 and 4 respectively. If the live coder wishes to add rests to the list, they can do so by adding the '_' symbol.

Chords

(:> A 4 0 (play syn1 @1 60 dur) '(#(60 57 53) 48 50 49))

Rests

(:> A 4 0 (play syn1 @1 60 dur) '(60 _ 48 50))

Nested lists, rests and chords are only a subset of the features available in patterns. These are the features which the agent is capable of interpreting within a pattern, and I chose to include this specific subset because the ability for the musician to change rhythm and harmony is essential in any musical performance.

The well-defined, compact structure of patterns makes it easy to parse and extract the relevant information an agent will need to generate a modification to the pattern. Moreover, we can generate interesting alterations to the sound of a pattern by applying functions to the pattern which are localized around the list section. This means the majority of the pattern can remain unchanged. These are advantages patterns offer which make them the ideal code structure to build a model for the agent around.

3.3 The Agent

At its core, the agent is modeled by a Markov Chain. A Markov Chain is a stochastic model with a set of states, used in the early years of algorithmic composition [Ames, 1989]. The model can transition from one state to another and each transition has a probability associated with it. The transition probabilities depend on both the current state and the destination state. That is, the probability of transitioning from state A to state B is not necessarily the same as the probability of transitioning from state C to B. Given a current state, the Markov Chain selects its destination state by sampling from the probability distribution over all states.

This design works particularly well with the structure of patterns because various functions can be applied to the list in a pattern. Therefore the states in the agent's Markov Chain model are represented by various functions which can be applied to lists in Extempore. The agent's Markov Chain model is comprised of five states and their corresponding functions are:

• Reverse

• Random Rotate

• Thin

• Repeat Section


• Invert

The agent stores the original pattern sent to it from the live coder. A request for a change from the live coder causes the agent to transition to a new state and apply the function corresponding to that state to the original pattern. The probabilities in the transition matrix were chosen by me and modified until I felt they generated enough variation. The live coder is able to request a change either to the pattern in the agent's storage or to a brand new pattern. In the latter case, the pattern in storage will be updated to the new pattern.
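As a minimal sketch of this mechanism, the transition step can be written in a few lines of Python. The probabilities below are placeholders for illustration only; the actual matrix was hand-tuned as described above, and the state names are assumptions rather than identifiers from the project's source.

import random

STATES = ["reverse", "random_rotate", "thin", "repeat_section", "invert"]

# Placeholder transition matrix: one row of next-state probabilities per
# current state. The real values were chosen and adjusted by hand.
TRANSITIONS = {
    "reverse":        [0.05, 0.30, 0.25, 0.20, 0.20],
    "random_rotate":  [0.25, 0.05, 0.30, 0.20, 0.20],
    "thin":           [0.25, 0.25, 0.05, 0.25, 0.20],
    "repeat_section": [0.20, 0.25, 0.25, 0.05, 0.25],
    "invert":         [0.25, 0.25, 0.20, 0.25, 0.05],
}

def next_state(current):
    # Sample the next modification, weighting by the current state's row.
    return random.choices(STATES, weights=TRANSITIONS[current], k=1)[0]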

This is not an exhaustive list of functions which can be applied to patterns, but these functions were chosen because they represent common transformations which are applied to melodies in music. These functions can drive rhythmic and melodic changes and, as a set, they are capable of generating a reasonable amount of variation in the music. More complex operations, such as transposition, which require the agent to know more about the key the piece is in, were omitted from consideration to limit the scope of the project. However, we discuss some ways to implement such features in the 'Future Work' section of the final chapter.

Figure 3.5: 3 State Markov Chain

The agent is implemented in a Python script. The agent receives patterns from the text editor extension in the form of a string through a TCP connection. First, the agent conducts a few validity checks to verify the string is a valid pattern. This involves parsing the string for a 'play' substring, matching parentheses and an '@' symbol. If the user requested a change to the pattern in the agent's storage, the string will instead read 'change'. If the string is a valid pattern, the data is parsed to extract the 'list' component of the pattern. As stated earlier, we can apply arbitrary functions to the


'list' component of the pattern. Since it is infeasible to accommodate every function which could appear in this section of the pattern, including ones the live coder writes themselves, the agent is restricted to interpreting patterns where the list component uses either the 'list' keyword, as in (list 60 63 72), or a quoted list, as in '(60 63 72). At this stage we also check whether any of the five functions in the Markov Chain appear in the list component of the pattern. These are also acceptable forms for a pattern, and the initial state of the agent can be assigned accordingly. If the list does not contain any functions in the Markov Chain, the initial state is assigned to one of the 5 states at random.
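A simplified sketch of these checks is given below. The function names are hypothetical and the real script may structure this step differently; the intent is only to show the shape of the validation and dispatch described above.

def parentheses_balanced(s):
    # Every '(' must have a matching ')'.
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

def looks_like_pattern(msg):
    # Rough validity check: a 'play' call, an '@' symbol, balanced parentheses.
    return "play" in msg and "@" in msg and parentheses_balanced(msg)

def handle_message(msg, stored_pattern):
    # 'change' re-modifies the stored pattern; a valid new pattern replaces it.
    if msg.strip() == "change":
        return stored_pattern
    if looks_like_pattern(msg):
        return msg
    return None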


Figure 3.6: Agent Modifying Pattern


For the reverse and random rotate functions, the pattern in the agent's storage is transformed by simply adding calls to the reverse, rotate and random Extempore functions within the pattern. Below are examples of a pattern being modified to include a reverse and a random rotate.

Original
(:> A 4 0 (play syn1 @1 60 dur) '(60 57 53 48))

Reversed
(:> A 4 0 (play syn1 @1 60 dur) (reverse '(60 57 53 48)))

Random Rotate
(:> A 4 0 (play syn1 @1 80 dur) (rotate '(60 57 53 48) (random '(-3 -1))))
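Because these two modifications only wrap the list section, they can be applied as simple string substitutions. The sketch below assumes the list section has already been extracted as a substring of the pattern; the helper name is hypothetical and the project's script may do this differently.

def wrap_list_section(pattern, list_section, state):
    # Wrap the quoted pitch list in an Extempore call, e.g.
    # '(60 57 53 48) -> (reverse '(60 57 53 48)).
    if state == "reverse":
        wrapped = "(reverse " + list_section + ")"
    elif state == "random_rotate":
        wrapped = "(rotate " + list_section + " (random '(-3 -1)))"
    else:
        wrapped = list_section
    return pattern.replace(list_section, wrapped, 1)

# wrap_list_section("(:> A 4 0 (play syn1 @1 60 dur) '(60 57 53 48))",
#                   "'(60 57 53 48)", "reverse")
# -> "(:> A 4 0 (play syn1 @1 60 dur) (reverse '(60 57 53 48)))"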

The remaining functions, thin, repeat section and invert, update elements in the list of MIDI pitches. Since the list of pitches can contain nested lists, chords and rests, I construct a tree representation of the list. This allows us to efficiently update elements of the structure while preserving any rhythmic implications of nested lists.

An n-ary tree representation is constructed for the pattern such that, for subtrees which represent nested lists, all nodes on the same level have the same beat values. To illustrate this, let's consider the pattern (:> A 4 0 (play syn1 @1 60 dur) '(#(c4 eb4) #(d4 bb4 d5) f4 (ab4 (c5 f5)))). The corresponding tree representation can be found in Figure 3.7. Here, the entire list of pitches is played over 4 beats: the first two chords, the pitch 'f4' and the last group of nested pitches each begin playing on beats 1, 2, 3 and 4 respectively. Moving down one level in the nested list's subtree, the pitch 'ab4' is played for a 1/2 beat beginning on beat 4, while the pitches 'c5' and 'f5' in the next level down are each played for a 1/4 of a beat, beginning on beats 4.5 and 4.75 respectively.

Figure 3.7: Pattern Tree


Each node in the tree has a node value and a list of children. The node values can be one of three types: an opening parenthesis '(' value indicates the children of that node are elements of a nested list, a '#' value indicates that the children of that node belong to a chord and should sound simultaneously, and a pitch value indicates a leaf node. Such a node will have an empty list of children. When a new pattern is sent to the agent, a tree is constructed by parsing the list component of the pattern and recursively creating a new level of the tree each time an opening parenthesis is encountered. Since a new opening parenthesis could be encountered at any stage of the traversal, the agent keeps track of the elements which preceded the opening parenthesis so that they can be added to the tree once the matching closing parentheses have been found and the recursive function rolls backwards.
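The sketch below shows one way to build such a tree in Python. It is illustrative only: the tokeniser and class names are my own, and the project's script parses recursively rather than with an explicit stack, but the resulting structure is the same.

import re

class Node:
    # value is '(' for a nested list, '#' for a chord,
    # or a pitch / '_' rest for a leaf (which has no children).
    def __init__(self, value):
        self.value = value
        self.children = []

def tokenize(list_src):
    # Split e.g. "(#(c4 eb4) f4 (ab4 (c5 f5)))" into tokens, keeping "#(" intact.
    return re.findall(r"#\(|\(|\)|[^\s()]+", list_src)

def build_tree(list_src):
    root = Node("(")
    stack = [root]
    for token in tokenize(list_src)[1:]:       # skip the outermost '('
        if token in ("(", "#("):
            node = Node("(" if token == "(" else "#")
            stack[-1].children.append(node)
            stack.append(node)                 # descend one level
        elif token == ")":
            if len(stack) > 1:
                stack.pop()                    # close the current level
        else:
            stack[-1].children.append(Node(token))
    return root

tree = build_tree("(#(c4 eb4) #(d4 bb4 d5) f4 (ab4 (c5 f5)))")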

If the live coder requests a change to the existing pattern in the agent's storage, the tree is not reconstructed or modified. If, however, the invert function is applied to the existing pattern, the pitches in the tree will be updated accordingly.

Figure 3.8: Inversions

For the implementation of the thin function, we select a random subset of the pitches to replace with rests in the tree. This is done recursively by choosing a random set of nodes at each level of the tree and calling a recursive 'add_rests' function on each of the selected nodes. The only cases where a node is replaced with a rest are if the node signifies the beginning of a chord or if it signifies a pitch. For example, in the pattern tree above, suppose that in the first recursive step the function selects two of the root node's children to recurse on, and that the two indices are randomly chosen to be 0 and 3. Since the 0th child of the root symbolises a chord, it is not sensible to add rests to individual elements of the chord, so we replace the subtree rooted at this node with a leaf node with node value '_'. The 3rd child of the root is a nested list, so we recursively call add_rests on this node. In the second recursive step, suppose the function randomly chooses to replace the 0th child with a rest. Since this node is a single pitch 'ab4', we will replace its node value with a rest value '_'.
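A sketch of this recursion, reusing the Node class from the tree sketch above (the names and the exact selection strategy are assumptions, not the project's code):

import random

def add_rests(node, per_level=1):
    # Randomly replace chords and single pitches with rests ('_');
    # recurse into nested lists instead of replacing them wholesale.
    if not node.children:
        return
    count = min(per_level, len(node.children))
    for i in random.sample(range(len(node.children)), count):
        child = node.children[i]
        if child.value == "#" or not child.children:   # chord or single pitch
            node.children[i] = Node("_")
        else:                                          # nested list
            add_rests(child, per_level)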


Figure 3.9: Thin Function


The final function which modifies the pitches in the list is the repeat section function. This function randomly selects two nodes from the root node's children list and flattens all the subtrees between these two nodes into a list. This list is then inserted into the pitch list right before the node with the smaller index of the two that were chosen. This is best shown with an example.

Figure 3.10: Repeat Section
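In code, the behaviour described above might look like the following sketch, again reusing the Node class from the earlier tree sketch. Treating the section boundaries as inclusive is an assumption based on the description, not something taken from the project's source.

import random

def flatten(node):
    # Collect the leaf values of a subtree in left-to-right order.
    if not node.children:
        return [node.value]
    leaves = []
    for child in node.children:
        leaves.extend(flatten(child))
    return leaves

def repeat_section(root):
    # Pick two of the root's children, flatten everything between them
    # (inclusive), and insert the copy just before the earlier of the two.
    i, j = sorted(random.sample(range(len(root.children)), 2))
    section = []
    for child in root.children[i:j + 1]:
        section.extend(flatten(child))
    root.children[i:i] = [Node(value) for value in section]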

Once the Markov Chain applies one of these functions to the list of pitches, the new list is stitched back together with the original parameter section of the pattern and sent to the text editor extension through the TCP connection.
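The reassembly step can be sketched as follows, where `params` is the unchanged head of the pattern (for example "(:> A 4 0 (play syn1 @1 60 dur)") and `conn` is the open TCP socket; both names are hypothetical.

def serialize(node):
    # Turn a pattern tree back into an Extempore list string.
    if not node.children:
        return str(node.value)
    inner = " ".join(serialize(child) for child in node.children)
    return ("#(" if node.value == "#" else "(") + inner + ")"

def send_modified_pattern(conn, params, tree):
    # Stitch the parameter section and the modified list back together,
    # then return the result to the text editor extension.
    new_pattern = params + " '" + serialize(tree) + ")"
    conn.sendall(new_pattern.encode("utf-8"))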

3.4 Suggestive vs. Assertive

Once the agent generates a pattern, it must be executed to hear the change in the audio output. The responsibility of executing the pattern can either be left to the computational tool or to the live coder. We thereby have two natural levels of control arising from this system which can be used to investigate the effect of control


on creative stimulation. These levels of control are presented to the live coder in the form of two versions of the agent: the suggestive version and the assertive version. The suggestive agent represents a human-agent interaction where the live coder holds the majority of control over the agent's output. The live coder can modify the pattern proposed by the agent; they can choose to evaluate the pattern or ignore it by deleting the change. The assertive agent lies on the other end of the spectrum of control between human and agent and gives more control to the agent. This version does not show the live coder what change the agent generated. The agent-generated pattern is sent directly to the Extempore compiler, at which point it is incorporated into the audio output. In the suggestive version of the agent, the text editor extension replaces the old pattern with the new pattern in the text editor, while with the assertive version of the agent the pattern is not replaced in the editor. The only way the live coder can revert the change is by changing the '>' symbol at the beginning of the pattern to a pipe symbol '|' in their editor. This stops the pattern from sounding in the audio output. Since the live coder can repeatedly request changes from the suggestive agent, they can move back through the history of the patterns generated by the agent with the ctrl + z command.

This is best seen in the form of a video demo. CLICK HERE to view.


Chapter 4

Experimental Methodology

4.1 Overview

In this section, we will outline the design of the experiments. Each experiment took the form of a music-making session followed by an interview. We will briefly discuss the environment the experiments were conducted in and the procedure for each session. In order to maintain the anonymity of the participants, we will refer to them as p1, p2, p3, p4, p5 and p6.

4.2 Experiment Design

Each participant took part in 3 experiments. In the first half of each experiment, the participants were asked to spend anywhere between 10 and 30 minutes making music with Extempore. The second half of the experiment involved an informal interview where participants were asked about their experience during the session. In their first experiment, each participant was introduced to the Extempore environment and patterns. This experiment involved them making music without an agent and was treated as a control, to get the participants up to a baseline competency and level of comfort making music with Extempore patterns. A small tutorial was constructed for this experiment. The tutorial introduced some basic Extempore functions which can be applied within a pattern, as well as how to evaluate a pattern from the text editor. The functions introduced in the tutorial included all the functions which the agent is capable of applying. It was important to introduce all the functions the agent was capable of outputting so that the participants' experience of the agent wasn't influenced by the fact that they were being introduced to new functions. The agent would, however, use the functions in ways which had not been explicitly introduced to the participant in the tutorial. The tutorial was very short, and most of the time was spent making music. For each session with the agent, a small instruction guide was presented outlining the limitations of the patterns which can be sent to the agent and how the participant can request changes to a pattern from the agent. The tutorial from the first session was


also presented in this session if the participants wanted to refer to it.

The experiments were conducted as a series of sessions where the participants

spent some time making music without the agent engaged, with the suggestive agent engaged and with the assertive agent engaged. Each session was up to 30 minutes long and was followed by a 15-minute informal discussion; however, the participants were encouraged to continue making music for at least 10 minutes. In the second session, participants p1, p4 and p6 were introduced to the assertive agent, while participants p2, p3 and p5 were introduced to the suggestive agent first. This was to prevent any bias arising from the participants having more experience with Extempore when they used the second version of the agent they encountered. During the interview component of the first session, I aimed to get an understanding of each participant's musical background and gain a sense of how they approached the music-making session. I also wanted to see whether participants had prior experience with live coding, live music or Extempore. In particular, I wanted to see how much experimentation there was in each participant's first session. This gave me a baseline against which I could compare the amount of experimentation in the sessions with the agents. Notes were taken during each session to track the level of activity throughout the session, and these notes were also used to ground some of the questions in the interview. These were subjective notes about what changes the participants made and how often they made them. The time between sessions was not fixed.

4.3 Experiment Environment

In order to create an environment which somewhat resembled a live performance, but which also wasn't too high-pressure for live coders with no previous experience, I informed the participants that I would be listening to the music they were making during the session. The interviews were conducted in a private room, and both myself and the participants were fed the audio output through headphones. The same instruments were made available in each session and the initial audio setup was the same for each session. The scope of functions which were applicable to Extempore patterns was constrained and the same functions were made available in all sessions. This was the case in all sessions except one: in the first session for participant 1, they were not introduced to the randomization and reverse/rotate functions in Extempore, so the results collected from this participant will be considered keeping this in mind. The sessions were conducted on an Area 51 Alienware machine in a Windows 10 environment. The editor used was VS Code 2017.

4.4 Questions

Below is a set of questions which served as a guide to the interviews. The questions served as anchor points for what was otherwise a free-form discussion. It was important


to keep the discussion free-form in order to stay open to emergent concepts I may not have hypothesised.

• Would you use it in a live performance?

• Did you have a goal when you started? How did it change as you went?

• How much experimentation was there with this agent compared to the last one?

• Were there any pain points?

• Why did you decide to stop at the time that you did? Did you want to stop sooner?

• Did the agent’s operations impede your performance in any way (either agent)?

• Overall, did the agent’s operations enhance your performance (both agents)?

• Were there any moments where the agent's changes gave you new ideas to try? Did you implement these ideas? Why/Why not?

• Were there occasions where the agent's changes made you stop changing/writing code for a while? Why?

• Which agent control setting did you enjoy making music with the most? Why?

• What did you like and dislike about each version of the agent?

• In all interactions you had with the agent, you had control over when the agent made changes. Would you like to see a version of this software where you do not have control over when the agent makes changes?


Chapter 5

Results

5.1 Overview

In this section we will analyse the transcripts from the experiments. These transcripts include each participant's answers from the interview and notes collected from their activity during the music-making session. This analysis will be conducted using a grounded-theory-based approach. We will begin by describing the methods used to analyse our data. Since the purpose of this study is to investigate which control setting is the most conducive to creative stimulation, the first part of our discussion will focus on the control settings' effects on the participants' creative process. Finally, we will address any comments the participants made about the underlying functionality which was common between both versions of the agent.

5.2 Approach to Analysis

As mentioned above, the approach used for analysis is based on grounded theory [Strauss and Corbin, 1994]. In grounded theory, transcripts of interviews are analysed during the data collection process, so theories can be formed over time and the interview process can be tailored to investigate the emerging theories. There are four different stages of information capture in grounded theory. First, the transcripts are analysed for codes. This is where the key ideas from the raw transcript are extracted, usually line by line. After this stage, the codes from each transcript are compared and the analyst groups codes based on commonalities between them. These groups are referred to as concepts. In the third stage, memos are compiled for each of the identified concepts. Memos serve as a log of the analyst's developing theory. It is at this stage that the analyst introduces their judgement and begins developing theories. In the final stage, the analyst conducts theoretical sampling. This phase involves identifying areas which need further investigation in the context of the newly formed theories from the previous phase. The interviews are then modified to accommodate further investigation in these areas. These processes are repeated until the interviews become saturated, meaning no new information is revealed from the interviews. At this stage, the theory is put forward as a set of related concepts. The theory is considered to be dependent on context and never finalised.

The approach taken in this report is similar to grounded theory in that the transcripts were analysed for codes and from these codes memos were compiled. However, this process was not repeated over several iterations during the interview process. Due to time constraints, only one round of interviews was conducted for each participant. The analysis was conducted after four participants had completed all the sessions, while the other two participants were midway through their set of sessions. However, the interview questions and the structure of the music-making session were not altered as a result of the analysis of the first four participants' data.

5.3 Evaluation

Since there is a possibility that the participants' approach to the music-making session without an agent could affect their experience with either version of the agent, we will begin by looking at the concepts collected from the 'No Agent' experiments first. There were also some codes which were not common between participants, but we will discuss them here because, in some cases, they prove to be insightful.

No Agent Session

In the 'No Agent' session, when asked about their approach, all participants stated they had a goal or a plan for the session when they began. These ranged from "focussing the piece around a minor jazz chord progression", to giving the session "a clear beginning, middle, end structure", to wanting "to build from the drum and piano patterns from the examples in the tutorial". From the transcriptions made during the music-making part of the experiment, it was clear that the approaches taken by the participants fall into two categories. The first is a very structured approach, exhibited by p1, p3, p4 and p6, in which the participants seemed to treat the task like they would a composition. Participant p4 even stated he "treated it like a composition and wanted there to be a shape to the piece". A large number of the changes were carefully calculated by these participants before they were incorporated into the piece. This is indicated by the duration of time between instances where the code was changed. The participants who took this structured approach would often refer to the midi-pitch chart in between changes. Three of these participants, p1, p3 and p6, would also update a newly evaluated pattern less frequently over the course of the session. When asked what they would do when they felt the music was plateauing, participants p1, p3, p4 and p6 said they would refer to music theory to draw on ideas and would use the midi-pitch chart to assist them. This was not necessarily due to the participants' fluency in music theory or lack thereof; the participants who used a structured approach in the first session had varying levels of understanding of music theory. Having said this, a lot of these participants did some experimentation at the very beginning of the session, but after this they focussed on one idea and experimented less.

Participants p2 and p5 were heavily experimental in their approach. Though both participants were still operating within a predetermined goal or structure, they made changes often and rarely looked at the midi-pitch chart when they made changes, though that is not to say their changes weren't driven by music theory. What distinguished these participants from the others is that the time elapsed between subsequent changes to the code was shorter, which could imply that they did not spend time fully developing an idea before implementing it. They used a lot of the techniques introduced in the tutorial. Both of these participants had been musicians for several years. One of them was the only participant to have some experience with the Extempore environment, though they were only familiar with the basics. The other had experience doing live visuals alongside a live coding musician and was therefore familiar with how such a session would generally be approached.

Agent A (Suggestive) Session

We will now look at what effects Agent A, the suggestive agent, had on the participants. Two of the six participants, p3 and p5, preferred making music with the suggestive agent. Both of these participants were introduced to the suggestive agent last. Both participants said they enjoyed this version more because they enjoyed being able to see the changes and modify them. Of the two participants who liked the suggestive agent, one stated that 70% of the time they would lean into the changes the agent suggested. One concept which occurred across multiple participants with this agent was that four participants would cycle through changes faster without evaluating the code in between requests. This could be because they were seeing functions they were familiar with, ones which had been introduced in the tutorial, so they could have been searching for new functions or waiting for the agent to make a more dramatic change. Four participants found that they could recognise specific moments in which the agent would suggest a change and, as a result of accepting that change, the participant went on to change other parts of the music. For example, one participant received a pattern which created a polyrhythm in his piece. The participant already had a pattern playing where the pitch list had four pitches and the list was played over a duration of four beats, and the agent brought in another pattern which had three pitches playing over four beats. This led to an interesting rhythm where the pitches in the two lists would move in and out of phase in terms of the times they were played at. The participant used a similar technique later in the session. Though he is not sure, he thinks "the agent might have influenced [him]" to use polyrhythms in later parts of the piece.
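To make the rhythmic relationship concrete, here is a minimal Python sketch (separate from the agent's code; the pitch values and the `onsets` helper are purely illustrative) which computes the onset times of a four-pitch list and a three-pitch list when each is spread evenly over the same four-beat cycle. Only the downbeat coincides, which is what produces the in-and-out-of-phase effect the participant described.

```python
# Illustrative sketch only: onset times for two pitch lists that each span the
# same 4-beat cycle, showing the 3-against-4 polyrhythm described above.
CYCLE_BEATS = 4.0

def onsets(pitches, cycle_beats=CYCLE_BEATS):
    """Spread a pitch list evenly over one cycle, returning (beat, pitch) pairs."""
    step = cycle_beats / len(pitches)
    return [(round(i * step, 3), p) for i, p in enumerate(pitches)]

four_list = [60, 62, 64, 67]    # placeholder pitches, one per beat
three_list = [48, 52, 55]       # placeholder pitches, three over four beats

print(onsets(four_list))    # [(0.0, 60), (1.0, 62), (2.0, 64), (3.0, 67)]
print(onsets(three_list))   # [(0.0, 48), (1.333, 52), (2.667, 55)]
```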

One participant who didn't prefer the suggestive agent felt that there wasn't a moment in the session where a change generated by the agent caused him to change other parts of the song or gave him new ideas. He attributes this to the fact that the changes the agent made seemed quite confined to the original pattern he would send to the agent. He would have liked to see the agent make more dramatic changes. Two participants found that they didn't like the experience of using the suggestive agent because the text was being replaced directly in the editor. They felt this was a bit disorienting and felt they lost track of which components were introduced by the agent and which were their own, although one said that it could be because he was using it in an impulsive way and was requesting changes often. One of the participants who preferred the suggestive agent, and who had been introduced to the assertive agent first, said it would be useful to have the change be visible, possibly in a comment; but after using the suggestive agent, he felt that having the change appear in the line where the old pattern was made it more efficient to modify and evaluate changes. One of the participants who didn't like the suggestive agent encountered a pattern produced by the agent which performed a strict inversion of the list he was using. This meant that the list produced by the agent had a major tonality while the other components of the participant's piece were minor. The participant did not like that the agent was capable of making such a change and, as a result, was more reluctant to use the agent; however, the agent was used a few more times in his session after this. All participants were using the agent for idea generation. Some of the participants liked the fact that the agent didn't generate anything which was too different from what was already playing.
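The musical effect the participant objected to can be seen in a small Python sketch contrasting a strict (chromatic) inversion with a tonal inversion that stays in key. This is not the agent's code; the scale, axis and helper functions are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the agent's implementation). MIDI pitch numbers,
# with A natural minor as the assumed key. A strict inversion reflects semitone
# distances about an axis and can leave the key; a tonal inversion reflects
# scale degrees, so every resulting pitch stays in the scale.

A_MINOR = [57, 59, 60, 62, 64, 65, 67]   # A B C D E F G (one octave)

def strict_inversion(pitches, axis):
    """Chromatic reflection about `axis`, in semitones."""
    return [2 * axis - p for p in pitches]

def tonal_inversion(pitches, scale, axis_index=0):
    """Reflect scale-degree indices about `axis_index` (octave register ignored)."""
    result = []
    for p in pitches:
        degree = scale.index(p)                      # assumes p is in the scale
        result.append(scale[(2 * axis_index - degree) % len(scale)])
    return result

phrase = [57, 60, 64]                      # A C E: an A minor arpeggio
print(strict_inversion(phrase, axis=57))   # [57, 54, 50] -> A F# D, a major sonority
print(tonal_inversion(phrase, A_MINOR))    # [57, 65, 62] -> A F D, still in A minor
```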

Agent B (Assertive) Session

Four out of the six participants, p1, p2, p4 and p6, preferred using the assertive agent to the suggestive agent. One common concept which occurred across the majority of the participants is that they listened to the pattern for longer after the agent introduced a change and before they requested another change. This listening time was much larger when compared to the time spent listening before a new request or change was made with the suggestive agent. Possibly as a result, all participants found moments where the agent made a change which caused the participant to change the direction of their piece or caused them to make changes in other parts of the code. However, p4 said that it was rare that an agent's change would cause him to make significant changes in other parts of the code. This was the same participant who stated he didn't make any changes which were inspired or influenced by the agent's changes. He said the reason this happened with Agent B was, again, that the agent wasn't deviating too much from what was currently playing. One participant, p5, said that the agent generated changes which the participant didn't even think of doing. Here he was referring to a moment where the agent generated a very long list of pitches with repetition and it created a trill which oscillated between two pitches. There were occasions where long lists like this appeared when several participants were using the suggestive agent, but often they would dismiss the suggestion and request another one without evaluating it. It's worth noting that of the four participants who preferred the assertive agent, three of them, p1, p4 and p6, were introduced to Agent B last, while both participants who preferred Agent A were introduced to it last. Since it was expected that a bias would come from having more experience with live coding and patterns when the participants encountered their second agent, if a participant reacted positively to their experience with their second agent, they were asked whether the experience they gained from live coding in the previous session affected their enjoyment in the last session. Each participant stated that, though they were more comfortable live coding by their final session, that didn't greatly factor into their decision about which agent they preferred. Most participants were not uncomfortable being unable to see the change the agent made; they say this is because they had the option of stopping the pattern. One participant, p1, in particular had started his previous sessions with a goal or an idea of how to approach the session. He was first introduced to the suggestive agent. He stated that he had less of a goal when the session began with Agent B. It is evident when looking over the notes made during the session that there were three or four distinct moments where the agent would generate a change and the mood of the piece would change as a result. The participant said he "leaned into the change" suggested by the agent and changed other parts of the piece accordingly. The participant did not excessively request changes from the agent, but seemed to do so when the music was plateauing.

Underlying Agent: Comments on the suggestions made

In this section we will look at some common comments made by participants which were independent of the version of the agent they were using. We will address any weaknesses or implementation problems which were found to have straightforward solutions in this section. For those comments which require more complex solutions, we will address possible solutions in the next chapter: conclusion and future work. One general comment made by the majority of the participants is that the agent's changes were not very sophisticated. Most people recognised that there were only a few functions which the agent was capable of applying to the pattern. They also suggested that it would be better if the agent had some knowledge of the current key of the piece. This came from situations where the agent would make strict inversions rather than tonal inversions, which would preserve the key the pattern was in. Most participants found they liked the fact that most of the changes the agent was making weren't too different to the other components of the piece. However, there was one participant who would have liked to see more dramatic changes from the agent. They stated that they felt like the agent's changes were staying around the same key centre and would like it to explore a little more. There were tendencies for the agent to remove all pitches or replace them with rests. This became particularly problematic as the agent would continue to apply transformations to the list of single rests, leading to the eventual generation of very long lists containing only rests. This can be easily fixed by adjusting the probabilities of the Markov chain. Currently, the probabilities of this matrix are static. An extension to this could be to have dynamic probabilities: for example, if a rest was added, a thinning function must have been applied, and the thinning function could then update its probability array, reducing the probability of another thinning function being applied and increasing the probabilities of other functions (a sketch of this idea is given below).
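A minimal sketch of the dynamic-probability idea follows, assuming a single probability row keyed by transformation name; the transformation names and the halving-and-renormalising rule are illustrative assumptions, not the agent's actual Markov Chain Agent implementation.

```python
import random

# Illustrative sketch only: each entry maps a transformation to its selection
# probability; after a thinning transformation fires, its probability is damped
# and the row renormalised, so the agent is less likely to keep hollowing the
# pattern out into rests.

probabilities = {"thin": 0.25, "reverse": 0.25, "rotate": 0.25, "randomise": 0.25}

def choose_transformation(probs):
    """Pick a transformation name according to the current probabilities."""
    names = list(probs)
    return random.choices(names, weights=[probs[n] for n in names], k=1)[0]

def damp(probs, name, factor=0.5):
    """Scale `name`'s probability down and renormalise the row to sum to 1."""
    probs[name] *= factor
    total = sum(probs.values())
    for key in probs:
        probs[key] /= total

chosen = choose_transformation(probabilities)
if chosen == "thin":              # a rest was added, so discourage more thinning
    damp(probabilities, "thin")
print(chosen, probabilities)
```

In the agent itself, the same update could be applied to each row of the transition matrix, so that the damping is conditioned on the previously applied transformation.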

Another comment was made by two participants. Both participants were introduced to the suggestive agent in their second session. Both of them mentioned, after their first session with the assertive agent, that they would like to have the change the agent produced appear in text form. When asked how they would like the text to appear in the editor, one participant said it would be useful to have it appear as a comment under the current line, so the original pattern is unaltered. The other participant wasn't sure how best to have the modified pattern appear in the editor but thought that having it change the current line wouldn't be ideal. One participant also suggested that it would be helpful to have a quick way to revert to the original pattern; the current functionality only allows them to use ctrl+z to move backwards past all suggested changes. All these comments are well received and we will introduce a few potential solutions to them in the future work section.


Chapter 6

Conclusion and Future Work

6.1 Conclusion

The results indicate that the participants were more creatively stimulated while using the assertive agent, where the participants could not see the change the agent was making. This could be due to several reasons. Some participants were more comfortable interpreting the agent's modification when they heard the change as opposed to when they saw it. However, this is unlikely to be the exact reason, as participants also had the option to evaluate the change made by the suggestive agent in order to hear it. I attribute the participants' preference to the fact that, with the suggestive agent, the musician has control over when the pattern is evaluated, and seeing the agent's change can lead the participant to make a premature judgement of how the change will sound in the context of the entire piece. It could also be due to the fact that the participants have to make more of an effort to stop the pattern when using the assertive agent, so they are more restricted and therefore more likely to consider working with the agent's suggestion. However, the preference over the level of control is highly personal, and with such a small sample size we cannot definitively conclude that one version of the agent is better than the other. Having said that, this study has brought to light some important considerations when tailoring the level of control between agent and musician.

6.2 Future Work

There are several extensions I have identified for this project. If a similar investigation is to continue which explores the best way to distribute control between the agent and musician, it would be beneficial to develop richer functionality for the agent. Additionally, since the way the agent replaces the text in the editor may not be the most convenient, it may have added a negative bias against the suggestive agent. Participants may have found their experience with the suggestive agent less pleasant because of the auxiliary feature of having the agent's suggestion replace an existing pattern in their editor. As mentioned in the results section, the best way to have the suggestive agent present changes within the live coder's editor is an open question. This is a possible direction for an extension to this project. Striking a balance between a method which doesn't clutter the workspace of the musician while still making the pattern accessible and easily modifiable is difficult. We propose a solution which has the agent echo the change in a popup menu near the cursor, much like an IntelliSense popup. This would allow the live coder to quickly assess the agent's suggestion without having it replace anything in the coder's editor. They could also cycle through changes within this popup.


Bibliography

Ames, C., 1989. The Markov process as a compositional model: A survey and tutorial. Leonardo, 22, 2 (1989), 175–87. (cited on pages 8 and 15)

Biasutti, M. and Frezza, L., 2009. Dimensions of music improvisation. Creativity Research Journal, 21, 2 (2009), 232–242. (cited on page 5)

Boden, M., 2004. The creative mind: Myths and mechanisms. Music Perception, (2004). (cited on page 5)

Brown, A. R.; Gifford, T.; and Voltz, B., 2017. Stimulating creative partnerships in human-agent musical interaction. Comput. Entertain., 14, 2 (Jan. 2017), 5:1–5:17. doi:10.1145/2991146. (cited on page 7)

Collins, N., 2010. Contrary motion: An oppositional interactive music system. In Proceedings of the International Conference on New Interfaces for Musical Expression, 125–129. Sydney, Australia. http://www.nime.org/proceedings/2010/nime2010_125.pdf. (cited on page 9)

Csikszentmihalyi, M., 1998. Society, culture, and person: a systems view of creativity. In R. J. Sternberg, editor, The Nature of Creativity, (1998), 325–339. (cited on page 5)

Haught, C., 2015. The role of constraints in creative sentence production. Creativity Research Journal, 27 (2015), 160–166. doi:10.1080/10400419.2015.1030308. (cited on page 7)

ixi software, 2015. ixi lang: Live Coding Experimental music. www.ixi-software.net. (cited on page 9)

Johnson-Laird, P. N., 2002. How jazz musicians improvise. Music Perception, 19, 3 (2002), 415–442. (cited on page 5)

Jordanous, A. and Keller, B., 2012. What makes musical improvisation creative? Journal of Interdisciplinary Music Studies, 6, 2 (Jun. 2012), 151–175. doi:10.4407/jims.2014.02.003. (cited on pages 1 and 5)

Kleinmintz, O.; Goldstein, P.; Mayseless, N.; Abecasis, D.; and Shamay-Tsoory, S., 2014. Expertise in musical improvisation and creativity: the mediation of idea evaluation. PLoS One, 9, 7 (2014). doi:10.1371/journal.pone.0101568. (cited on page 5)


Lewis, G. E., 2000. Too many notes: Computers, complexity and culture in 'Voyager'. Leonardo Music Journal, 10 (2000). www.jstor.org/stable/1513376. (cited on page 6)

Lubart, T., 2005. How can computers be partners in the creative process: classification and commentary on the special issue. International Journal of Human-Computer Studies, 63, 4 (2005), 365–369. (cited on page 6)

Martin, C.; Gardner, H.; Swift, B.; and Martin, M., 2016. Intelligent agents and networked buttons improve free-improvised ensemble music-making on touchscreens. CHI Conference on Human Factors in Computing Systems, (May 2016), 2295–2306. doi:10.1145/2858036.2858269. (cited on page 7)

McCartney, J., 1996. SuperCollider: Live Coding Experimental music. https://supercollider.github.io/. (cited on page 9)

Roberts, C., 2012. Gibber: Live Coding Experimental music. https://gibber.cc/. (cited on page 9)

Rowe, R., 1992. Machine listening and composing with Cypher. Computer Music Journal, 16, 1 (1992), 43–63. doi:10.2307/3680494. (cited on page 6)

Sorensen, A. and Gardner, H., 2017. Systems level liveness with Extempore. In Proceedings of the 2017 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2017), 214–228. doi:10.1145/3133850.3133858. (cited on page 2)

Strauss, A. and Corbin, J., 1994. Grounded theory methodology: An overview. Handbook of qualitative research, 63 (1994), 273–285. (cited on page 29)

Thom, B., 2003. BoB: an interactive improvisational music companion. (2003). doi:10.1145/336595.337510. (cited on page 6)

Xenakis, I., 1971. Formalized Music: Thought and Mathematics in Composition. Indiana University Press. (cited on page 8)


Appendix


.1 Appendix A: Project Description

Project Description

Improvisational performance in live coding is a highly creative programming practice which involves exploring a large space of sonic possibilities and translating them to code. In an ensemble setting, communicating and exploring ideas becomes more complex.

This project looks to investigate the role a computational agent can play in an ensemble live coding performance. It will involve designing and implementing a program which either manages the performers' code to ease communication or analyses data from the performance in real-time to suggest ideas to the performer. These ideas could be related to musical possibilities or improvements to the performers' algorithm design.

The design of the agent will be reinforced by maintaining a constant dialogue with the musician throughout the project.

Learning Objectives

• Interview a group of Live Coding/Laptop musicians to understand how they approach improvisation.

• Use this understanding to implement a computational agent to aid improvisational live coding.

• Evaluate the agent's ability to stimulate creativity through its use in an ensemble performance.

Ushini Attanayake 21.07.2017



.2 Appendix B: Study Contract

.3 Appendix C: Software Description

Software Description

The Artefacts directory has the following structure:

• DIR: Extension

  o extension.ts

  o package.json

• File: agent.py

• File:

The agent.py file was written by me. This file has the following classes:

• Markov Chain Agent

• Pattern Parser

• Pattern Tree

The Extempore VS Code extension.ts and package.json files were modified by me. The following were added by me:

• Class: Agent

• Command: pattern_request

• Command: agentConnect

• Command: initiate_connection

• Command: validAddress

.4 Appendix D: README

# Agents in Improvisational Live Coding

The computational tool provided in this repository is to be used in the Extempore live coding environment.

The following are the requirements which must be installed before using this tool:

• Python

• pip

• Extempore

• VS Code Editor

Set Up:

1. Install the Python dependencies by running pipenv install from the command line within the 'Artefacts' directory

2. Install the text editor extension with code --install-extension vscode-extempore-0.0.9.vsix

Using the agent:

1. Activate the virtual environment with pipenv shell

2. Run the agent script with python agent.py. You will see the agent "listening for messages..."

3. Open your text editor and, once the Extempore setup code is compiled, enter the command 'Hello Agent' into the command palette. This will connect you to localhost on port 5005.

4. When prompted for the control setting, enter 1 to connect to the suggestive agent or 2 to connect to the assertive agent.

5. To request a change to a new pattern, move your cursor within the scope of the pattern and hit ctrl+m.

6. To request a change to the pattern which is currently in the agent's memory, hit ctrl+x.
