DAISY BELL
Research and Process for a Visualised Score for a song written by Harry Dacre in 1892
CONTENTS
PROJECT PROPOSAL
ABOUT DAISY BELL
PROJECT RESEARCH : COMPOSITION
PROJECT RESEARCH : SCORES
SCORE CREATION
MIDI GAME CONTROLLER
ANIMATION EXPERIMENTATION
RENDER EXPERIMENTATION
PROCESS
LAYERED SCORE
RENDER ITERATIONS
FINAL RENDERS
POSTER VARIATIONS
PROJECT PROPOSAL
FIELD OF STUDY
To look at music and its visual styling.
PROJECT FOCUS
How is music displayed, and how can it be visualised in alternative ways in a Graphic style?
CONTEXT
“Graphic notation is the representation of music through the use of visual symbols outside the realm of traditional music notation. Graphic notation evolved in the 1950s, and it is often used in combination with traditional music notation. Composers often rely on graphic notation in experimental music, where standard musical notation can be ineffective.” - Wikipedia
Graphic Notation presents almost abstracted information, but fills the gap where Modern Notation (which originated in European classical music and is now used by musicians of many different genres throughout the world) falls short.
RATIONALE
Music is incredibly important to me, and any kind of exploration into it is interesting and useful to me.
In Graphic terms, music is always being explored in visual ways, especially now with more generative design projects that use sound to create visuals. While at Transmediale, I saw several projects that mixed audio and video, including The Braun Tube Jazz Band and the SenSorSuit.
METHODS
- Research into alternate musical composition techniques
- To develop a higher understanding and system for producing 12-tone streams
- Recording and sequencing tests
- Exploration of visual interpretations of music - notation, generated images
ABOUT DAISY BELL
The reason I chose the song Daisy Bell is because of its history and its simplicity. It was written by English composer Harry Dacre in 1892. In 1961 the IBM 704, with help from physicist John Larry Kelly, Jr, became the first computer to synthesise speech, recreating a rendition of Dacre’s Daisy Bell. Arthur C. Clarke’s chance meeting with the IBM 704 led him to include Daisy Bell in his novel and screenplay 2001: A Space Odyssey, where it is sung by HAL 9000 as he is shut down by Dr. David Bowman, a scene that exaggerated the loneliness and hopelessness of the lyrics as HAL’s voice gets slower and lower.
From Wikipedia (http://en.wikipedia.org/wiki/Daisy_Bell):
“As David Ewen writes in American Popular Songs: When Dacre, an English popular composer, first came to the United States, he brought with him a bicycle, for which he was charged duty. His friend (songwriter William Jerome) remarked lightly: ‘It’s lucky you didn’t bring a bicycle built for two, otherwise you’d have to pay double duty.’ Dacre was so taken with the phrase ‘bicycle built for two’ that he decided to use it in a song. That song, Daisy Bell, first became successful in a London music hall, in a performance by Katie Lawrence. Tony Pastor was the first one to sing it in the United States. Its success in America began when Jennie Lindsay brought down the house with it at the Atlantic Gardens on the Bowery early in 1892.”
PROJECT RESEARCH: COMPOSITION
One of my first areas of research was alternative composition methods. I began looking at the relationship between maths and music, and whether formulae or rules could be used to dictate the structure, tone and speed of a piece of music.
SERIALISM
Serialism involves taking 12 notes and using “a series of values to manipulate different musical elements.” It evolved from Arnold Schoenberg’s Twelve-Tone Technique, which “orders the 12 notes of the chromatic scale, forming a row or series and providing a unifying basis for a composition’s melody, harmony, structural progressions, and variations.”
Looking into Serialism, I was attempting to create music controlled entirely by numbers and algorithms, which could in turn give me a basic notation describing the process needed to achieve the music (like a recipe) rather than a traditionally notated score.
A test Serial Sequence
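As an illustration of the kind of rule-driven system I was aiming for, a tone row and the classic serial transformations can be expressed in a few lines of code. This is a generic sketch of twelve-tone technique, not the exact process behind the test sequence above; the seed value and note spellings are arbitrary choices for repeatability.

```python
import random

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def tone_row(seed=None):
    """Return a random ordering of the 12 pitch classes (a tone row)."""
    row = list(range(12))
    random.Random(seed).shuffle(row)
    return row

def transpose(row, n):
    """Shift every note of the row up by n semitones (wrapping mod 12)."""
    return [(p + n) % 12 for p in row]

def inversion(row):
    """Mirror each interval around the row's first note."""
    first = row[0]
    return [(2 * first - p) % 12 for p in row]

def retrograde(row):
    """Play the row backwards."""
    return row[::-1]

row = tone_row(seed=1892)  # seed is an arbitrary choice (the year Daisy Bell was written)
print([PITCH_CLASSES[p] for p in row])
```

Because each transformation is just arithmetic on the row, an entire piece can in principle be written down as a starting row plus a list of operations, which is exactly the recipe-style notation described above.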
FIBONACCI NUMBERS
I also began looking at the Fibonacci numbers and their relevance to music, because of the sequence's seemingly random progression:
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987
It turns out that Fibonacci numbers and phi are prevalent in music, with relationships appearing almost everywhere.
“A scale is composed of 8 notes, of which the 5th and 3rd notes create the basic foundation of all chords, and are based on whole tone which is 2 steps from the root tone, that is the 1st note of the scale. Note too how the piano keyboard scale of C to C above of 13 keys has 8 white keys and 5 black keys, split into groups of 3 and 2. While some might “note” that there are only 12 “notes” in the scale, if you don’t have a root and octave, a start and an end, you have no means of calculating the gradations in between, so this 13th note as the octave is essential to computing the frequencies of the other notes. The word “octave” comes from the Latin
word for 8, referring to the eight whole tones of the complete musical scale, which in the key of C are C-D-E-F-G-A-B-C. In a scale, the dominant note is the 5th note of the major scale, which is also the 8th note of all 13 notes that comprise the octave. This provides an added instance of Fibonacci numbers in key musical relationships. Interestingly, 8/13 is .61538, which approximates phi. What’s more, the typical three chord song in the key of A is made up of A, its Fibonacci & phi partner E, and D, to which A bears the same relationship as E does to A. This is analogous to the “A is to B as B is to C” basis for the golden section, or in this case “D is to A as A is to E.”
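The numeric claims in the passage above are easy to verify with a few lines of code. A minimal sketch, using nothing beyond standard Python:

```python
def fib(n):
    """Return the first n Fibonacci numbers, starting 0, 1."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

seq = fib(18)
print(seq)               # 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
print(round(8 / 13, 5))  # 0.61538, the approximation quoted above
print(seq[16] / seq[17]) # ratios of consecutive terms converge toward 1/phi
```

Strictly speaking, the ratio of consecutive Fibonacci numbers converges to 1/phi (about 0.618), with the inverse ratio converging to phi itself (about 1.618).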
PROJECT RESEARCH: SCORES
When starting out, I began by researching a number of different musical scores. The most interesting pieces I found came from a period of avant-garde and experimental music in the 1950s-60s, where composers such as John Cage (the score below is a work for eight tracks of ¼-inch magnetic tape; the score is a pattern for the cutting and splicing of the sounds recorded on the tape) experimented not only with alternative ways of making music, but also with new ways to display it in a graphical format.
Another notational format I came across was Laban (right), which “is a way and language for interpreting, describing, visualizing and notating all ways of human movement. Created by Rudolf Laban, LMA draws on his theories of effort and shape to describe, interpret and document human movement. ... it is one of the most widely used systems of human movement analysis.”
An attempt at using a barcode to determine rhythm, structure and the notes used: the barcode digits determine the notes, and the width of each corresponding bar determines length.
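One possible version of that digit-to-note mapping can be sketched as code. The scale choice, octave rule and width-to-beats scale below are all hypothetical, invented for illustration rather than taken from the actual experiment:

```python
# Hypothetical mapping: each barcode digit 0-9 picks a note of the C major
# scale (digits past 6 spill into the next octave), and the printed width
# of each bar sets the note's duration in beats.
C_MAJOR = ["C", "D", "E", "F", "G", "A", "B"]

def barcode_to_notes(digits, widths):
    """Pair each barcode digit with a note name and each bar width with a duration."""
    notes = []
    for digit, width in zip(digits, widths):
        name = C_MAJOR[digit % 7]           # digit picks the scale degree
        octave = 4 + digit // 7             # digits 7-9 reach the next octave
        notes.append((f"{name}{octave}", width * 0.25))  # width units -> beats
    return notes

print(barcode_to_notes([5, 0, 1, 2, 3, 4, 9], [1, 2, 1, 1, 4, 2, 1]))
```

Any barcode then deterministically yields a short melody, which is the appeal of the idea: the score is already printed on everyday objects.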
SCORE CREATION
After researching different scores, I decided to attempt my own, without having any music to work with. I started by creating structures that I could use to replace normal notation elements, simplifying them to basic lines.
At this point I was mainly dealing with rhythms, with no markings to represent pitch.
1 note 2 notes
Held note 1/2 note
MIDI GAME CONTROLLER
At one point during my research I began looking at the possibility of using an Xbox 360 controller as an input device for creating and manipulating music.
Using a program called Rejoice I was able to map individual button presses to MIDI (Musical Instrument Digital Interface) signals, allowing button presses to trigger notes or control properties like volume and filters.
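The MIDI messages involved are simple enough to sketch directly: a note-on is three bytes (a status byte of 0x90 plus the channel, a note number from 0-127 where 60 is middle C, and a velocity), and a note-off mirrors it with status 0x80. The button names and note assignments below are illustrative placeholders, not the actual mapping built in Rejoice:

```python
# Illustrative button-to-note table (not the real Rejoice mapping):
BUTTON_NOTES = {"A": 60, "B": 62, "X": 64, "Y": 65}  # middle C, D, E, F

def note_on(button, velocity=100, channel=0):
    """Bytes of the MIDI note-on message a button press would trigger."""
    return bytes([0x90 | channel, BUTTON_NOTES[button], velocity])

def note_off(button, channel=0):
    """Bytes of the matching note-off message for the button release."""
    return bytes([0x80 | channel, BUTTON_NOTES[button], 0])

print(note_on("A").hex())   # the raw bytes sent when 'A' is pressed
```

Continuous controls like volume and filter cutoff work the same way, using control-change messages (status 0xB0 plus channel) instead of note-on/note-off.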
As I was doing this I started looking at ways to represent music composed with the controller so that it could be replicated. Button presses were fairly easy to represent with colour coding or icons, but any movement on the two joysticks became difficult because of the area they cover. I found that if Time was a dimension that ran from left to right (as in traditional notation), then I could only show the joystick moving along the Y axis, because any movement along the X axis would be going against Time. Below are some notational examples. On the right is my button layout and a cleaned-up version.
Above is the solution to the Time-Axis problem. By rendering out the information in a 3D space I was able to include the X and Y axes of the joystick and have them travel along a third Z axis.
This also allowed me to create a 3D map of movement, whereby the movement of the joystick would create a 3D spline. I would then go on to use that information to create a 3D structure by joining and filling in the movements, thus creating a unique visualisation.
ANIMATION EXPERIMENTATION
I also started looking into rendering out sound as motion forms and put together a brief test featuring a series of bars that react to sound, with a camera panning around them. The test was fairly easy to put together from tutorials online and the results were OK, but at this point I didn’t have any generated music to work with, so I decided not to pursue it any further.
Below are two screenshots from the test; on the right is a view from inside Cinema4D of the circular model.
RENDER EXPERIMENTATION
The rendering process started as an experiment to try and create shapes inside of Cinema4D to represent music.
I started by exploring and creating exploded and fractured forms that could represent the abstract nature of sound. It was these experiments that led me on to creating the final 3D structures.
The final experiments here did not have enough structure or contain enough ‘information’ to be interpreted.
1. The first experiment, created by extruding a simple spline, then exploding that extrusion.
2. The X, Y, Z axis example spline, which has been filled in and joined together. A cel renderer has been applied so that its lines are clearer.
3. An audio file containing music is rendered in 3D on the left, then exploded and abstracted on the right.
4. Exploded circular structure with outlined counterpart.
PROCESS
Once I decided on using part of Daisy Bell, I set about turning it into a 3D rendering, developing a process as I worked through several different versions.
Below are three screenshots from Cinema4D. The first shows the paths created on my first attempt, with only the single main melody rendered. The second is from my second attempt, where I added in the backing parts and laid them against each other. The third is the final model, with everything condensed.
1. Determine the section of the song (Daisy Bell - Chorus)
2. Locate a notation / MIDI score version of the song (found online fairly easily: notation and MIDI files)
3. Import the MIDI file into Sonar (or any other Digital Audio Workstation that allows MIDI editing) and recreate the score in Piano Roll format (as shown above)
4. Take a screencap of the complete piano roll of each part and import it into Illustrator
5. Clean up the boxes and remove any extraneous information, making sure that every section is within the same scale so everything matches up in Illustrator
6. Save, open in Photoshop, select all the boxes and turn them into paths
7. Go to File > Export > Paths to Illustrator
8. Import the ‘Export’ file into Cinema4D
9. Align all parts within a 3D space
10. Extrude, texture and render the shapes, using transparent textures
11. Create splines that join all the ‘notes’ to form structures around the notes
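The screencap-and-trace steps ultimately turn each MIDI note into a rectangle: horizontal position is the start beat, width is the duration, and height on the roll is the pitch. That mapping can be sketched directly; the scale factors are arbitrary, and the example phrase is my rough reading of the chorus opening, not extracted from the project files:

```python
def notes_to_rectangles(notes, beat_width=20.0, semitone_height=5.0):
    """Map (pitch, start_beat, duration_beats) notes to (x, y, w, h) boxes."""
    return [(start * beat_width,            # x: when the note begins
             pitch * semitone_height,       # y: how high the note sits
             duration * beat_width,         # w: how long it is held
             semitone_height)               # h: one semitone tall
            for pitch, start, duration in notes]

# Rough sketch of the chorus opening ("Dai-sy, Dai-sy"): a descending
# G major arpeggio of held notes, MIDI numbers 74, 71, 67, 62.
phrase = [(74, 0, 3), (71, 3, 3), (67, 6, 3), (62, 9, 3)]
print(notes_to_rectangles(phrase))
```

Once the notes are rectangles, the remaining steps in the pipeline are purely geometric: align the parts in depth, extrude, and join with splines.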
LAYERED SCORE
When converting the score into Piano Roll I decided to split up the octaves so that every octave had its own section. Each octave contains 12 notes, which lets me layer the notes on top of each other; in a 3D space they can instead be placed alongside each other so that the structure is kept at the same height, with notes sitting alongside each other being the same note in different octaves.
The boxes on the right show those octaves.
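That octave split is easy to express as a grouping step: divide each MIDI note number by 12 to find its octave, and give each octave's layer its own depth offset so equal pitch classes sit side by side at the same height. The depth value below is an arbitrary placeholder, not a measurement from the model:

```python
def split_by_octave(notes, layer_depth=15.0):
    """Group (pitch, start, duration) notes into per-octave layers,
    attaching a Z offset so each octave occupies its own depth."""
    layers = {}
    for pitch, start, duration in notes:
        octave = pitch // 12 - 1            # MIDI convention: note 60 -> C4
        z = octave * layer_depth            # each octave gets its own depth
        layers.setdefault(octave, []).append((pitch % 12, start, duration, z))
    return layers

notes = [(60, 0, 1), (72, 0, 1), (48, 1, 2)]   # C4, C5, C3
print(split_by_octave(notes))
```

Because the stored pitch is reduced modulo 12, the three Cs above all land at the same height within their layers, differing only in depth, which is exactly the side-by-side effect described.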
Labels: Melody line, Backing Chords, Bass; octaves 7, 6, 5, 4, 3.
RENDER ITERATIONS
This render shows the red ‘scroll bar’ midway through Daisy Bell. In the animated film I used it in sync with the music as a visual cue to show its progression. As the bar moves, the camera follows.
Each black rectangle represents a note: its length represents the note length, and its position relates to its pitch and the time / beat on which it is played.
FINAL RENDERS
top view
side view
angled view
POSTER VARIATIONS
I decided to create posters to display as much information as possible about the Daisy Bell renders. Here are the three variations I have developed so far; they follow the same basic layout but each uses a different large-scale image.
The lower half features a condensed Piano Roll of the score, with written information below (lyrics, history, layout information).
A landscape layout was chosen so that I could fit larger images of the render (because of its thin dimensions), as well as the piano roll to accompany them. I kept it minimal to fit the clean, sharp nature of the render, but also to give it an instructional feel.
THE BACK - NOT FINISHED YET