
Jediah Lankitus
University of Cincinnati

Introduction:

The purpose of 4D Music Box is to give users a more interactive way to control music than ever before. Users can control an instrument with each hand, play drums with their feet, and alternate between major and minor chords using muscle/brain signals.

Perhaps the greater purpose of this project is to establish a means of using the Kinect and Emotiv Epoc. This project forced me to figure out how to store and print the Kinect data, establish a proper distance from the Kinect, and find appropriate ranges of motion for each body part to assign parameter changes. I also learned how to interpret Kinect data and convert it to MIDI, allowing full control of any software instrument or parameter that accepts MIDI. In addition, I found a free program called MindYourOSCs, which converts all of the Emotiv Control Panel's data to OSC messages. This is extremely useful: OSC messages travel over the local network and are the go-to method for developers who design apps that control music from a mobile phone. Awareness of this program will be a huge resource for future students who have ideas for using the Emotiv.

Background: This project primarily used C++, which I am familiar with, but beyond that I used all new programs, software, and technologies. I learned how to apply skeleton tracking through an example given in the Kinect SDK. I thought it was fascinating how quickly and accurately the Kinect can determine each of your joint positions. I also learned how to access that data and save it to a text file, which is crucial so that it can be used elsewhere. Knowing the range of data each joint outputs, as well as how to access it, allows you to control anything with your body. The possibilities are truly endless once you have the x/y/z data printed to a text file.

I am thrilled that I learned how to trigger MIDI messages using C++. I wonder if MIDI could be useful in other DLAB applications, even though it is traditionally used in music. Thanks to free virtual MIDI ports, it can serve as a simple means of transferring data from one program to another, so long as both can send and receive MIDI. C++ has the appropriate libraries for this, as do many other languages, such as Python.

It was a real treat to find that MindYourOSCs is a free program available via the Emotiv Store. This program neatly packages all the data accessible in the Emotiv Control Panel into OSC messages. Pure Data has a library of objects called mrpeach which unpacks these OSC messages, allowing any parameter to be accessed. OSC messages have all sorts of mobile applications, as the messages are accessible over the network. Most languages have libraries that let a developer unpack OSC messages, which opens so many doors for the Discovery Lab. Perhaps an Arduino that moves via brain waves? The great thing about MindYourOSCs being connected to the Emotiv Control Panel is that the user can easily “train” the Emotiv to recognize a few thought patterns that are easy for that specific user to replicate, and these parameters can be used to control anything! Using the smile recognition to control a Pure Data parameter was quite simple, and since the head tracking on the Emotiv Epoc is excellent, even that feature alone would be worth integrating into future projects.

Materials and Methods

Installation / Setup

1. Install Visual Studio 2010 or 2013

a. Create a DreamSpark account. It exists to get expensive software to students for free, so take advantage of it.

b. Find Visual Studio 2010 and download it. I installed via the secure download manager, which you will be prompted to download; I recommend this route because it is what I tried and it worked.

c. You may also download Visual Studio 2013. You can download the Community edition, which is free and does not even require an account.

d. The 2013 Community installer is on the flash drive you will receive.

2. Install the Kinect 1.7 SDK.

a. There should be an included flash drive which has the installation file ready for you. If not, move to the next step.
b. Restart your computer after installing Visual Studio, and make sure you do not open Visual Studio until everything else is installed.

c. Make sure the Kinect sensor is not plugged into any of the USB ports on the computer.
d. If you have the Kinect for Windows v1.0, v1.5, or v1.6 SDK currently installed, close any open samples, the Sample Browser, etc. and skip to step 5. Kinect for Windows v1.7 will upgrade the previous version.

e. Remove any other drivers for the Kinect sensor.
f. If you have Microsoft Server Speech Platform 10.2 installed, uninstall the Microsoft Server Speech Platform Runtime and SDK components, including both the x86 and x64 versions, plus the Microsoft Server Speech Recognition Language - Kinect Language Pack.

g. Go to http://www.microsoft.com/en-us/download/details.aspx?id=36996.
h. Click Download, and open the file once it finishes.
i. Follow the installation steps. It is almost exclusively hitting Next and Finish; very simple.
j. Plug in the Kinect. Make sure it is plugged into a power source.
k. Follow the link provided at the end of installation to download the Kinect Developer Tools (1.7 as well).

3. Install the Kinect Developer Tools

a. Once again, these should also be on a flash drive you received with the project. If not, move on.

b. Click download after you follow the link at the end of the SDK installation.
c. Follow the installation steps.

4. Install Pure Data Extended

a. Locate the Pure Data install file on the included flash drive. If it is not there, move on to the next step.
b. Go to http://puredata.info/downloads/pd-extended
c. Install the latest release. This was tested with Pd-Extended 0.43.4.

5. Install the RtMidi library

a. These are located within both the MIDI_Practice2 project and SkeletonBasic project that you downloaded from OwnCloud.
b. Locate the project in Windows Explorer for each.
c. Open up Notepad or Word, and copy the path of both RtMidi.h and RtMidi.cpp into the document. You will need these later, for each project. Use the respective path when you come to that step later.

d. If they are not present in the OwnCloud project folder, or OwnCloud is down, move on.
e. Go to http://www.music.mcgill.ca/~gary/rtmidi/
f. Download the latest version. I tested this with version 2.1.
g. To access it, as it is a tar.gz, you need to install 7-Zip.
h. Go to http://www.7-zip.org/download.html and download the latest version.
i. Right-click the rtmidi-2.1.0.tar.gz file (the name may differ slightly) and open it with 7-Zip.
j. Start a new Visual Studio project, and copy the RtMidi.h and RtMidi.cpp files into your project folder.
k. Include RtMidi.h, with a full path to the file. For example, I used this line:
#include <C:\Users\jlank_000\Documents\Visual Studio 2010\Projects\MIDI_Practice2\RtMidi.h>
l. Your path will be different. There are ways around typing the full path, but this is the easiest and most consistent method for me.
m. You must also change the include line in RtMidi.cpp where it includes RtMidi.h; just add the full path of where you stored RtMidi.h.
n. You will need to do this again if you are using my MIDI_Practice2 project and SkeletonBasic project. (A minimal sending example follows this list.)
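To check that RtMidi is wired up, a minimal sketch along these lines should compile and send one note. It is not the project code, just the smallest thing that should work, and it assumes port 1 is a loopMIDI port on your machine (see step 7 for listing ports):

    #include <iostream>
    #include <vector>
    #include "RtMidi.h" // replace with your full path, as in step k

    int main() {
        try {
            RtMidiOut midiout;
            midiout.openPort(1); // assumed: port 1 is your loopMIDI port

            std::vector<unsigned char> message;
            message.push_back(144); // status byte: note on, channel 1
            message.push_back(60);  // data byte 1: note number (middle C)
            message.push_back(90);  // data byte 2: velocity

            midiout.sendMessage(&message); // fire the note
        } catch (RtMidiError &error) {
            error.printMessage();
            return 1;
        }
        return 0;
    }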

6. Check for mrpeach!

a. You will need a library of Pure Data objects called mrpeach, which should be included in the Pure Data Extended install.

b. Navigate to the pd folder. It should be in Program Files.
c. Navigate to the extra subfolder.
d. Look for a folder called mrpeach.
e. If it is there, you are good to go! If not, go to http://sourceforge.net/p/pure-data/svn/17203/tree/trunk/externals/mrpeach/
f. You can download a snapshot of this library from the link. Once downloaded, place it in the extra folder where you first checked for it.
g. You can also try a reinstall first. The folder was missing for me at first, and upon reinstallation it appeared.

7. Install loopMIDI

a. Once again, the install file should be in the OwnCloud folder.
b. Go here: http://www.tobias-erichsen.de/software/loopmidi.html
c. Click the download link, and follow the install instructions. I recommend creating a shortcut to it on the desktop.
d. Before you run the MusicBox, always open loopMIDI and make sure there is at least one port. You create ports with the + button. In my MusicBox code, there is a line where the virtual MIDI port is opened. It looks like this: midiout->openPort(2);

e. I use 2 because I have an additional port for a separate application, plus the Microsoft Wavetable synth. The Wavetable synth is standard on Windows machines, so I would guess the port you will use is 1:

f. midiout->openPort(1);
g. Included in the project folder is a simpler example program which prints the available ports. This is the optimal way to make sure your ports are working and that you are using the correct port. Once again, you will have to change the include statements for RtMidi.h and RtMidi.cpp to the full paths of those files for it to work. My full path is in the code, which will not work for you. (A sketch of such a port lister follows.)
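A port-listing program only needs a few lines; the following is a sketch along those lines, not the exact code shipped in the project folder, assuming only that the RtMidi.h include path is fixed as above:

    #include <iostream>
    #include "RtMidi.h" // full path as before

    int main() {
        try {
            RtMidiOut midiout;
            unsigned int nPorts = midiout.getPortCount();
            std::cout << nPorts << " MIDI output ports available:" << std::endl;
            for (unsigned int i = 0; i < nPorts; i++)
                std::cout << "  Port " << i << ": " << midiout.getPortName(i) << std::endl;
        } catch (RtMidiError &error) {
            error.printMessage();
            return 1;
        }
        return 0;
    }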

8. Install the Emotiv Control Panel + MindYourOSCs

a. Go to the Emotiv store: https://emotiv.com/store/app/
b. You will need to make an account. It is free and easy.
c. Download these two applications: Emotiv Control Panel and MindYourOSCs.

9. Run my Code!

a. My code is hosted on OwnCloud. Navigate to the 4dMusicBox folder, and download the project folder.
b. Start loopMIDI, and add at least one port with the + button.
c. Open the MIDI_Practice2 Visual Studio project.
d. In MIDI_Practice2.cpp, change the “include RtMidi.h” header to include the entire path to the RtMidi.h file.
e. Do this again in RtMidi.cpp.
f. Run the program.
g. Take note of how many ports are available, and which port number is the loopMIDI port you want to use.
h. Record the name and number of the port you wish to use.
i. Under the comment “Open Loop Midi Port” you will find this line: midiout->openPort(2);
j. Change the 2 to whatever number you recorded.
k. Open Pure Data, then click File -> Open and select Kinect.pd from wherever you downloaded it to.
l. Click Media -> standard.
m. Click Media -> MIDI settings.
n. Under input device 1, choose the port you recorded.
o. Hit Ctrl + / or click Media -> DSP On.
p. Run the Visual Studio program again.
q. Check the Pure Data patch. If the first number underneath notein has changed to 92, you are good to go. If not, something went wrong along the way; try to carefully follow the steps again.
r. Now, in Visual Studio, open the SkeletonBasic project.
s. In both Skeleton.cpp and RtMidi.cpp, change the “include RtMidi.h” line so that it includes the whole path of the RtMidi.h file. It is within the project folder you downloaded from OwnCloud, and you should have it recorded in Notepad or Word.
t. Hit Ctrl + F and search for the word “openPort”.
u. Change the argument from 2 to whatever number the desired port is.
v. Set up the Emotiv headset (see the Resource List at the bottom of this document). Open the Emotiv Control Panel, ensure a good signal, then open MindYourOSCs.
w. On the left side, make sure Emotiv Control Panel is selected from the drop-down menu, then hit connect.
x. On the right side, hit connect again, under the OSC section.
y. Build and run the Visual Studio project.
z. Stand 100 inches away, with the Kinect 29 inches off the ground (most DLAB tables are about this height).
aa. Move your left and right hands up and down, stomp your feet, and smile!!

10. For your own project: Install the skeleton tracking example

a. Open the Toolkit Browser. For me, it was located in Program Files/Microsoft SDKs/Kinect/Developer Toolkit v1.7/Tools/Toolkit Browser.
b. Find the skeleton tracking example, and click install. Choose an appropriate folder.
c. Open the project in Visual Studio, and feel free to play around.

Explanation of Creation

1. The base of this project was the Skeleton Tracking example provided with the Kinect Developer Tools.

2. The first step is to figure out how to access the skeleton tracking data, as well as how to store it.

3. I found that the data was being accessed in the DrawSkeleton function, since the data is needed in order to draw the skeleton.

4. The data is found in a variable called skel, which has a member array SkeletonPositions[] that takes a number corresponding to a specific joint (a short code sketch follows the list below).

5. A number between 0 and 19 must be passed into SkeletonPositions[]. Each number corresponds to the following joints:

0: Hip Center
1: Spine
2: Center Shoulder
3: Head
4: Left Shoulder
5: Left Elbow
6: Left Wrist
7: Left Hand
8: Right Shoulder
9: Right Elbow
10: Right Wrist
11: Right Hand
12: Left Hip
13: Left Knee
14: Left Ankle
15: Left Foot
16: Right Hip
17: Right Knee
18: Right Ankle
19: Right Foot
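As a sketch of what reading a joint looks like inside the Kinect SDK 1.7 sample (assuming the sample's usual headers such as NuiApi.h are already included; LogLeftHand is a made-up helper name, not a function in my code):

    #include <fstream>

    // skel is the NUI_SKELETON_DATA that DrawSkeleton already receives.
    void LogLeftHand(const NUI_SKELETON_DATA &skel) {
        Vector4 lefthand = skel.SkeletonPositions[7]; // index 7 = Left Hand

        // Append this frame's x/y/z to a text file so it can be used elsewhere.
        std::ofstream out("joints.txt", std::ios::app);
        out << lefthand.x << " " << lefthand.y << " " << lefthand.z << "\n";
    }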

6. I used:

a. skel.SkeletonPositions[7]; (Left Hand)

i. This is the left hand. It controls the bass note.
ii. I saved this data to a variable called lefthand.
iii. lefthand.x is the data used to determine which note of the C scale is played (a code sketch of this mapping follows the three tables below).
iv. lefthand.y is the data used to determine the volume of the bass.
v. lefthand.z controls a filter on the bass note.

Ranges:

Note | Condition | MIDI Message
C1 | lefthand.x >= -.8000 && lefthand.x <= -.7375 | 31
D1 | lefthand.x > -.7375 && lefthand.x <= -.6750 | 33
E1 | lefthand.x > -.6750 && lefthand.x <= -.6125 | 35
F1 | lefthand.x > -.6125 && lefthand.x <= -.5500 | 36
G1 | lefthand.x > -.5500 && lefthand.x <= -.4875 | 38
A1 | lefthand.x > -.4875 && lefthand.x <= -.4250 | 40
B1 | lefthand.x > -.4250 && lefthand.x <= -.3625 | 42
C2 | lefthand.x > -.3625 && lefthand.x <= -.3000 | 43

Volume | Condition | MIDI Message
No Volume (0/90) | lefthand.y < 0 | 0
Lowest (2/90) | lefthand.y > 0 && lefthand.y <= .1 | 2
Low (3/90) | lefthand.y > .1 && lefthand.y <= .2 | 3
Medium (4/90) | lefthand.y > .2 && lefthand.y <= .3 | 4
Medium (5/90) | lefthand.y > .3 && lefthand.y <= .4 | 5
Medium (6/90) | lefthand.y > .4 && lefthand.y <= .5 | 6
High (7/90) | lefthand.y > .5 && lefthand.y <= .6 | 7
Highest (8/90) | lefthand.y > .6 && lefthand.y <= .7 | 8

Filter Threshold | Condition | MIDI Message
285 | lefthand.z > 2.6 | 21
300 | lefthand.z <= 2.6 && lefthand.z > 2.5 | 22
475 | lefthand.z <= 2.5 && lefthand.z > 2.4 | 23
570 | lefthand.z <= 2.4 && lefthand.z > 2.3 | 24
665 | lefthand.z <= 2.3 && lefthand.z > 2.2 | 25
760 | lefthand.z <= 2.2 && lefthand.z > 2.1 | 26
885 | lefthand.z <= 2.1 && lefthand.z > 2.0 | 27
950 | lefthand.z <= 2.0 && lefthand.z > 1.9 | 28
1045 | lefthand.z <= 1.9 | 29
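As a sketch, the note table maps to a simple if/else chain; the thresholds are copied straight from the table, though the actual project code may be structured differently:

    // Returns the MIDI message for the bass note, or -1 when the hand
    // is outside the playable x range.
    int BassNoteFor(float x) {
        if (x >= -0.8000f && x <= -0.7375f) return 31; // C1
        if (x >  -0.7375f && x <= -0.6750f) return 33; // D1
        if (x >  -0.6750f && x <= -0.6125f) return 35; // E1
        if (x >  -0.6125f && x <= -0.5500f) return 36; // F1
        if (x >  -0.5500f && x <= -0.4875f) return 38; // G1
        if (x >  -0.4875f && x <= -0.4250f) return 40; // A1
        if (x >  -0.4250f && x <= -0.3625f) return 42; // B1
        if (x >  -0.3625f && x <= -0.3000f) return 43; // C2
        return -1; // hand outside the playable range
    }

The volume and filter tables work the same way, only with different thresholds and output messages.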

b. skel.SkeletonPositions[8]; (Right Hand)

i. This is the right hand. It controls the higher-pitched chord.
ii. I saved this data to a variable called righthand.
iii. righthand.x is the data used to determine which chord is played, starting with C4 and ending with C5.
iv. righthand.y is the data used to determine the volume of the chord.
v. By default, the chord will be minor. When the Epoc is in use and the user smiles, the chord will become major.

Ranges:

Note | Condition | MIDI Message
C1 | righthand.x > .1 && righthand.x <= .11625 | 60
D1 | righthand.x > .1325 && righthand.x <= .14875 | 62
E1 | righthand.x > .14875 && righthand.x <= .16125 | 64
F1 | righthand.x > .16125 && righthand.x <= .17375 | 65
G1 | righthand.x > .17375 && righthand.x <= .18625 | 67
A1 | righthand.x > .18625 && righthand.x <= .19875 | 69
B1 | righthand.x > .19875 && righthand.x <= .21125 | 71
C2 | righthand.x > .21125 && righthand.x <= .22375 | 72

Volume | Condition | MIDI Message
No Volume (0/90) | righthand.y < .35 | 90
Lowest (2/90) | righthand.y > .35 && righthand.y <= .37 | 91
Low (3/90) | righthand.y > .37 && righthand.y <= .39 | 92
Medium (4/90) | righthand.y > .39 && righthand.y <= .41 | 94
Medium (5/90) | righthand.y > .41 && righthand.y <= .43 | 95
Medium (6/90) | righthand.y > .43 && righthand.y <= .45 | 96
High (7/90) | righthand.y > .45 && righthand.y <= .47 | 97
Highest (8/90) | righthand.y > .47 && righthand.y <= .49 | 98

c. skel.SkeletonPositions[15]; (Left Foot)

i. This is the left foot. It controls the kick drum. When the user stomps their foot, it sends a bang message that triggers the drum sound.
ii. I saved the data to a variable called leftfoot.
iii. I only use the y data of leftfoot.
iv. At the end of the DrawSkeleton loop, I created a variable called previous_left_foot. Because it saves at the end of the loop, it represents the foot position at the previous frame.
v. If the left foot is on the ground, and the foot in the previous frame was also on the ground, then a message of 74 is output to Pure Data. This simply triggers nothing.
vi. (Else statement) If the left foot is on the ground, and in the previous frame it was not, then a message of 75 is triggered. This is what activates the kick drum.


d. skel.SkeletonPositions[19]; (Right Foot)

i. This is the right foot. It controls the snare drum. When the user stomps their foot, it sends a bang message that triggers the drum sound.
ii. I saved the data to a variable called rightfoot.
iii. I only use the y data of rightfoot.
iv. At the end of the DrawSkeleton loop, I created a variable called previous_right_foot. Because it saves at the end of the loop, it represents the foot position at the previous frame.
v. If the right foot is on the ground, and the foot in the previous frame was also on the ground, it simply triggers nothing.
vi. (Else statement) If the right foot is on the ground, and in the previous frame it was not, then a message of 77 is triggered. This is what activates the snare drum. (A code sketch of this edge detection follows the tables below.)

Ranges:

Bang | Condition | MIDI Message
Off | leftfoot.y <= -.5 && previous_left_foot <= -.5 | 74
On | leftfoot.y <= -.5 && previous_left_foot > -.5 | 75

Bang | Condition | MIDI Message
Off | rightfoot.y <= -.5 && previous_right_foot <= -.5 | None
On | rightfoot.y <= -.5 && previous_right_foot > -.5 | 77
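In code, the stomp detection is a simple edge detector. A sketch using the document's variable names (SendMidi is a hypothetical stand-in for the RtMidi sending code, not a function from the project):

    void SendMidi(int message); // hypothetical helper wrapping RtMidi

    float previous_left_foot = 0.0f; // y position from the previous frame

    void CheckLeftFoot(float leftfoot_y) {
        bool down_now    = leftfoot_y <= -0.5f;
        bool down_before = previous_left_foot <= -0.5f;

        if (down_now && down_before)
            SendMidi(74);   // foot resting on the ground: triggers nothing
        else if (down_now)
            SendMidi(75);   // foot just landed: fires the kick drum bang

        previous_left_foot = leftfoot_y; // save for the next frame
    }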

7. Pure Data Explanation

a. I found Pure Data fairly simple to understand, considering I had never used it before and this project was created in two weeks.

b. However, I was familiar with Max, which uses totally different objects but runs in a very similar manner.

c. I have posted a few basic resources and tutorials to help you out. A more basic introduction is really necessary before diving into the patch I created. Explaining everything in this documentation would be far too much, so I have listed all the resources I used to learn it.

Several basic objects must be understood:

1. moses
a. This object is the fundamental logic of the program.
b. To use moses, create a new object (Ctrl + 1) and type in moses.
c. moses takes one argument, a number. Any number passed into moses that is smaller than the argument is sent out the left outlet; anything greater is sent out the right outlet.
d. It is used to send various numbers, usually in a range of about 10, to various instruments or effect parameters.
e. See http://www.pd-tutorial.com/english/ch02s02.html

2. notein
a. This takes in the MIDI data.

3. dac~
a. This is the audio output, the final destination for audio on its way to your speakers.

4. osc~
a. This is a basic cosine wave, used for making synths; it is the basis of the chord tone.

5. phasor~
a. A basic sawtooth wave, used for making the bass tone.

6. OSC messages in Pure Data, for the Emotiv

a. This is the top-right section of the patch, labeled “THE BRAAIN”.
b. See the references below for help.

7. bng
a. This is a “bang” object.
b. It is the object drawn as a circle within a square.
c. There are two in Kinect.pd, and they trigger the kick and snare drums.

8. Kick and snare drum

a. These are quite complicated. The only thing to know is that I recreated them from patches I found on the internet, and they are saved within the project folder. Try them out on their own to modify and understand.

b. They are triggered by bang objects.

8. Emotiv Explanation

a. I am only using data from the smile detection in the Emotiv Control Panel.
b. Pure Data unpacks the OSC messages and has access to all of them.
c. In Pure Data, I use the udpreceive object to receive the OSC messages. It takes the argument 7400 because that is the port number MindYourOSCs uses.
d. The first step is to unpackOSC the messages.
e. Then I used routeOSC /EXP, which filters the results down to just expressions.
f. The best way to understand OSC objects is to use the print object in Pure Data at each step of the way to print the results to the console.
g. Next I use routeOSC again, but with the /SMILE argument.
h. At this point, Pure Data will print the value smile feeds.
i. Smile gives a 0 when the user is not smiling and a 1 when the user smiles. However, it also gives decimal numbers for smaller smiles, which we do not want; we want all the decimals to round up.
j. To do this, I take the number smile outputs and raise it to the 0th power.
k. That number, which is 0 or 1, is added to the third of the chord controlled by the user's right hand.
l. This has the effect of changing the default minor chord to major once the user smiles. (A C++ sketch of this logic follows.)
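The same logic rendered in C++, as a sketch (ThirdOfChord is a made-up name, and ceil() stands in here for the patch's pow-based rounding trick):

    #include <cmath>

    // root is the chord root chosen by righthand.x (see the table above);
    // smile is the 0..1 smile value from the Emotiv. Any partial smile is
    // rounded up to a full 1, turning the minor third into a major third.
    int ThirdOfChord(int root, float smile) {
        int offset = (int)std::ceil(smile); // 0 stays 0; 0.2, 0.7, 1.0 all become 1
        return root + 3 + offset;           // root + 3 = minor third, + 4 = major
    }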

Results: I was pleased with the resulting program. The result is a musical instrument a user can play with their body, one that forces them to do a weird little dance. I thought the foot-activated drums were especially intuitive, and with practice a user can create interesting songs. Perhaps the greater accomplishment is creating a guideline for using the Kinect's skeleton tracking to control hardware/software parameters, all with free programs. I also provided a way for each joint's x, y, and z data to be saved to a text document. This allows future students to quickly throw something together using one of the lab's Kinects, rather than digging through all the documentation and giving up.

Although I am pleased with how quickly this came together, I am disappointed with several aspects. For one, the single-message interface is somewhat limiting, as only values up to 100 can be sent in. If multiple ports can be successfully utilized in Pure Data (which should be possible), a wider variety of parameters could be altered without such complicated logic. I am also disappointed that the user must stand in a fairly specific place. I would love to figure out how to recalibrate the logic so that the user can stand wherever they want and play the instrument, or play different instruments when they move closer.

Discussion: This sprint project was awesome, as I learned the basics of the Kinect sensor, OSC messages, and MIDI messages. All of the code was written in C++, my most comfortable language, which is great because I can continue building projects of this nature. I am excited not only about what I did and learned from the project, but about what other Discovery Lab interns can do with it given more time or knowledge than I had. For this reason, I have compiled some future ideas for the project:

1. Ability to switch instruments.

a. I did not have time, but if an intern can find a way to cycle through different instruments, it would greatly enhance the user experience of the program. Pure Data has ways in which you can create sequencers that cycle through notes every time a bang message is received, so there is no reason the project couldn't be set up to cycle through instruments from a bang message.
b. For example, one could set it up so that if the user raises his/her hand really high, off to the right, or across his/her body, a bang message will be triggered that switches the instruments.

2. Ability to switch octaves

a. This could work just like the instrument switch, but raise or lower the pitch by an octave.

b. Perhaps raising the user’s hand up far above their head would raise an octave, and below their knees would lower an octave?

3. More Brain controlled parameters

a. I only used the Epoc’s smile detection, but the system is already in place for more effects to be controlled using the user’s mind.

b. I think somewhat weirder-sounding effects would be appropriate for the brain, and really “obvious”-sounding effects would blow people's minds more. Examples of these would be modulation effects.

c. Because MindYourOSCs runs on top of the Emotiv Control Panel, users can train it to recognize separate thought patterns. Imagine how shocking a device that played sounds based on separate thought patterns would be if DLAB brought it to OETC! The most crucial part of this addition would be finding methods to help users succeed, as the training and the user's actual thought patterns affect the quality of data and performance.

4. Calibration with respect to distance from the Kinect / Different parameters depending on location

a. Dr. Williams expressed an interest in creating new sounds when a user stands closer or further away from the Kinect. This would be an excellent addition to the project.

b. For reasons unknown to me, the right hand y axis has different values than the left hand y axis, even when they are raised the same height.

c. This results in the need to analyze the data for each hand and calibrate them separately.
d. If one wanted to calibrate actions based on the user's distance (z axis) from the Kinect, my recommendation is to use the z position data from the user's head.
e. This provides a neutral indicator of where the user is in 3D space.
f. Also, the user's head will not move much in the z direction, whereas for hands and other parts of the body the z axis is an intuitive motion, excellent for modifying parameters.
g. For this program, nested if statements for each significant z distance can be set up to decide different instruments based on the z location (a sketch follows this list).
h. It could also be used to find the user's z position, then adjust the x and y parameters accordingly, so that the program will work no matter where the user stands. This increases functionality, as the user will not need to figure out where to stand.
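A hypothetical sketch of that nested-if idea; the zone boundaries here are invented and would need calibrating against real head z data:

    // Pick an instrument by how far the user's head is from the Kinect.
    int InstrumentForDistance(float head_z) {
        if (head_z < 1.5f) return 0; // close zone: first instrument
        if (head_z < 2.5f) return 1; // middle zone: second instrument
        return 2;                    // far zone: third instrument
    }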

5. Create a GUI

a. An important next step is to show confirmation that the user is in the right place, or a visible representation of where each instrument is.
b. The programmer can get quite creative, but the main necessity is that the user gets some kind of visual feedback other than their skeleton, so they know they are playing the instrument.

Resource List

Pure Data:

1. http://www.pd-tutorial.com/english/ch02s02.html
a. Explains conditional logic in Pure Data.
2. http://en.flossmanuals.net/pure-data/
a. Explains almost everything you need to know. Start here at the beginning, then once you have a feel for it, research what you need.
3. http://en.flossmanuals.net/pure-data/network-data/osc/
a. Explains OSC messages.
4. http://en.flossmanuals.net/pure-data/midi/using-midi/
a. Explains using MIDI in Pure Data.
5. http://www.instructables.com/id/Beginner-Pure-Data-Tutorial-Basic-Synth/
a. Beginner basic synth tutorial.

Kinect SDK:

1. https://msdn.microsoft.com/en-us/library/jj131025.aspx
a. Basic skeleton tracking explanation.
b. This is the main resource I used. Digging around here is the best, and really the only, place you need to go.

RtMidi:

1. http://www.music.mcgill.ca/~gary/rtmidi/
a. Download link with examples and explanations.

Emotiv Epoc:

1. https://emotiv.com/store/
a. Emotiv Store. Everything with the Epoc is self-explanatory and easy to get started with.

2. https://www.youtube.com/watch?v=OQ8sgKc65i8
a. Explains how to set up the device.