Auditory Localisation: Contributions of Sound Location and Semantic Spatial Cues

A thesis submitted in fulfilment of the requirements for the degree of Master of Applied Science

Submitted by

Norikazu Yao
B.Sc. (Japan), M.Sc. (Japan)

School of Human Movement Studies
Queensland University of Technology

2007


Keywords

Auditory localisation

Spatial Stroop effect

Stimulus response compatibility

Semantic processing

Information processing

Response selection

Reaction time

Orienting


Abstract

In open-skill sports and other tasks, decision-making can be as important as physical performance. Whereas many studies have investigated visual perception, there is little research on auditory perception as one aspect of decision making. Auditory localisation studies have almost exclusively focussed on underlying processes, such as interaural time difference and interaural level difference. It is not known, however, whether semantic spatial information contained in the sound is actually used, and whether it assists pure auditory localisation. The aim of this study was to investigate the effect of spatial semantic information on auditory localisation. In Experiment One, this was explored by measuring whole-body orientation to the words “Left”, “Right”, “Back”, “Front” and “Yes”, as well as a tone, each presented from left, right, front and back locations. Experiment Two explored the effect of the four spatial semantic words presented either from their matching locations or from a position rotated 20 degrees anticlockwise. In both experiments there were two conditions, with subjects required to face the position indicated by either the sound location or the meaning of the word. Movements of the head were recorded in three dimensions with a Polhemus Fastrak system and analysed with a custom program. Ten young adult volunteers participated in each experiment. Reaction time, movement time, initial rotation direction, rotation direction at peak velocity, and the accuracy of the final position were the dependent measures.

The results confirmed previous reports of confusions between front and back locations, that is, errors about the interaural axis. Unlike previous studies, many more back-to-front than front-to-back errors were made. The experiments provided some evidence for a spatial Stroop interference effect, that is, an effect on performance of conflicting information provided by the irrelevant dimension of the stimulus, but only for reaction time and initial movement direction, and only in the Word condition. The results are interpreted using a model of the processes needed to respond to the stimulus and produce an orienting movement. They suggest that there is an asymmetric interference effect in which auditory localisation can interfere with localisation based on the semantic content of words, but not the reverse. In addition, final accuracy was unaffected by any interference, suggesting that these effects are restricted to the initial stages of response selection.


Table of Contents

KEYWORDS…………………………………………………………………….i

ABSTRACT…………………………………………………………….……….ii

TABLE OF CONTENTS…………………………………………………..…..iv

LIST OF FIGURES......…………………………………………………..……vii

LIST OF TABLES……………………………………………………………....x

LIST OF APPENDICES……………………………………………………….xi

STATEMENT OF ORIGINAL AUTHORSHIP…………………………….xii

LIST OF ABBREVIATIONS………………………………………………...xiii

ACKNOWLEDGEMENTS…………………………………………………..xiv

CHAPTER 1. INTRODUCTION and LITERATURE REVIEW…………...1

1.1 Introduction ………………………………………………………………...1

1.2 Literature Review…………………………………………………………...4

1.2.1. Auditory Localisation………………………………………………….4

1.2.2. Semantic Localisation…………………………………………………8

CHAPTER 2. EXPERIMENT ONE……….…………………………………17

2.1. Introduction..……………………………………………………………...17

2.1.1. Definition of Terms………………………………………………17

2.1.2. Research Question………………………………………………..18

2.1.3. Hypotheses………………………………………………………..19

2.2. Method………... …………………………………………………………..20

2.2.1. Participants...……………………………………………………..20

2.2.2. Experimental environment……………………………………….20


2.2.3. Apparatus…………………………………………………………22

2.2.4. Stimulus…………………………………………………………..22

2.2.5. Design and dependent variables………………………………….22

2.2.6. Procedure…………………………………………………………25

2.2.7. Analysis and Statistics ..…….……………………………………25

2.3. Results...………..…………………………………………………………..26

2.3.1. Reaction Time…………………………………………………….26

2.3.2. Initial Rotation Direction…………………………………………31

2.3.3. Rotation Direction at Peak Velocity……………………………...32

2.3.4. Movement Time………………………………………………….33

2.3.5. Front-Back Reversals…………………………………………….35

2.3.6. Constant Error…………………………………………………….36

2.3.7. Reliability………………………………………………………...37

2.4. Discussion…………………………………………………………………..38

2.4.1. Spatial Stroop Effect…………………….………………………..38

2.4.2. Front Back Confusion…………………………………………….42

CHAPTER 3. EXPERIMENT TWO...……………………………………….45

3.1. Introduction…..…………………………………………………………...45

3.1.1. Hypotheses………………………………………………………..45

3.2. Method……………………………………………………………………..45

3.2.1. Experimental environment……………………………………….46

3.2.2. Design…………………………………………………………….46

3.2.3. Procedure………………………………………………………....47

3.2.4. Analysis and Statistics……………………………………………47


3.3. Results...……………………………………………………………………48

3.3.1. Reaction Time…………………………………………………….48

3.3.2. Initial Rotation Direction…………………………………………49

3.3.3. Movement Time………………………………………………….49

3.3.4. Front-Back Reversals…………………………………………….50

3.3.5. Constant Error…………………………………………………….50

3.3.6. Reliability………………………………………………………...53

3.4. Discussion…………………………………...……………….…………….54

CHAPTER 4. GENERAL DISCUSSION……………………………………….57

4.1. Overview of Results……………………………………………………….57

4.2. Auditory Information Processing ………………………………………..58

4.3. Limitations of the Study and Future Research………………………….61

4.4. Conclusions………………………………………………………………...62

REFERENCE LIST...…………………………………………………………63

APPENDICES…………………………………………………….……………66


List of Figures

Figure 1.1 Interaural Time Difference and Interaural Level Difference………………6

Figure 1.2 The cone of confusion………………7

Figure 1.3 Spatial Stroop task: congruent stimulus and incongruent stimulus………………13

Figure 2.1 Left and right congruity between semantic spatial stimulus and position of the sound source………………18

Figure 2.2(a) Layout of apparatus: overhead view………………21

Figure 2.2(b) Layout of apparatus: lateral view………………21

Figure 2.3 Directional error calculation illustrated for a “Back” target. A and B depict two possible head orientations relative to the target shown by the loudspeaker………………24

Figure 2.4 Reaction Time for each stimulus in the Location condition………………27

Figure 2.5 Reaction Time for each stimulus in the Location and Word conditions………………28

Figure 2.6 Reaction Time for each location (Left, Right, Front and Back) in the Location and Word conditions………………29

Figure 2.7 Reaction Time for congruent and incongruent stimulus-location pairs (“Left” from the left and right sides, and “Right” from the left and right sides)………………30

Figure 2.8 Initial Rotation Direction in the Location and Word conditions………………32

Figure 2.9 Rotation Direction at Peak Velocity in the Location and Word conditions………………33

Figure 2.10 Movement Time for each stimulus in the Word and Location conditions………………34

Figure 2.11 Movement Time for each location in the Word and Location conditions………………34

Figure 2.12 Constant Error for each stimulus and each location in the Location and Word conditions………………36

Figure 2.13 Reliability for each location in the Location and Word conditions………………37

Figure 3.1 Layout of apparatus for Experiment Two………………47

Figure 3.2 Reaction Time for the four stimuli in the Word condition and for the four locations in the Location condition………………48

Figure 3.3 Initial Rotation Direction for each stimulus in the Non-Rotated and Rotated conditions………………49

Figure 3.4 Movement Time for each stimulus or location in both conditions………………50

Figure 3.5 Constant Error for each stimulus in both the Location and Word conditions………………51

Figure 3.6 Constant Error for the Non-rotated and Rotated conditions in the Location condition………………52

Figure 3.7 Constant Error for the Non-rotated and Rotated conditions in the Word condition………………52

Figure 3.8 Reliability of Constant Error in the Location and Word conditions………………53

Figure 4.1 Information processing model for orienting movement………………59


List of Tables

Table 2.1 Frequency of front-back reversals in the Location condition (number and percentage of trials in each combination of location and stimulus)………………35

Table 2.2 Frequency of front-back reversals in the Word condition (number and percentage of trials in each combination of location and stimulus)………………35

Table 3.1 Frequency of front-back reversals in the Non-rotated and Rotated conditions (number and percentage of trials in each combination of location and stimulus)………………50

Table 4.1 Effects for each variable………………58


List of Appendices

Appendix A: Informed Consent Form and Participant Information Packages………………68

Appendix B: Statistical Analyses (6 stimuli × 4 locations) and Graphs in Experiment One………………70

Appendix C: Means and Standard Deviation Tables in Experiment One………………73

Appendix D: Statistical Analyses (2 conditions × 4 stimuli × 4 locations) in Experiment One………………77

Appendix E: Means and Standard Deviation Tables for Yes and Tone Stimuli in Experiment One………………82

Appendix F: Means and Standard Deviation Tables in Experiment Two………………86

Appendix G: Statistical Analyses (2 conditions × 2 rotations × 4 stimulus-locations) in Experiment Two………………89


Statement of Original Authorship

The work contained in this thesis has not been previously submitted to meet requirements for an award at this or any other higher education institution. To the best of my knowledge and belief, the thesis contains no material previously published or written by another person except where due reference is made.

Signature

Date


List of Abbreviations

ITD    Interaural time difference
ILD    Interaural level difference
HRTF   Head-related transfer function
S-R    Stimulus-response
RT     Reaction time
IRD    Initial rotation direction
RDPV   Rotation direction at peak velocity
MT     Movement time
F-B    Front-back
CE     Constant error
R      Reliability


Acknowledgements

I would like to acknowledge the large contribution of my Principal Supervisor, Dr Charles Worringham, to this project; his patience and continued support have been invaluable. I would also like to acknowledge my Associate Supervisor, Tom Cuddihy, for his timely advice throughout the year and his help in keeping my aims within the realms of possibility during the early stages of the study. I would like to thank Alan Barlow for his technical support.

Thanks to my friends and family for putting up with me, not only during these two years but throughout my time studying in Australia, including the period spent learning English. A special thanks to Sato for constant encouragement and devoted support.


Chapter 1. INTRODUCTION and LITERATURE REVIEW

1.1 Introduction

In open-skill sports such as soccer, basketball, handball and rugby football,

recognising the play and making decisions can be as important as physical

performance. Many studies (Abernethy, 1991; McMorris, 1997; Nougier & Rossi,

1999) have investigated the recognition of game situations and decision making

in ball games. Most of these studies recorded eye movements and evaluated the

participant’s gaze position. They have indicated that a visual search strategy is

essential for recognition of the game situation. However, visual perception is not

the only method for acquiring information; auditory perception can also play a

role. This occurs, for example, when players in games like soccer or basketball

indicate their location and readiness to receive a pass. There has been little

research on auditory perception as one aspect of decision making, whether in

sport or other situations, and this will be the focus of the current study.

In open-skill games, players have to recognise the game situation to select the

appropriate play (Abernethy, 1991; Williams & Grant, 1999). For example, in

rugby, there are various factors which determine the proper play, such as the

position of the ball and player, the number of opponents and support players,

time remaining, the score and even the weather conditions. In particular, the

number and position of the opponents and support players, and the location of the

ball are consistently important factors for the ball carrier in determining what

play to make. The opponents are usually in front of the ball carrier, whereas the

support players are behind him/her because of the unique rules of rugby football.

Therefore, the ball carrier must pass backwards. There is some consensus among


coaches that the ball carrier appears to gather information about the number of

opponents and their positions by visual perception, whereas information about

support players seems to be acquired not only by visual but also by the auditory

sensory system. The reason for this use of auditory information is that the ball carrier cannot look at players both in front and behind at the same time in rugby; verbal communication is therefore one of the important elements of the game. Auditory perception is essential to localize the position of support players, especially for the ball carrier. Not only can a player localize a team-mate by direct auditory localisation if that player calls, but the call itself may contain spatial information. For example, “Left” or “Behind” indicates the relative position of the support player,

whereas some calls provide no absolute spatial information (e.g. “Here”, or the

player’s name). Whether such semantic spatial information is actually useful is

not known. Therefore this study will test the assumption that semantic spatial

information can be used to localize a person’s position. Specifically, it will test

whether semantic spatial information improves auditory localisation compared to

stimuli with no semantic spatial content. While many situations in competitive

sports have potential conflict between semantic spatial stimuli and pure auditory

localisation, and these situations were the original motivation for this study, they

may also occur in many other settings, for example, public address systems,

auditory warning systems in vehicles and on machinery. Therefore, while some

of the following literature review emphasises decision-making in sport, the

problem is a more general one.

In decision making in sports situations, many studies have examined the relation

between perception and cognition, particularly visual perception (Abernethy,


1991; Gardner & Sherman, 1995; Helsen & Sherman, 1999; Starkes, 1987;

Williams, Davids, Burwitz, & Williams, 1994; Williams & Grant, 1999).

Abernethy (1991) examined the influence of vision on the selection of plays by

comparison between expert and novice players. In this research, it was shown

that expert players were significantly faster in reacting to a tennis serve and more

precise in anticipating the next play than novices. In addition, Williams and

Grant (1999) have examined the difference between experienced and

inexperienced soccer players in visual search strategies. They indicated that

experienced players fixate on a smaller set of points than novices in 1-on-1 and 3-on-3 defensive game situations. Nakagawa (1985) discussed the factor of structure in decision making and, in discussing information processing in ball games, emphasised the perception and recognition stages, as these are the first and essential stages in decision making. Unlike the well-studied topic of

visual perception in sport, there are few studies of auditory perception and

recognition of game scenes. An interview study of university soccer players

(Daus, Wilson, & Freeman, 1989) investigated the influence of perception on

mental strategies, such as decision making skill, creativity, and memory. The

results indicated that the auditory sense is least utilized. Creativity and decision

making are dominated by the visual sense. However, one key feature of auditory

perception in sport is the localisation of players.

In basic research on auditory localisation, by contrast, a large number of studies

have been conducted. Auditory localisation studies have almost exclusively

focussed on underlying processes, such as Interaural Time Difference and

Interaural Level Difference. It is not known, however, whether semantic spatial


information contained in the sound is actually used, and whether it assists pure

auditory localisation. This will be assessed by presenting semantic information in

conditions where the true location of the sound does or does not agree with the

location indicated by that semantic information, and determining whether this

affects auditory localisation. The aim of this study is to investigate the effect on

auditory localisation of spatial (and non-spatial) semantic information.

1.2. Literature Review

1.2.1. Auditory Localisation

Auditory localisation studies have been conducted in the psychological or

physiological fields for a century, and have usually considered three different

coordinate systems: direction, elevation and distance. Direction and elevation

were defined by Knudsen, Hasbroucq, & Osman (1982) as, respectively, the

angle given by the sound source, the centre of the listener’s head, and the median

plane in the horizontal dimension (i.e. horizontal judgments), and the angle given

by the sound source, the centre of the head, and the horizontal plane (i.e. vertical

judgments). Direction judgments are the focus of this study of the role of

semantic spatial cues, as they are more likely to occur in this dimension than in

the vertical plane. Distance judgments are of less relevance to the topic of this

study.

Many studies have been performed on directional localisation in the past century.

Interaural Time Difference (ITD) and Interaural Level Difference (ILD) are

mechanisms for localisation which were originally put forward by Lord Rayleigh

(Begault, 1994). Interaural differences refer to differences in properties of the


stimuli reaching the left and right ears. A large number of sound localisation

studies were stimulated by these theories. There are several findings that have

been proposed as evidence for ITD and ILD, outlined below.

1.2.1.1. Interaural Time Difference

Sounds originating from many locations in space reach one ear before the other.

The difference between the time that the sound reaches the left and right ears is

called the Interaural Time Difference (Wightman & Kistler, 1989a); see Figure 1.1.

When the sound is located in front of the listener, as the distance to each ear is

the same, the sound reaches the left and right ears at the same time. However, if a

source is located on the right side, the sound reaches the right ear before it

reaches the left ear. The maximum time difference from near ear to far ear

averages 650 microseconds (Begault, 1994), and it has been shown that human

beings can detect 10 microsecond differences (Durlach & Colburn, 1978). Thus

auditory localisation can be based on the time difference between the sound

reaching the near ear and far ear. However, a usable ITD cue is not available for every

sound. Some studies (Klump & Eady, 1956; Zwislocki & Feldman, 1956) have

shown the influence of different kinds of stimuli on localisation performance by

manipulating the frequency of the stimuli. Henning (1980) has shown that when

stimuli are given at frequencies higher than approximately 1.5kHz, humans are

not capable of detecting any difference in the times of arrival of the sound at

each ear, i.e. they cannot detect an ITD. The ITD does not function well with

high frequency sound, so other difference cues are used for auditory localisation.
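As an illustration of the magnitudes involved, the sketch below applies Woodworth's spherical-head approximation of the ITD. This formula is not given in the thesis; the head radius of 8.75 cm, the speed of sound of 343 m/s and the function name are assumptions made for illustration only.

    import math

    def itd_spherical_head(azimuth_deg, head_radius_m=0.0875, c=343.0):
        """Woodworth's spherical-head approximation of the interaural time
        difference (in seconds) for a source at the given azimuth
        (0 = straight ahead, 90 = directly to one side)."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / c) * (math.sin(theta) + theta)

    # A source directly to one side gives the maximum ITD:
    print(round(itd_spherical_head(90) * 1e6))  # ~656 microseconds, close to
    # the ~650 microsecond average cited above.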


Figure 1.1 Interaural Time Difference & Interaural Level Difference.

1.2.1.2. Interaural Level Difference

The other binaural cue is the Interaural Level Difference, that is, the difference in the level of the sound that reaches the two ears because a listener’s head interrupts the path from the source to the far ear (Middlebrooks & Green, 1991). The head creates an acoustic shadow that attenuates high frequencies at the far

ear (Figure 1.1). The sound’s wavelength determines the amount of the

shadowing, as it may be larger or smaller than the subject’s head. There is little

difference in intensity for frequencies below about 1000 Hz, but quite large

differences in intensity occur for higher frequencies. Therefore, it is thought that

ITD is used for low frequency sounds and ILD is used for high frequency sounds (Middlebrooks & Green, 1991).

1.2.1.3. Front-Back Confusions and the Cone of Confusion

It has been shown that both ITD and ILD are important parameters for auditory

localisation in the directional plane (e.g., left and right). Although directional

auditory localisation can be explained by ITD and ILD theories, listeners

sometimes make judgment errors in localising the sound source, for example,


front-back and elevation ambiguity, and ambiguity of rear space localisation.

Figure 1.2 shows that systematic confusion exists in auditory localisation, termed

the cone of confusion (Begault, 1994). A sound source at position A would

produce the same ITD and ILD as a source at position B, and similarly for

sources at positions X and Y. The ITD and ILD from A are the same as from B,

because they have the same distance and angle from each ear. The ability to

localise sound sources within the cone of confusion is thought to be assisted by

spectral cues. These spectral cues will differ because of the asymmetry of the

pinna, even though ILD and ITD are not different, as explained below.

Figure 1.2. The cone of confusion.

1.2.1.4. The Head-Related Transfer Function

Though there is ambiguity caused by the cone of confusion, the listener can

usually differentiate sound originating from points inside the cone (i.e. front or

behind, and above or below). Therefore it has been suggested that the ability to

localize sounds, especially in the median plane, is evidence for a monaural

hearing mechanism, termed the ‘Head-Related Transfer Function’ (HRTF). The

HRTF mechanism relies on the spectral coloration of a sound produced by the


torso, shoulder, head, and external ear or pinna (Geierlich, 1992). The HRTF

effects depend on the frequency of the sounds and on the direction and plane of

the sound. In the range 100 Hz-2 kHz, for example, Genuit (1984), cited in Begault (1994), reported that the upper body and shoulders have an effect that is

influenced by direction, and that this is especially the case in the median plane.

Blauert (1983) revealed that the characteristics of the cavum conchae led to a

difference in the effects of sounds coming from in front and behind the subject.

At 10 kHz, this difference was about 5 dB. Some studies have shown that errors

are greatest in rear space, because the spectrum of the sound from behind is altered by the pinna (Middlebrooks, 1992; Oldfield & Parker, 1984, 1986).

1.2.2. Semantic Localisation

Human beings may be able to use not only physical sound but also verbal

semantic information. In everyday life, the most important sound stimulus is often a

word. As previously mentioned, players usually communicate with each other to

make decisions in team ball games. Verbal information may be used to assess the surrounding situation. However, while semantic information

may have an advantage by making it easier to localize the position and imagine

the situation, it may also have the disadvantage of potential interference in

reacting to semantic stimuli.

1.2.2.1. Possible Advantage of Semantic Information for Localisation

Recent studies have investigated the influence of various stimuli on sound

localisation (Klatzky, Lippa, Loomis, & Golledge, 2002, 2003; Loomis, Lippa,

Golledge, & Klatzky, 2002; Muller & Bovet, 2002; Muller & Kutas, 1996).


Muller and Bovet (2002) investigated sound localisation using the participant’s

name as a stimulus. Results indicated that reaction times were shorter when participants localised their own first name than other first names; however, responses to one’s own name were not significantly more accurate than to other names. Loomis et al. (2002) also studied spatial language and sound localisation in a navigation task. Spatial stimuli such as “1 o’clock, 3 m” created a stronger spatial image and led to more precise

localisation than 3-D sound. With 3-D sound stimuli, the participants

underestimated the sound distance. Klatzky et al. (2002) found that spatial

language significantly benefited directional localisation and distance judgment.

This study demonstrates that spatial semantic terms indicating both direction and

distance can be used to assist in navigation. However, it is reasonable to suppose

that the accuracy of performance depends on how specific the information is, and

whether the subject is familiar with the spatial units used (e.g., “metres”).

1.2.2.2. Interference by Semantic Stimuli

In contrast to the increased accuracy for localisation shown by Loomis et al. (2002),

semantic information could theoretically degrade performance if rapid reactions

to a stimulus are required, and if the semantic spatial stimuli do not match the

sound source. Like other types of stimuli, semantic spatial stimuli can have

many features or “dimensions”. When stimulus dimensions have overlap

(Kornblum, Hasbroucq, & Osman, 1990), they can be either congruent, in which

case one dimension (e.g., side of stimulus) and another dimension (e.g., meaning

of stimulus) are the same; or incongruent, meaning that these dimensions are

different. The interference between incongruent stimulus dimensions when this


dimensional overlap is present delays RT and reduces accuracy of responses. In

this study, the tendency for performance to be affected by the congruence of

stimulus dimensions will be used to help determine whether semantic spatial

information in verbal stimuli is processed during auditory localisation.

1.2.2.3. Semantic Information

There are many studies which have investigated semantic information processing,

in tasks other than localisation, such as semantic priming (Neely, 1991). As

described by McNamara (2005), semantic priming refers to the faster and more

accurate responses to a stimulus when that stimulus and the one that precedes it

are semantically related. For example, responses to the word “dog” will be faster

if the previous stimulus is semantically related (e.g. “cat”) than if it is not (e.g.

“table”). Therefore the stimulus semantically related to the response facilitates

the speed and accuracy of that response. Semantic priming has been extensively

investigated in studies of lexical and other forms of decision-making. If reaction

time and accuracy can be influenced by semantic properties of stimuli even when

they are not the “target” stimulus, it is reasonable to expect that reaction time and

accuracy of localisation will also be affected by semantic properties of stimuli.

1.2.2.4. Taxonomy of S-R Compatibility

A classic taxonomy (Kornblum et al., 1990) is founded on the concept of

dimensional overlap and classifies stimulus-response (S-R) compatibility into

eight categories. This classification is determined by whether overlap exists

between the relevant and irrelevant stimulus dimensions, between the relevant

stimulus dimension and the response dimension, and between the irrelevant


stimulus dimension and the response dimension. In each case, ‘relevant’ refers to

the dimension which participants are to base their responses on; ‘irrelevant’

refers to any other dimension. In the next sections, some examples involving

spatial dimensions are outlined.

1.2.2.5. The Simon Effect

The Simon effect is classified as a Type 3 ensemble, that is, the relevant stimulus

dimension has no overlap with the response dimension but the irrelevant stimulus

dimension does (Kornblum et al., 1990). In the Simon task, for example, the

relevant stimulus dimension is a non-spatial feature, such as colour or shape,

assigned to left and right responses, and the location in which the stimulus occurs

is irrelevant (Simon, 1990). This effect was first described in experiments using

auditory stimuli (Simon, Craft, & Webster, 1973; Simon & Small, 1969). In this

research, participants made left or right key-presses to low- or high-pitched tones.

On any trial, the tone was presented to either the left or right ear. Responses to

the “right” (i.e. high-pitched tone) were 62 msec faster when it was heard in the

right ear rather than the left ear, while responses to the “left” (i.e. low-pitched

tone) were 60 msec faster when the stimulus was presented in the left ear rather

than the right ear. Even with practice, the Simon effect is not eliminated. Lu and

Proctor (1995) indicated that the Simon effect represents a fundamental aspect of

human information processing.

Moreover, in a related Simon-like task (not using auditory stimuli), participants

are asked to give lateralised responses on the basis of a non-spatial feature of

stimuli that are presented in different locations. For instance, participants might


be instructed to press a left key when they see a green circle and to press a right

key when they see a blue circle. The green and the blue circles are presented on

the left or the right side of the screen. Results have shown that participants are

faster in pressing the left key in response to a stimulus on the left side than to a

stimulus on the right side, while the reverse is true when participants must press

the right key.

1.2.2.6. The Stroop Effect

The normal Stroop effect is evident in tasks which require naming the ink colour of printed colour words. This effect refers to interference between the name of the colour

and the name of the word (Stroop, 1992). This task is classified as a Type 8

ensemble in Kornblum’s taxonomy (Kornblum et al., 1990). The participants in

such experiments are slower to name the colour of the ink in which an

incongruent colour word is printed relative to a control condition of naming

coloured squares. However, reading the colour word is less affected by the ink

colour in which it was printed.

1.2.2.7. Spatial Stroop Effect

In a variation of the Stroop task, spatial interference has been investigated. This

is referred to as the “spatial Stroop effect” and is of particular relevance here.

However, only a very small number of these studies are relevant to the current

study.

White (1969) used directional word stimuli, NORTH, SOUTH, EAST, and

WEST. The words were presented inside a square, at the top, bottom, left, and


right, implying the direction, respectively. The participant reported the location

where the stimulus appeared by saying the appropriate direction name verbally.

As a control condition, a row of asterisks (****) was used. Interference scores

were calculated by taking the ratio of the time to respond to a list of 80 items for

the incongruent condition with respect to the time to respond to a list of the same

length in the control condition. The score was 1.2 for the naming responses but

close to 1.0 for the manual responses, indicating that an incongruent word slowed

vocal but not manual responses to stimulus location. In addition, Seymour (1974)

presented the words ABOVE, BELOW, LEFT, and RIGHT to participants on a monitor. These words affected the naming of the position of

the word relative to a central dot (Figure 1.3).

Figure 1.3. Spatial Stroop task. Congruent stimulus (left), and incongruent stimulus (right).

Responses to the word’s location were particularly slow when the word specified

the opposing location on the same dimension. Moreover, Shor (1970) put

the words, LEFT, RIGHT, UP and DOWN into arrows pointing in directions

incongruent with the inserted word and obtained interference in naming the


direction of the arrow. Two task conditions were used. In the Arrow task condition, subjects were instructed to identify (name) the direction of the arrow, ignoring the word; in the Word task condition, subjects were instructed to identify (read) the

words, ignoring the surrounding arrows. The results of these studies showed that

RT for naming the direction in which the arrow pointed was slower than naming

the direction word. In addition, naming the direction of the arrow was slower

when the word in the arrow was incongruent, for example, the arrow which

contained the word RIGHT pointing to the left. However, the naming of

the direction word was not significantly slower when the arrow was incongruent.

This suggests that word meaning can interfere with naming a direction that is

indicated symbolically, but not the reverse. In these previous studies, however,

the responses involved naming or saying the location or direction verbally.

Naming and saying the word stimulus were not affected by the incongruent

spatial dimension because these incongruent features or stimuli are not the word

but a shape or location. In other words, the participant has to translate the shape

or location into a word in order to say it, but can use the word directly for a

response to a word stimulus. This leaves it unclear whether the same asymmetry

(the incongruent word slowing performance but not the incongruent arrow)

would be found if spatial, rather than verbal, responses were used. In the present study, however, orienting movements were used as the response. Unlike Shor’s

task, in which reading words was not affected by incongruent arrows, this

orienting task could be expected to show more clear-cut interference.

Palef and Nickerson (1978) examined the influence of spatial semantic

information on the speed and accuracy of button-presses, using an auditory


spatial Stroop-like task. They used four spatial word stimuli: “Left”, “Right”,

“Front” and “Back”, and a neutral stimulus “X”, delivered from speakers located

at the Left, Right, Front and Back of the subject. This study was composed of

two conditions, Location and Word. In the Location condition, subjects were

required to respond to the location of the word, by pressing the appropriate

button, while attempting to ignore its meaning. By contrast, in the Word

condition, subjects needed to respond to the meaning of the word stimulus

regardless of its location. As before, when the stimulus dimension corresponded

to the location dimension (e.g., the stimulus “Left” presented from the Left location), the condition was labelled congruent. On the other hand, when the stimulus and location dimensions did not correspond (e.g., the stimulus “Left” presented from the Right location), this represented an incongruent condition. This

study compared the reaction time for these congruent and incongruent conditions,

using both Location and Word instructions. They found that there were effects of

congruency in the Location but not in the Word condition. Relative to the neutral

condition, these effects included both facilitation for congruent conditions, and

interference for incongruent conditions. However, these effects were confined

to stimuli presented at the Front and Back locations. The authors concluded that

the results supported the proposal that a necessary condition for the occurrence of

Stroop-like effects is that the irrelevant information is processed faster than, or at

least as fast as, the relevant information. This would explain why they found no

effect of congruence in the Word condition, because they claim that Word

meaning was processed faster than location (at least for Front and Back

conditions). However, this interpretation may be questioned. It relies on

comparing the neutral stimuli in the two conditions. In the Word condition, the


neutral stimulus was a single spatial word (e.g. Left), delivered from all speakers

simultaneously. This is different from the neutral stimulus in the Location

condition.

Another limitation of this study, however, is the fact that it used a key-press

response. A key-press is less closely related to the stimulus, and, in order for the

correct button to be chosen, the task may need more cognitive manipulation than

an orienting movement. An incongruent verbal stimulus in a Word condition (e.g.,

the call of “Left” from the right side) might affect auditory localisation if this

involves an orienting movement. In addition, this study made no mention of

front-back confusions in the results or discussion. If speakers were positioned in front and behind, specifically on the midline, front-back confusions could be

expected to affect the responses. The present study was concerned with both

front-back confusion and the spatial Stroop effects in orienting movements.

In summary, it is known that humans can localise the source of a pure tone using three different processes that are independent of any semantic content. However, although it is known that spatial semantic information affects auditory localisation in tasks such as key-pressing, it is unclear whether the same is true for more natural responses, such as whole-body orienting movements. In addition, measuring the whole movement provides additional measures beyond reaction time, and these may help clarify the underlying processes.


Chapter 2. EXPERIMENT ONE

2.1. Introduction

The aim of this study was to investigate the effects of spatial and non-spatial

semantic information on sound localization using orienting movements. This

research used methods similar to those of Palef and Nickerson (1978); however,

it differed in several important ways, especially in that a whole body orienting

movement was used, rather than simple button-pressing. Turning to orient to a

sound is a far more typical response and may not make use of the same processes

as those in the button-pressing task, because the sound source directly indicates

the absolute spatial target for orienting, but only indicates the same relative

position for button-pressing. This is because all the buttons were located in front

of the participant in the Palef and Nickerson (1978) study. The latter task may

therefore require additional transformations between stimuli and responses.

2.1.1 Definition of Terms

A non-spatial semantic, or neutral, stimulus is defined as any word that can be

localized through auditory perception, but which has no spatial meaning, such as

the word “Yes”, while a spatial semantic stimulus means a word which has

spatial meaning, such as “Left”, “Right”, “Front” and “Back”. Congruent refers to a

situation in which, for instance, the spatial stimulus “Right” comes from the right

side, or in which the position of a sound source is the same as the semantic

spatial cue (Figure 2.1). In addition, incongruent is defined as a situation in

which the position of the sound source is different from the spatial semantic

stimulus, such as the word “Right” coming from the left, front or back location.

A physical stimulus refers to a non-verbal sound, such as a tone.
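To make these definitions concrete, the following minimal sketch classifies a stimulus-location pairing as congruent, incongruent or neutral in the sense used above; the function and labels are illustrative only, and a tone stimulus would be treated as neutral in the same way as “Yes”.

    SPATIAL_WORDS = {"Left", "Right", "Front", "Back"}

    def congruence(stimulus, location):
        """Classify a trial using the definitions of Section 2.1.1: neutral for
        stimuli with no spatial meaning, congruent when the word matches its
        source location, incongruent otherwise."""
        if stimulus not in SPATIAL_WORDS:
            return "neutral"
        return "congruent" if stimulus == location else "incongruent"

    print(congruence("Right", "Right"))  # congruent
    print(congruence("Right", "Left"))   # incongruent
    print(congruence("Yes", "Back"))     # neutral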


Figure 2.1 Left and right congruity between spatial semantic stimulus and position of the sound source: ↑ shows direction in which participant is facing in initial position.

2.1.2. Research Question

Does the spatial semantic content of a word affect the auditory localisation of

that stimulus? By using stimuli with or without spatial semantic content, and by

making the semantic content either congruent, incongruent, or neutral relative to

the actual stimulus location, it was possible to answer this question.


2.1.3. Hypotheses

2.1.3.1. Hypothesis 1

The speed and accuracy of an orienting movement to verbal stimuli depend on

both auditory localisation of the actual sound and spatial semantic localisation if

it has spatial semantic content. The rationale for this hypothesis is that if spatial

semantic information is not used, then performance will be unaffected by the

addition of spatial semantic content.

The predictions of this hypothesis that were tested are:

1a) Reaction time will be longer when spatial semantic cues and auditory

location cues contained in a stimulus are incongruent rather than neutral

or congruent.

1b) Directional accuracy will be lower when spatial semantic cues and

location cues contained in a stimulus are incongruent rather than neutral

or congruent.

2.1.3.2. Hypothesis 2

Although it was not the primary aim of this study, the experiments offered an

opportunity to test predictions about the occurrence of front-back confusions.

The following hypothesis was formulated:

When auditory location cues and spatial semantic cues are incongruent,

confusion between front and rear occurs more than for left and right. The

rationale for this hypothesis is the ambiguity between front and rear space that arises because front and back locations produce similar ITD and ILD cues.

Two predictions of this hypothesis were tested:


2a) Reaction time will be longer for front and back than for left and right

positions.

2b) Directional accuracy will be lower for front and back than left and

right positions. Directional judgments will tend to be opposite the front

and back positions (i.e. reversed around the left-right or transverse axis).

2.2. Method

2.2.1. Participants

Ten Queensland University of Technology students aged 18-32 (3 women, 7 men)

who reported no auditory problems were selected as participants. All recruitment

and other procedures were approved by the QUT Human Research Ethics Committee.

2.2.2. Experimental environment

The experiment took place in a semi-reverberant room measuring 9.5m in width,

5.25m in length and 3.40m in height. Loudspeakers were placed 2.5m away from

the participant’s standing location in four positions, front, back, left, and right at

a height of 1.5m (Figure 2.2a). A visual surround was positioned between the

participant and the loudspeakers to occlude the participant’s view of the

loudspeakers. The visual surround comprised sections of dark grey cloth draped

from a series of 12 wooden frames, centred on the participant’s standing position.

The visual surround was 2.25m in diameter.


Figure 2.2(a). Layout of apparatus: Overhead view.

Figure 2.2(b) Layout of apparatus: Lateral view.


2.2.3. Apparatus

Stimulus presentation and data collection were controlled by a Toshiba computer, using a custom program written in the LabVIEW programming language (Version 6.0, National Instruments). Directional error and reaction time were measured from head movements recorded by the Polhemus Fastrak motion

analysis system, which permitted the 3-dimensional position and 3-axis

orientation of a head-mounted sensor to be tracked at 60Hz. A sensor was

mounted on a tight fitting skull-cap worn by the participant.

2.2.4. Stimuli

The verbal stimuli were recorded in a male voice, and were presented through

one of eight loudspeakers. The duration of each stimulus was approximately

500ms and its intensity was 64dB, measured with an industrial sound meter.

2.2.5. Design and dependent variables

The experiment had two main conditions (Location and Word). The form of the

Location condition was a 6 (stimulus type) × 4 (position) pseudo-randomised

design. There were 5 repetitions of each of these 24 conditions. The independent

variables were stimulus type (the words “Left”, “Right”, “Front”, “Back”, “Yes”, and a physical sound (Tone)) and position (Left, Right, Front, and Back). Front is

aligned with the reference position shown in Figure 2.3. The Word condition was

a 4 (stimulus type; the words “Left”, “Right”, “Front” and “Back”) × 4 (position:

Left, Right, Front and Back) design, with each combination again having five

repetitions. For both conditions, each combination of stimulus and location was


used once before any repetition occurred, making 5 consecutive blocks of trials,

each containing all combinations.
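As an illustration of this constraint (each combination used once before any repetition), the sketch below generates such a blocked pseudo-random order; it is not the original LabVIEW program, and the labels and function name are illustrative.

    import itertools
    import random

    def blocked_trial_order(stimuli, locations, n_blocks=5, seed=None):
        """Return a trial list in which every stimulus-location combination
        appears exactly once per block, with trials shuffled within blocks."""
        rng = random.Random(seed)
        combos = list(itertools.product(stimuli, locations))
        order = []
        for _ in range(n_blocks):
            block = combos[:]
            rng.shuffle(block)
            order.extend(block)
        return order

    # Location condition: 6 stimulus types x 4 positions x 5 blocks = 120 trials.
    trials = blocked_trial_order(["Left", "Right", "Front", "Back", "Yes", "Tone"],
                                 ["Left", "Right", "Front", "Back"], seed=1)
    print(len(trials))  # 120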

2.2.5.1. Reaction time

Reaction time was defined as the interval between the sound being presented and

the participant starting to move. This was obtained in post-collection data

analysis with a computer program algorithm which identified the first sample

changing by more than 0.5 degrees in rotation or inclination, the lowest practical

threshold based on pilot testing. A graphical display of the kinematics of each

movement allowed visual checking of the point at which the movement was

identified as starting.
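A minimal sketch of this kind of onset detection is given below, assuming 60 Hz samples of head rotation and inclination (in degrees) and the 0.5-degree threshold described above. Whether the change is measured relative to the posture at stimulus onset, as here, or sample-to-sample is an assumption; this is not the thesis's own analysis code.

    def reaction_time_s(samples, stim_index, threshold_deg=0.5, rate_hz=60.0):
        """Time (s) from stimulus onset to the first sample whose rotation or
        inclination differs from the stimulus-onset posture by more than the
        threshold; returns None if no movement is detected.

        samples: sequence of (rotation_deg, inclination_deg) pairs at 60 Hz.
        stim_index: index of the sample at which the sound was presented."""
        rot0, inc0 = samples[stim_index]
        for i in range(stim_index + 1, len(samples)):
            rot, inc = samples[i]
            if abs(rot - rot0) > threshold_deg or abs(inc - inc0) > threshold_deg:
                return (i - stim_index) / rate_hz
        return None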

2.2.5.2. Initial rotation direction

The initial rotation direction (IRD) was determined as the direction of the

movement (clockwise or anticlockwise) in the first 100 ms following movement

onset. It was expressed as an index by calculating the frequency of clockwise

movements across repetitions within subject, such that 1 = 100% clockwise

movements and 0 = 100% anticlockwise movements.

2.2.5.3. Rotation direction at Peak Velocity

The direction of motion at the time of peak rotation velocity (RDPV) was

analysed in the same way as IRD.
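A sketch of how such an index might be computed, assuming each repetition has already been classified as clockwise or anticlockwise; the same function would serve for both IRD and RDPV, and the labels are illustrative only.

    def clockwise_index(directions):
        """directions: one 'cw' or 'acw' label per repetition.
        Returns 1.0 for all-clockwise and 0.0 for all-anticlockwise."""
        directions = list(directions)
        return sum(d == "cw" for d in directions) / len(directions)

    print(clockwise_index(["cw", "cw", "acw", "cw", "cw"]))  # 0.8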

2.2.5.4. Movement time

Movement time was determined as the time from the onset of the movement to the

end of the movement.

2.2.5.5. Constant error

Directional error (or Constant Error, CE) was defined as the difference between the target and judged positions, and was indicated as positive (+n°) for errors in a clockwise direction and negative (-n°) for errors in an anti-clockwise direction. Figure 2.3 illustrates this directional error calculation: if the participant judged the target sound source to be at orientation A, the error would be -18°, whereas orientation B would correspond to an error of +10°.
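A sketch of a signed directional-error computation under these conventions (positive for clockwise error, wrapped to the range -180° to +180°) follows; the assumption that headings increase clockwise, and the function name, are illustrative rather than taken from the analysis program.

    def constant_error_deg(target_deg, response_deg):
        """Signed directional error in degrees, positive for a clockwise error,
        wrapped into the range (-180, 180]."""
        err = (response_deg - target_deg) % 360.0
        if err > 180.0:
            err -= 360.0
        return err

    # A response 18 degrees anticlockwise of a "Back" target at 180 degrees:
    print(constant_error_deg(180.0, 162.0))  # -18.0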

2.2.5.6. Reliability

The consistency of directional error was calculated as Reliability (R). Reliability

was calculated through the use of circular statistics, in which highly consistent

performance approaches a value of one, and inconsistent performance

approaches zero. The actual value is the length of the resultant vector of all the

repetitions within a condition for each subject. This measure is preferred to

Variable Error in situations where errors of, for example, -179 degrees and +179

degrees can occur in the same condition, producing very high VE values even

though they differ by only two degrees. Reaction time was measured using data

from the head-mounted sensor.
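The mean resultant length from circular statistics can be computed as sketched below; this is the generic formula consistent with the description above, not code taken from the thesis.

    import math

    def reliability(errors_deg):
        """Mean resultant length of a set of directional errors (degrees):
        close to 1.0 for highly consistent errors, close to 0.0 for errors
        spread around the circle."""
        xs = [math.cos(math.radians(e)) for e in errors_deg]
        ys = [math.sin(math.radians(e)) for e in errors_deg]
        return math.hypot(sum(xs) / len(xs), sum(ys) / len(ys))

    print(round(reliability([-179, 179, 180, -178]), 3))  # ~1.0 (consistent)
    print(round(reliability([0, 90, 180, 270]), 3))       # 0.0 (inconsistent)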

Figure 2.3. Directional error calculation illustrated for a “Back” target. A and B depict two possible head orientations relative to the target shown by the loudspeaker (L).


2.2.6. Procedure

The participant was instructed in the task and, at the beginning of each trial,

stood facing a reference mark at eye level in the front-facing initial position. The

stimuli were presented in a random order by the laptop computer. Participants

turned to face the sound source as quickly and accurately as possible and

remained still in the Location condition, whereas participants turned and faced

the location indicated by the word stimulus in the Word condition. The order of

the Word and Location conditions was counterbalanced across participants.

Participants were instructed to face squarely the location appropriate for each

condition. Participants had to turn their whole

body towards that location. Special instructions were given for cases where the

participant judged the relevant position to be straight ahead. Since a movement

of some type was required in order for reaction time to be measured, the

participant was asked to nod his or her head if he or she judged that the sound

source was straight ahead. Participants could make corrective movements within

the 2.5s sampling period. The 3D position and orientation of the participant’s

head was recorded throughout each trial. This position was referenced to the

straight ahead (Front) position. Participants were given a short break (three

minutes) between blocks of trials. A set of five practice trials was presented first.

Blocks of localisation trials followed, each composed of 24 randomly ordered presentations covering every combination of stimulus and loudspeaker location.

2.2.7. Analysis and statistics

The “Yes” and tone stimuli in the Location condition were first compared with

the other stimuli in a 6 × 4 repeated measures analysis of variance. Since there


were “Yes” and tone stimuli only in the Location condition, the main analyses

were then conducted using only the four semantic spatial stimuli in a 2 (location

vs. word instruction) × 4 (stimulus location) × 4 (stimulus meaning) repeated

measures ANOVA. In addition to RT and Directional Error, the following

dependent variables were obtained: the proportion of clockwise and anti-

clockwise movements, the initial direction of movement, peak rotational velocity

and movement time. Planned comparison analyses were performed for specific

pairwise comparisons, and Fisher’s post-hoc test was used to further examine

significant effects from the ANOVAs. In all figures, error bars represent 95%

confidence intervals.

2.3. Results

2.3.1. Reaction Time (RT)

The first analysis compared, in the Location condition, the two “control” stimuli

(“Yes” and Tone), with the four stimuli that have semantic spatial content

(“Left”, “Right”, “Front” and “Back”). The word “Yes” acts as a control for

semantic spatial content, since it is verbal but its meaning does not refer to any

location. The tone stimulus has no semantic content of any kind. The reaction

times (RTs) for these two control stimuli did not differ significantly from the

spatial semantic stimuli, F(5,45)=.45, p=.81, nor was there any interaction

between Stimulus and Location, F(15,135)=.73, p=.74. These data are shown in

Figure 2.4. Subjects took longer to start their orientation movements when

instructed to use the location of the stimulus, rather than its meaning. The overall

mean reaction time for the Location condition (619 ± 135 ms) was significantly longer than for the Word condition (539 ± 133 ms), F(1,9)=5.33, p<.05. This is


shown in Figure 2.5, which also depicts the RT for each stimulus (the word "Left", "Right", "Front" and "Back").

Figure 2.4 Reaction time for each stimulus in the Location condition. (Points represent mean values; error bars in all figures represent 95% confidence intervals.) Values for each location (LOC Left, LOC Right, LOC Front, LOC Back) are joined by lines.

Figure 2.5 Reaction Time for each stimulus in the Location and Word conditions.

The "Front" stimulus in the Word condition had significantly longer RTs than the

other three stimuli, F(2,27)=31.47, p<.00001. In addition, post-hoc analysis

showed that reaction times for each location were similar in the Word condition,

whereas, in the Location condition, RTs for front and back positions were both significantly longer than for left and right positions (p<.01). These values are shown in Figure 2.6. A specific

comparison was made of RTs for congruent and incongruent conditions as a test

of Hypothesis 1a. This stated that “Reaction time will be longer when semantic

spatial cues and auditory location cues contained in a stimulus are incongruent

rather than neutral or congruent”. This involved comparing RTs for the word

“Left” delivered from the left (or “right” from the right), i.e. congruent

conditions, and those for the word “Left” delivered from the right (or “right”

from the left), i.e. incongruent conditions.


Figure 2.6 Reaction time for each location (Left, Right, Front and Back) in the Location and Word conditions.

These RTs are shown in Figure 2.7. There was a strong effect of congruence in the Word condition, with RTs for congruent pairings about 76 ms shorter than for incongruent

conditions. This pattern of RT data confirmed hypothesis 1a, but only for the

Word condition. This was the only “pure” analysis of the spatial Stroop effect in

this experiment, as any comparisons involving Front and Back locations are,

potentially, affected by Front-Back confusions and, for the Front locations, the

unique requirement of nodding the head rather than rotating.


Figure 2.7 Reaction Time for congruent and incongruent stimulus-location pairs ("Left" from the left and right side, and "Right" from the left and right side) in the Location and Word conditions.

For the Location condition only, and for just the Left and Right positions, it was

possible to compare the congruent and incongruent RT values with those for the

neutral stimuli (“Yes” and Tone). The results were very similar for comparisons

with both the "Yes" and Tone stimuli. RTs for the neutral stimuli were longer than for either congruent or incongruent pairings (averaged over Left and Right locations): Congruent 427 ms, Incongruent 432 ms, "Yes" 473 ms, Tone 476 ms. (These

values can be obtained from the appropriate parts of Figure 2.4). To examine

whether there was a difference between congruent, incongruent and neutral

stimuli, a separate 3 (congruence) x 2 (stimulus location: left or right) ANOVA

was run, and a planned comparison of the neutral stimulus against congruent and

incongruent values was calculated. This analysis was undertaken twice, once

using the “Yes” stimulus, and once using the Tone stimulus as the neutral


condition. Comparisons involving the “Yes” stimulus showed no main effect of

congruence (F (2,18)=2.13, p=0.15), nor a significant planned comparison effect.

(F (1,9)=2.60, p=.14). Comparisons with the Tone stimulus showed that it had a

significantly longer RT (main effect of congruence: F (2,18)=5.24, p<.05;

planned comparison, F (1,9)= 6.70, p<.05).

2.3.2. Initial Rotation Direction.

In Figure 2.8, it can be seen that subjects almost always rotated anticlockwise for

Left locations and clockwise for Right locations in the Location condition, while

in the Word condition, they usually rotated anticlockwise for the “Left” stimulus

and clockwise for the “Right” stimulus. In both conditions, clockwise and

anticlockwise directions were chosen about equally often for Front and Back

locations (Location condition), and “Front” and “Back” stimuli (Word condition).

These results, overall, show that subjects generally rotated in the direction

required by the instructions they had been given. However, there was clear evidence that, while subjects in the Location condition were not influenced by word meaning, subjects in the Word condition were influenced by sound location. This is clear in

Figure 2.8 (right panel). Regardless of the stimulus, subjects were significantly

more likely to turn in an anticlockwise direction if the location of the stimulus

was on the left than if it was on the right, planned comparison: F(1,9) = 6.08,

p<.05.


Figure 2.8 Initial Rotation Direction in the Location and Word conditions. (0 and 1 indicate 100% frequency of anti-clockwise and clockwise movements, respectively.)

2.3.3. Rotation Direction at Peak Velocity

The patterns of Rotation Direction at Peak Velocity are similar to those for IRD

with one important difference (Figure 2.9). In the Word condition, the difference

between Left and Right locations described above for IRD is much smaller for

the “Front” and “Back” stimuli, and not present at all for the “Left” and “Right”

stimuli. The Stroop interference shown for IRD in the Word condition is much

reduced or absent by the time of peak velocity.


Figure 2.9 Rotation Direction at Peak Velocity in the Location and Word conditions.

2.3.4. Movement Time

No significant difference for movement time between the stimuli was found in

the Location condition, while movement time for the “Back” stimulus was

significantly longer than any of the other stimuli in the Word condition, F(3,

27)=11.79, p<.00005, Figure 2.10. Movement time did not differ significantly

either between conditions (Word vs. Location) or between the four locations

(Figure 2.11).

Figure 2.10 Movement Time for each stimulus in the Word and Location conditions.

Figure 2.11 Movement Time for each location in the Word and Location conditions.


2.3.5. Front-Back Reversals

Tables 2.1 and 2.2 show the number and percentage of front-back reversals. The

only factor affecting these errors was the location of the stimulus, with the errors

occurring overwhelmingly for the back positions, i.e. back-front reversals. This

pattern confirms hypothesis 2b, which predicted more front-back than left-right

reversals. However, the hypothesis was confirmed only for the Location

condition.

Table 2.1. Frequency of front-back reversals in the Location condition (number and percentage of trials in each combination of location and stimulus).

Stimulus      "Left"                       "Right"
Location      Left  Right  Front  Back     Left  Right  Front  Back
Number        0     0      1      25       0     1      2      30
%             0     0      2      50       0     2      4      60

Stimulus      "Front"                      "Back"
Location      Left  Right  Front  Back     Left  Right  Front  Back
Number        0     0      1      34       0     0      3      30
%             0     0      2      68       0     0      6      60

Stimulus      "Yes"                        Tone
Location      Left  Right  Front  Back     Left  Right  Front  Back
Number        0     0      0      31       0     0      0      27
%             0     0      0      62       0     0      0      54

Table 2.2 Frequency of front-back reversals in the Word condition (number and percentage of trials in each combination of location and stimulus).

Stimulus      "Left"                       "Right"
Location      Left  Right  Front  Back     Left  Right  Front  Back
Number        0     0      0      0        0     0      0      0
%             0     0      0      0        0     0      0      0

Stimulus      "Front"                      "Back"
Location      Left  Right  Front  Back     Left  Right  Front  Back
Number        0     0      0      0        2     0      0      2
%             0     0      0      0        4     0      0      4


2.3.6. Constant Error

The directional error at the end of each movement was assessed through Constant

Error (CE). Front-back reversal errors were corrected according to the method of

Oldfield (1986). This involved “mirroring” the final position around the

participants’ inter-aural axis, and using the new value if this was smaller than the

original value. In effect, this procedure removes the front back reversal

component of the error. Overall, there were no systematic effects of conditions

on CE. Figure 2.12 shows values for Constant Error for each combination of condition, location, and stimulus. However, a wide range of errors was found for each of the back locations in the Location condition, as seen in the large error bars in Figure 2.12. Note that in the Word condition, CE was very close to zero and quite consistent, except for the "Right" stimulus, which had slightly larger errors.
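A hedged sketch of this mirroring correction follows (one plausible reading of the rule, with illustrative names; angles are measured clockwise from Front = 0º, so mirroring about the interaural axis maps an angle to 180º minus that angle):

    def corrected_error(faced_deg, target_deg):
        """Mirror the faced direction about the interaural axis and keep whichever
        version gives the smaller absolute directional error, which removes the
        front-back reversal component of the error."""
        def err(angle_deg):
            return (angle_deg - target_deg + 180.0) % 360.0 - 180.0
        mirrored = (180.0 - faced_deg) % 360.0
        e_orig, e_mirr = err(faced_deg), err(mirrored)
        return e_mirr if abs(e_mirr) < abs(e_orig) else e_orig

    # Example: a response at 175 deg to a Front (0 deg) target has an error of 175 deg,
    # but its mirror image (5 deg) gives an error of +5 deg, which is used instead.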

Figure 2.12 Constant Error for each stimulus and each location in the Location and Word conditions.


Even though CE was lower for congruent than for incongruent combinations in the Location condition (Left and Right stimuli and locations only), the difference was not significant, planned comparison F(1,9)=1.24, p=.30. Therefore,

Hypothesis 1b, which stated that errors would be lower for congruent conditions,

was not confirmed. (Values for conditions including the “Yes” and “Tone”

stimuli are shown in Appendix H and I. These confirm that accuracy did not

differ between the spatial and control stimuli).

2.3.7. Reliability

These values of Reliability (R) are shown in Figure 2.13. The Back location had

lower reliability (i.e. more trial-to-trial variability) than the other stimuli, but

only in the Location condition. This is shown in Figure 2.13, but could not be

formally analysed because of the extreme variations in variances for the four

stimuli.

Figure 2.13 Reliability for each location in the Location and Word conditions.


2.4. Discussion

The actual findings with regard to the spatial Stroop effect are explained below

by considering the results of each dependent variable, presented in chronological

order. Then the effect of Front-Back confusion on auditory localisation is discussed in relation to several of the dependent variables in this experiment, even though Front-Back confusion is not directly related to the aim of this study. It should be noted that it

was not possible to present the Stroop data for RT as difference scores relative to

a neutral condition. This was because no neutral condition could be used in the

Word condition, and because, in the Location condition, any comparisons

involving Front and Back locations are potentially affected by the occurrence of

Front-Back confusions, and by the special case of a nodding movement for the

Front location. Thus absolute RT values only were examined.

2.4.1. Spatial Stroop Effect

This study investigated the effects of the spatial semantic cues for a localisation

task by testing for the presence of a spatial Stroop effect. The hypotheses were

also based on previous accounts of the spatial Stroop effects. It was assumed that

spatial semantic cues reduce reaction time and increase the accuracy of the response if the relevant and irrelevant stimulus dimensions are congruent (e.g., the stimulus "Left" uttered from the left location).

2.4.1.1. Reaction Time and Initial Rotation Direction

The Spatial Stroop effect was apparent in the Left and Right dimension of the

orienting movement in the Word condition. Reaction time for the congruent

condition (e.g., “Left” from the left location) of an orienting movement to the


word instructions was significantly faster than in incongruent conditions (e.g.,

“Left” from the right location). In the Location condition, however, the spatial

semantic cues did not influence the RTs at all. In the Word condition, the

orienting movement for word instructions was influenced by the location

dimension, but the semantic stimulus dimension did not affect the Location

condition. The spatial Stroop effect therefore was asymmetrical, occurring only

when subjects were attempting to use only the semantic content of the stimulus

word, and only in some conditions. The reason why the spatial semantic cues did

not affect the Front and Back location is most likely that front-back reversal

errors occurred more frequently in the Location condition, causing RTs to be

longer. In addition, the Front movement in this study was a nod, i.e. a different

movement from the other locations. This may have affected the RT for this

location only. RT for the front movement was significantly longer than the other

locations. This pattern of results suggests that auditory localisation may influence

semantic processing, but not the other way round.

Neutral stimuli had longer RTs than either congruent or incongruent stimuli, by

about 45 ms, when comparing just the left and right locations (which were not

affected by the unique case of the nodding movement). While the “Yes”

stimulus was not significantly longer, values for Tone were. This outcome is not

consistent with either clear-cut interference or facilitation effects, as the

incongruent values might be expected to be longer than neutral values. It may be

that the relatively low frequency of these neutral stimuli compared to semantic

spatial stimuli (1/6 of the trials in each case) caused participants to respond more


slowly simply because they were unusual. A proper assessment of the neutral

stimuli would require that they are delivered with equal probability.

The initial rotation direction (IRD) for all stimuli in the Word condition, but most

clearly that for the "Back" stimulus, also showed a strong influence of location processing (i.e. processing of the irrelevant stimulus dimension).

The mean IRD expresses the direction in which subjects tended to begin their orienting movement, either clockwise or anti-clockwise, at the start of the

movement. Subjects more often started to turn anti-clockwise when the stimulus

was presented from the Left location, and more often clockwise when it was

from the Right location. The subject’s initial direction seemed to be affected by

auditory localisation processing in the Word condition. A possible cause for these

results is that subjects cannot easily prevent themselves starting to respond to the

stimulus as a sound. If auditory localisation processing commences faster than

spatial semantic processing, and it would influence response selection process

strongly. Therefore, the interference of the location content in the Word

condition affected RTs and IRD because they are in the initial part of the

orienting movement. As the other possible cause of interference, an orienting

movement can be assumed to be a natural consequence of sound localisation,

namely, there is an innate link between the auditory system and orienting to the

source of a sound regardless of the task (e.g., the Location or Word condition).

2.4.1.2. Rotation Direction at Peak Velocity, Movement Time and Constant Error

There was no difference between the congruent and incongruent conditions in Experiment One for any of these measures. In addition, no difference between


the congruent, incongruent, and neutral ("Yes" and Tone) stimuli was found in the Location

condition. Both auditory localisation and spatial semantic processing would

generally be complete before the end of response selection and the subsequent

orienting movement. Previous studies that investigated the Simon and Stroop

effects used key-pressing or joystick estimation as the response. These responses

do not include adjustment of the final location, pressing a correct key or making

a discrete movement of joystick does not include continuous feedback processing

or corrections. The Simon and Stroop-like effects would not affect the whole

response selection but simply the first part of the process. Because the current

task involved a movement lasting approximately 900-1400ms, there was plenty

of opportunity for modification and correction of the movement. As compared

with IRD, which showed interference by sound location in the Word condition,

RDPV showed no such effect. This suggests that interference effects were

present only early on, and by the time of peak movement velocity, only the

relevant stimulus was being used.

The findings of this study contrast strongly with the one previous study which

used a similar experimental design (Palef & Nickerson, 1978). Palef &

Nickerson (1978) reported spatial Stroop effects in the Location but not the Word

condition, whereas in the current study, they were found in the Word condition

but not the Location condition. It is not clear what caused these different results in the two studies. One possible cause is that the response to the

stimulus in the current study was an orienting movement (turning the whole

body). As previously stated, an orienting movement may be more natural and

have a faster RT than would a keypressing response. The keypressing would be


slightly more complicated because subjects would have to decide which button to press using additional cognitive rules. The difference

between the present results and those of Palef and Nickerson may also be

understood by examining the time course of processing for the semantic and

location aspects of the stimuli. This is explored more fully in the General

Discussion. This current experiment provides new information about the time

course of the spatial Stroop effect. It was evident only in the RT and initial

rotation direction, and had disappeared by the time of peak rotation velocity. This

suggests that spatial Stroop effects can be corrected after just a few hundred

milliseconds.

2.4.2. Front Back Confusions

2.4.2.1. Reaction time

In this experiment, reaction time for left and right locations was significantly

faster than for front and back locations in the Location condition. There are well

accepted explanations for front-back reversals from previous studies. First, a

large number of studies (Abel, Figueiredo, Consoli, Birt, & Papsin, 2002;

Begault & Wenzel, 1993; Best, Carlile, Jin, & van Schaik, 2005; Burke, Letsos,

& Butler, 1994; Middlebrooks, 1992; Middlebrooks & Green, 1991;

Middlebrooks, Makous, & Green, 1989; Paulus, 2003; Phinney & Nummedal,

1979) showed that front-back reversals and confusions were accounted for by the "cone of confusion", which refers to a cone-shaped zone in which subjects often

mistake front locations for back, and vice versa. These studies suggested that the

cone of confusion is based on Interaural Time difference (ITD) and Interaural

Level Difference (ILD). As described earlier, interaural differences


are smallest when the sound comes from the midline (i.e. immediately in front or

behind - on the mid-sagittal axis). Wallace and Fisher (1998) indicated in their

sound localisation study that response time for keypresses increased when sounds

were presented from speakers symmetrically located in front of and behind the

participant. Moreover, they indicated that front-back confusion made response

times longer, regardless of whether the speakers were positioned on the medial or

lateral axis. These findings are very similar to those in the current experiment.

2.4.2.2. Constant Error and Reliability

The expectation of this study was that directional accuracy would be lower for

front and back than for left and right locations. A large number of front-back

reversal errors were found in this experiment. Constant Error was calculated by

correcting for front-back reversal errors. The reliability measure indicated high

levels of variability, however. A large range for Reliability was found in the

Location condition. Gilkey & Anderson (1995) reported front-back reversals in a

sound localisation task using spoken word stimuli. In their study, localisation

accuracy for front and back locations was lower than for Left and Right locations.

A most interesting outcome in the current study was that the rate of front-back

confusion was different from previous studies: 2% front-to-back reversals and 59% back-to-front reversals. Begault and Wenzel (1993), however, reported that the percentage of reversals from front to back was significantly higher than from back to front (47% vs. 11%), using speech stimuli from eight locations with an oral response (for midline stimuli only (0 and 180 degrees), front-back reversals were 58% and back-front reversals 24%). In addition, Wightman and Kistler (1989b) reported 29% front-back reversals and 6% back-front reversals at the middle elevations. A possible reason for this


difference is the method of presentation of the stimuli. They presented stimuli

over headphones based on filtering by head-related transfer functions (HRTFs),

methods which have recently been developed to synthesize a virtual sound

source. The current study may therefore point to a difference between studies based on HRTFs and free-field sound localisation studies. The method of presentation should be considered in future studies investigating front-back confusion.

However, it is also possible that the acoustic properties of the testing room in the

present study were asymmetric, and if sounds from behind echoed from the front

more often than the reverse, then this could explain the findings.


Chapter 3 Experiment Two

3.1 Introduction

In order to obtain a separate estimate of the influence of semantic information, a

second experiment was undertaken. In previous studies, the incongruent

condition was very obviously incongruent, for example, the word “Left” was

presented from Right, Front and Back positions, similar to Experiment One in

this study. In such cases, there is a 90º or 180º difference between the two

dimensions of the stimulus. It could be that increased RTs for an incongruent

condition (as was found for the Word condition in Experiment One, for left and

right stimuli) only occur when the incongruence is very obvious. If the

difference between the sound source location and the semantic spatial content is

much less clear-cut, and an advantage for congruence is still found, this would be

strong support for the idea that semantic content is obligatorily processed as part

of response selection. Therefore, this experiment examined performance when

the degree of incongruence was much smaller than in Experiment One.

3.1.1. Hypothesis

The auditory localisation process uses semantic spatial localisation cues in

addition to auditory localisation cues.

Two predictions of this hypothesis that were tested are:

1. Reaction time will be longer when the sound source is rotated from the

front, back, left and right positions by 20 degrees to the left (anti-

clockwise, from above).

2. Directional estimates will be biased towards the position indicated by

the semantic cues more under rotated than under non-rotated conditions.


3.2 Methods

Apparatus and stimuli were identical to, and the general procedure was similar to, those of Experiment 1. However, a new group of participants aged 18-23 years (5 women, 5

men) was recruited.

3.2.1. Experimental environment

Participants performed ten repetitions of the same task as in the first experiment.

The major differences were that:

• Loudspeakers were placed in four positions as in Experiment 1, but

there were two conditions: non-rotated and rotated (anticlockwise by

20 degrees, Figure 3.1).

• Only the four congruent stimuli (Front, Back, Left and Right) were

used. (Note: although these are described as ‘congruent’ in this case,

in the sense that, for example, a “Front” word stimulus was always

presented from a position closer to the actual front than to the left,

right or back positions, there is still a form of incongruence when the

word meaning denotes a position 20º rotated from the location of the

sound source).

3.2.2. Design

The experiment had a similar design to Experiment 1; however, it was a 2 (rotation) × 4 (position) design. The independent variables were rotation (non-rotated and rotated positions) and directional position (Front, Back, Left and Right). The dependent variables were the same as in Experiment 1.


Figure 3.1 Layout of apparatus for Experiment Two. Speakers were positioned 20 degrees anti-clockwise in the Rotated condition.

3.2.3. Procedure

The procedure of Experiment Two was identical to Experiment One. The rotated

and non-rotated conditions were randomised within the separate Location and

Word conditions which were counterbalanced.

3.2.4. Analysis and statistics

The same procedures for identifying RT and directional error were used as in

Experiment 1. Since there were no tone or “Yes” stimuli, all analyses used a 2 ×

4 × 4 repeated measures ANOVA, with data averaged over repetitions. Planned

comparison analyses were performed for specific pairwise comparisons, and

Fisher’s post-hoc test was used to further examine significant effects from the

ANOVAs.



3.3. Results

3.3.1. Reaction Time

Reaction time for the Word condition (469 ± 92 ms) was significantly shorter (by 146 ms) than for the Location condition (615 ± 174 ms), F(1,9)=16.76, p<.005 (Figure

3.2). Hypothesis 1a predicted that RTs would be longer under rotated than non-

rotated conditions. This hypothesis was not confirmed, however. In fact, for the

Front location in the Location condition, the reverse was true (planned

comparison F(1,9)=22.74, p<.005). There was hardly any effect of rotation in the

Word condition. In addition, the RTs for the stimuli ”Left” and “Right” were

significantly faster than for the stimuli "Front" and "Back" in both the Location, F(1,9)=32.77, p<.0005, and Word conditions, F(1,9)=28.09, p<.0005.

Figure 3.2 Reaction Time for the four stimuli in the Word condition and for the four locations in the Location condition (STIM-LOC denotes stimulus and location, which are always congruent in Experiment Two).


3.3.2. Initial Rotation Direction

Initial rotation direction for the “Front” stimulus in the rotated condition was

significantly more often anticlockwise than for the non-rotated condition

(planned comparison, F(1,9)=29.45, p<.0005) and the “Back” stimulus in the

rotated condition was significantly more often clockwise than the non-rotated

condition in the Location condition, planned comparison F(1,9)=16.58, p<.005,

Figure 3.3. This effect was not found in the Word condition.

Figure 3.3 Initial Rotation Direction for each stimulus in the Non-Rotated and Rotated conditions.

3.3.3. Movement Time

Figure 3.4 depicts movement time for each stimulus or location in each condition.

Movement time for the Back location in the Location condition was significantly

longer than the other locations, F(3, 27)=3.91, p<.05. In addition, Movement

time for the “Back” stimulus in the Word Condition was significantly longer than

the other stimuli, F(3, 27)=9.24, p<.0005.


Figure 3.4 Movement Time for each stimulus or location in both conditions.

3.3.4. Front-Back Reversals

Front-back reversal errors were also found for the back location in Experiment Two (Table 3.1); however, they were not as frequent as

in Experiment One.

3.3.5. Constant Error

As in Experiment One, front-back reversals were corrected using the method of

Oldfield & Parker (1986) for the analysis of Constant Error. Constant error for

the Rotation condition was significantly higher than for the Non-Rotation

condition, F(3, 27)=3.04, p<.05 (Figure 3.5).

Table 3.1 Frequency of front-back reversals in the Non-rotated and Rotated conditions (number and percentage of trials in each combination of location and condition).

CONDITION     Non-Rotation                 Rotation
LOCATION      Left  Right  Front  Back     Left  Right  Front  Back
Frequency     1     0      3      21       0     0      0      11
%             1     0      3      21       0     0      0      11


Figure 3.5 Constant Error for each stimulus in both the Location and Word conditions.

The main results for CE are shown in Figures 3.5, 3.6 and 3.7. In the second two

figures, results are shown in the form of arrows, pointing in the

direction of the average CE for each condition. The solid arrows in Figure 3.6

show that the non-rotated condition was reasonably accurate (although subjects

tended to turn about 7 degrees clockwise from the stimulus in the Back condition).

When the rotated condition is compared to the non-rotated condition, it can be

seen that, for Front and Back locations, subjects rotated by close to the 20 degree

offset, in the appropriate direction. The rotation in the Left and Right conditions,

however, was much smaller (approximately 5 degrees). The interaction between

rotation and location was significant, F(1,9)=5.23, p<.05.


Figure 3.6 Constant Error for Non-rotation and Rotation condition in the Location condition

Figure 3.7 Constant Error for Non-rotation and Rotation condition in the Word condition



Figure 3.7 (Word Condition) shows that there was little if any effect of rotation.

Subjects pointed quite accurately to the position indicated by the stimulus, i.e. to

the Left (270 degrees), Right (90 degrees), Front (0 degrees) and Back (180 degrees).

3.3.6. Reliability

Although extreme differences in variance make a formal statistical comparison

less reliable, Reliability for the Location condition was significantly lower than for the Word condition, F(1,9)=9.51, p<.05. Moreover, Reliability for the Back location was significantly lower than for the other locations in the Location condition (Fisher's LSD post-hoc analyses: vs. the Left location p<.001, vs. the Right location p<.001, vs. the Front location p<.005), whereas Reliability was little changed in the

Word condition (Figure 3.8).

Figure 3.8 Reliability of Constant Error in the Location and Word conditions.


Even when the statistical analysis is disregarded, the extremely consistent

performance in the Word condition, and the more variable performance in the

Location condition for the Back stimulus and location, are very apparent.

3.4. Discussion

In Experiment Two, the key comparison was between non-rotated and rotated

conditions. Unlike Experiment One, there were no cases in which the location

indicated by the stimulus word was completely incongruent with the sound

location. For example, the stimulus "Front" was presented from the non-rotated location (straight ahead) and from the rotated location 20 degrees anti-clockwise of it. Likewise, the rotated "Left" was not exactly left but still on the left

side as compared with the other locations. The rotated condition was not

absolutely incongruent, especially with respect to the direction of movement. In

this experiment, in neither the Word nor the Location condition was there any

evidence of a spatial Stroop effect on RT, IRD, RDPV, or MT. A possible reason

for this is that the difference between the positions indicated by the relevant and

the irrelevant dimensions was too small to cause interference. The locations

where the rotated stimuli were presented in both conditions were on the same

side as of the non-rotated condition, (except Front). At one level, the word’s

spatial meaning never really clashed with the actual sound location, while at a

more detailed level, the two locations were in fact slightly different. The two

earliest indications of performance, RT and IRD, might be expected to show

identical effects, but there were actually some differences between the two

measures. There was no difference in RT between the rotated and non-rotated

conditions, in either the Word or the Location condition. On the other hand, for the


Front and Back stimuli, there was an extremely clear effect of rotating the

stimulus anti-clockwise. Subjects' initial rotations were drawn towards the rotated source position (more often anti-clockwise for the Front location and more often clockwise for the Back location). This shows they were influenced by the

actual location at an early stage. In the Word condition, however, the IRD did not

change. Subjects could ignore location when instructed to do so in this

experiment even for the initial direction, because the directions were not

incongruent.

Similarly, RTs for left and right locations were significantly shorter than for front and back locations, and there were also some front-back reversals for front and

back locations. In addition, this experiment’s data also supports a finding (Abel

& Paik, 2004) that rotated front and rotated back locations had significantly

longer reaction times than both rotated and non-rotated left and right locations.

Compared to these early variables, CE for the Location condition clearly shows a difference between the rotated and non-rotated conditions. In all four locations in the Location condition, subjects ended up facing a more anti-clockwise position in the rotated conditions. By contrast, they were completely unaffected by rotation in the Word condition. In addition to this

rotation, the CE for rotated Left and Right locations in the Location condition

was biased towards the non-rotated location, i.e. subjects judged the location

towards the position indicated by the word meaning. However, previous sound

localisation studies have reported that judgments of left and right locations tend to be drawn towards the interaural axis. Begault and Wenzel (1993) reported that five

subjects out of ten showed a biased response in which they tended to “pull”


toward the interaural lateral axis. This pattern was also observed in a free-field

study (Oldfield & Parker, 1984). Therefore, it was not clear whether, in the current study, this bias towards the word location represented a Stroop interference effect

(i.e. an interference of the word meaning) or whether it was a simple bias

towards the interaural lateral axis that is independent of the stimulus properties.


Chapter 4. GENERAL DISCUSSION

The aim of this study was to investigate the effects of spatial semantic

information on sound localisation using orienting movements. In order to

examine this, it was hypothesised that reaction time (RT) for, and constant error (CE) of, location judgements would be affected by spatial semantic information if

this information and the actual sound location are incongruent, in other words,

that a spatial Stroop effect would be present in these cases. In the actual

experiments, however, it was also possible to measure several additional

variables: Initial Rotation Direction (IRD), Rotation Direction at Peak Velocity

(RDPV), Movement Time (MT), and Reliability of constant error (R). In this

chapter, the discussion presents a possible explanation of the results for both the

original hypotheses and the additional variables. Finally, additional issues,

limitations of this study and possible future studies are mentioned.

4.1. Overview of Results

The main results of this study are summarised in Table 4.1 to assist the

discussion. This table indicates the main phenomenon under investigation, the

Spatial Stroop effect, as well as other phenomena such as Front-back reversal

and rotation effects. The table shows evidence for the Spatial Stroop effect in

Experiment One, but only for RT, and IRD, and only in the Word condition.

There is much weaker evidence for this effect in Experiment Two. In the two

experiments, there was evidence for Front-Back reversals in several variables.

The discussion will put forward an account of the overall pattern of results, and

will address the specific hypotheses where appropriate. As a tool to help explain

the results, a model of processing involved in the experimental tasks is presented


next. This is shown in Figure 4.1. The model contains a simplified view of the

major processes, arranged on a timeline starting before the stimulus, and ending

when the movement has been completed. The model applies to both experiments.

Table 4.1 Effects for each variable. (S denotes Spatial Stroop effect, FBR denotes Front-Back Reversal, S? denotes a possible Spatial Stroop effect, X denotes no effect. L & R indicates Left and Right stimulus locations only.)

Condition                     RT          IRD       RDPV   MT   CE           R
EXPERIMENT 1   LOCATION       FBR         X         X      X    X            FBR
               WORD           S (L & R)   S (ALL)   X      X    X            X
EXPERIMENT 2   LOCATION       FBR         X         X      X    S? (L & R)   FBR
               WORD           X           X         X      X    X            X

4.2. Auditory Information Processing

Figure 4.1 shows a framework of the various processes that are assumed to occur

in the task required in these experiments. Numbers indicate a time scale (ms),

with a word stimulus presented at 0 ms. Before a stimulus is presented, the

subject already has a response ‘set’, that is, the instruction for that condition is to

use one (relevant) dimension of the stimulus and ignore the other (irrelevant) dimension; for example, use location and ignore meaning. Two types of

information processing are started when the stimulus reaches a subject’s ears.

Auditory localisation processing indicates recognition of the stimulus location

without any reference to the meaning of the stimulus word. This processing

would commence first as it depends only on the presence of a sound and not its

specific content. Spatial semantic processing would proceed slightly later,


because it must decode the meaning of the word stimulus. This requires both a

longer time to sample the content of the sound, and a process to match the sound

with its memory representation. The output of these processes would then

influence the response selection stage, which would determine both the initial

movement and the final position. In all cases and both Experiments, the stimulus

lasted 500ms. In many cases, subjects had RTs less than 500ms.

Figure 4.1. Information processing model for the orienting movement (time values are indicative only). The figure places, on a 0-2000 ms timeline, the instruction and the stimulus, followed by parallel Auditory Localisation Processing and Spatial Semantic Processing feeding Response Selection and the Orienting Movement, with RT, IRD, RDPV, MT and CE marked along this time course, together with the approximate ranges for the start and end of movement.


This suggests that the auditory localisation and/or the semantic spatial processing

can be completed in much less than 500ms, because the response selection stage

would also require some time. However, because the orienting movement can

last up to a second or more, there is still time for these processes to continue and,

possibly, modify the ongoing movement. For example, an initial movement

could be in the wrong direction and need to be corrected, or the final

position might be decided only during the movement. The boxes in Figure 4.1

represent these processes and are therefore shown as having an unknown and

variable duration. The end result of these hypothesized processes is the execution

of an orienting movement.

The time difference between auditory localisation processing and spatial

semantic processing would make it probable that auditory localisation could

influence the initial response selection for the orienting movement, while spatial

semantic processing would influence it later on. This is in contrast to the study of

Palef and Nickerson (1978), which indicated that word processing was

undertaken faster than location processing. In the current study, however, the results showed that location processing occurred faster than word processing.

In fact, when only the Left and Right locations of the Palef and Nickerson study

are examined, their results are not so different from the current study’s outcome.

The fact that their RTs are at least 140ms slower, on average, than those for the

comparable conditions in the current study’s Experiment One, however, lends

weight to the argument that their task was more complex and required a more

cognitively demanding form of response selection. Therefore, the apparent

differences between the two studies can be explained if, in the current study, the


very direct and rapid form of response selection allowed the participant to begin

moving in response to the sound location before completing the processing of the

semantic content of the stimulus.

4.3. Limitations of this Study and Future Research

The present study used a whole body orienting movement as the method of

indicating the localisation judgment. However, there was no measure of gaze

position. It is possible that the head orientation and gaze position were not

completely aligned, and that slightly different outcomes would have been

obtained for actual line of sight. In addition, subjects did not receive any visual

feedback about their head orientation. Lewald, Dorrscheidt, & Ehrenstein (2000)

indicated sound localisation was more accurate when a visual indication of the

actual midline of the head was given to subjects, using a reference produced by a

laser pointer during the orienting movement. The judgment of orienting

movements may be more accurate if visual feedback for the judgment is

presented. In addition, nodding was a special case and not a natural response for

the front location. Nodding may take more time than the other movements and be

initiated more slowly.

In order to demonstrate whether spatial semantic information affected the

orienting movement in the Word condition, future experiments could use slightly

more rotated speaker positions. As described in the discussion of Experiment

Two, the difference between the relevant and the irrelevant dimensions (i.e. between the Non-rotated and Rotated conditions) may have been too small to produce interference from the spatial Stroop effect. Therefore the rotated speakers could be positioned at 45 degrees, that is,

for example, in between the Left and Front speakers. When the “Left” stimulus is


presented from this speaker in both the Location and Word conditions, the hypothesis would be that the orienting movement will be affected by spatial

semantic information.

4.4. Conclusions

The present study found that interference caused by the spatial Stroop effect was

evident in the RT and IRD for Left and Right locations in the Word condition of

Experiment One. However, this was not true for the Front and Back locations. It

was not found in later variables, and not at all in Experiment Two. These

findings allow three conclusions:

1. The spatial Stroop effect is asymmetric. The sound's actual location can affect the

use of semantic cues, but not the reverse.

2. The spatial Stroop effect is confined to the planning and early execution of the

movement, as it is seen in only RT and initial movement direction. This provides

new information about the time course of the spatial Stroop effect.

3. The spatial Stroop effect will not occur if the location indicated by the semantic cue and the actual sound location differ by up to twenty degrees, and are

therefore not clearly incongruent. This is true even though this rotation can

clearly be detected, evidenced by the altered final positions in Experiment Two’s

Location condition.

Another expected result was, however, confirmed, i.e. front-back reversals

occurred very frequently. More than half of the trials in the back condition were

judged as coming from the front. No obvious explanation could be found for the

fact that in this study, unlike others, more back-front than front-back errors were

made.


References

Abel, S. M., Figueiredo, J. C., Consoli, A., Birt, C. M., & Papsin, B. C. (2002). The effect of blindness on horizontal plane sound source identification. International Journal of Audiology, 41(5), 285-292.

Abel, S. M., & Paik, J. E. S. (2004). The benefit of practice for sound localization without sight. Applied Acoustics, 65(3), 229-241.

Abernethy, B. (1991). Visual search strategies and decision-making in sport. International Journal of Sport Psychology, 22, 189-210.

Begault, D. R. (1994). 3-D sound for virtual reality and multimedia. London: Academic Press Limited.

Begault, D. R., & Wenzel, E. M. (1993). Headphone localization of speech. Human Factors, 35(2), 361-376.

Best, V., Carlile, S., Jin, C., & van Schaik, A. (2005). The role of high frequencies in speech localization. Journal of the Acoustical Society of America, 118(1), 353-363.

Blauert, J. (1983). Spatial hearing: The psychophysics of human sound localization. Cambridge, MA: MIT Press.

Burke, K. A., Letsos, A., & Butler, R. A. (1994). Asymmetric performances in binaural localization of sound in space. Neuropsychologia, 32(11), 1409-1417.

Daus, A. T., Wilson, J., & Freeman, W. M. (1989). Predicting success in football. Journal of Sports Medicine and Physical Fitness, 29(2), 209-212.

Durlach, N. I., & Colburn, H. S. (1978). Binaural Phenomena. New York: Academic Press.

Gardner, J. D., & Sherman, A. (1995). Vision requirements in sport. Oxford: Butterworth-Heinemann.

Geierlich, H. W. (1992). The application of binaural technology. Applied Acoustics, 36, 219-243.

Genuit, K. (1984). A model for the description of outer-ear transmission characteristics. Unpublished doctoral dissertation, Rhenish-Westphalian Technical University.

Gilkey, R. H., & Anderson, T. R. (1995). The accuracy of absolute localization judgments for speech stimuli. Journal of Vestibular Research: Equilibrium & Orientation, 5(6), 487-497.

Helsen, W. F., & Sherman, A. (1999). A multidimensional approach to skilled perception and performance in sport. Applied Cognitive Psychology, 13, 1-27.

Henning, G. B. (1980). Some observations on the lateralization of complex waveforms. Journal of the Acoustical Society of America, 68(2), 446-454.

Klatzky, R. L., Lippa, Y., Loomis, J. M., & Golledge, R. G. (2002). Learning directions of objects specified by vision, spatial audition, or auditory spatial language. Learning & Memory, 9(6), 364-367.

Klatzky, R. L., Lippa, Y., Loomis, J. M., & Golledge, R. G. (2003). Encoding, learning, and spatial updating of multiple object locations specified by 3-D sound, spatial language, and vision. Experimental Brain Research, 149(1), 48-61.


Klump, R. B., & Eady, H. R. (1956). Some measurements of interaural time difference thresholds. Journal of the Acoustical Society of America, 28, 859-860.

Knudsen, E. I., Hasbroucq, T., & Osman, A. (1982). Visual instruction of the neural map of auditory space in the developing optic tectum. Science, 253, 85-87.

Kornblum, S., Hasbroucq, T., & Osman, A. (1990). Dimensional overlap: Cognitive basis for stimulus-response compatibility--a model and taxonomy. Psychological Review, 97(2), 253-270.

Lewald, J., Dorrscheidt, G. J., & Ehrenstein, W. H. (2000). Sound localization with eccentric head position. Behavioural Brain Research, 108(2), 105-125.

Loomis, J. M., Lippa, Y., Golledge, R. G., & Klatzky, R. L. (2002). Spatial updating of locations specified by 3-D sound and spatial language. Journal of Experimental Psychology: Learning, Memory, and Cognition, 28(2), 335-345.

Lu, C. H., & Proctor, R. W. (1995). The influence of irrelevant location information on performance: A review of the Simon and spatial Stroop effects. Psychonomic Bulletin & Review, 2(2), 174-207.

McMorris, T. (1997). Performance of soccer players on tests of field dependence/independence and soccer-specific decision-making tests. Perceptual and Motor Skills, 85(2), 467-476.

McNamara, T. P. (2005). Semantic priming: Perspectives from memory and word recognition. New York: Psychology Press.

Middlebrooks, J. C. (1992). Narrow-band sound localization related to external ear acoustics. Journal of the Acoustical Society of America, 92(5), 2607-2624.

Middlebrooks, J. C., & Green, D. M. (1991). Sound localization by human listeners. Annual Review of Psychology, 42, 135-159.

Middlebrooks, J. C., Makous, J. C., & Green, D. M. (1989). Directional sensitivity of sound-pressure levels in the human ear canal. Journal of the Acoustical Society of America, 86(1), 89-108.

Muller, B. S., & Bovet, P. (2002). Performance and reaction times in monaural localization of first names in the horizontal plane. Brain and Language, 82(1), 1-9.

Muller, H. M., & Kutas, M. (1996). What's in a name? Electrophysiological differences between spoken nouns, proper names and one's own name. Neuroreport, 8(1), 221-225.

Nakagawa, A. (1985). Present status and perspective of the study on decision making in ball games. Japanese Journal of Physical Education, 30(2), 105-115.

Neely, J. H. (1991). Semantic priming effects in visual word recognition: A selective review of current findings and theories. In D. Besner & G. W. Humphreys (Eds.), Basic processes in reading: Visual word recognition (pp. 264-336). Hillsdale, NJ: Erlbaum.

Nougier, V., & Rossi, B. (1999). The development of expertise in the orienting of attention. International Journal of Sport Psychology, 30(2), 246-260.


Oldfield, S. R., & Parker, S. P. (1984). Acuity of sound localisation: a topography of auditory space. I. Normal hearing conditions. Perception, 13(5), 581-600.

Oldfield, S. R., & Parker, S. P. (1986). Acuity of sound localisation: a topography of auditory space. III. Monaural hearing conditions. Perception, 15(1), 67-81.

Palef, S. R., & Nickerson, R. B. (1978). Representing auditory space. Perception & Psychophysics, 23(5), 445-450.

Paulus, E. (2003). [Sound localization cues of binaural hearing]. Laryngorhinootologie, 82(4), 240-248.

Phinney, J. S., & Nummedal, S. G. (1979). Effects of left-right orientation and position reversals on spatial perspective taking in young children. Perceptual and Motor Skills, 48(1), 223-227.

Seymour, P. H. (1974). Stroop interference with response, comparison, and encoding stages in a sentence-picture comparison task. Memory & Cognition, 2(1-A), 19-26.

Shor, R. E. (1970). The processing of conceptual information on spatial directions from pictorial and linguistic symbols. Acta Psychologica, 32(4), 346-365.

Simon, J. R. (1990). The effects of an irrelevant directional cue on human information processing. (Vol. 65). Oxford, England: North-Holland.

Simon, J. R., Craft, J. L., & Webster, J. B. (1973). Reactions toward the stimulus source: analysis of correct responses and errors over a five-day period. Journal of Experimental Psychology, 101(1), 175-178.

Simon, J. R., & Small, A. M., Jr. (1969). Processing auditory information: interference from an irrelevant cue. Journal of Applied Psychology, 53(5), 433-435.

Starkes, J. L. (1987). Skill in field hockey: The nature of the cognitive advantage. Journal of Sport Psychology, 9(2), 146-160.

Stroop, J. R. (1992). Studies of interference in serial verbal reactions. Journal of Experimental Psychology: General, 121(1), 15-23.

White, B. W. (1969). Interference in identifying attributes and attribute names. Perception & Psychophysics, 6(3), 166-168.

Wightman, F. L., & Kistler, D. J. (1989a). Headphone simulation of free-field listening. I: Stimulus synthesis. Journal of the Acoustical Society of America, 85(2), 858-867.

Wightman, F. L., & Kistler, D. J. (1989b). Headphone simulation of free-field listening. II: Psychophysical validation. Journal of the Acoustical Society of America, 85(2), 868-878.

Williams, A. M., Davids, K., Burwitz, L., & Williams, J. G. (1994). Visual search strategies in experienced and inexperienced soccer players. Research Quarterly for Exercise and Sport, 65(2), 127-135.

Williams, A. M., & Grant, A. (1999). Training perceptual skill in sport. International Journal of Sport Psychology, 30, 194-220.

Zwislocki, J. J., & Feldman, R. S. (1956). Just noticeable differences for intensity and their relation to loudness. Journal of the Acoustical Society of America, 28, 860-864.


Appendices


Appendix A: Informed Consent Form and Participant

Information Packages


Participant Information Sheet

“Auditory Localisation: Contributions of Sound Localisation and Semantic Spatial Cues”

Norikazu Yao

Description
The purpose of this project is to investigate the effect on auditory localisation of spatial and non-spatial semantic information. The research team requests your assistance in undertaking an experiment on an auditory localisation task. Your participation will involve a short hearing screening test to make sure you have no subtle hearing impairment.

Expected benefits
It is expected that this project will not benefit you directly. However, it will improve our understanding of human auditory information processing and spatial orientation.

Risks
There are no risks associated with your participation in this project.

Confidentiality
All data you provide will be anonymous and will be treated confidentially. Your data will be analysed using a code and you will not be identified individually.

Voluntary participation
Your participation in this project is voluntary. If you do agree to participate, you can withdraw from participation at any time during the project without comment or penalty. Your decision to participate will in no way impact upon your current or future relationship with QUT.

Questions / further information
Please contact the researchers if you require further information about the project, or to have any questions answered.

Concerns / complaints
Please contact the Research Ethics Officer on 3864 2340 or [email protected] if you have any concerns or complaints about the ethical conduct of the project.


Participant Information Sheet

“Auditory Localisation: Contributions of Sound Localisation and

Semantic Spatial Cues”

Norikazu Yao

Statement of consent
By signing below, you are indicating that you:
• have read and understood the information sheet about this project;
• have had any questions answered to your satisfaction;
• understand that if you have any additional questions you can contact the research team;
• understand that you are free to withdraw at any time, without comment or penalty;
• understand that you can contact the research team if you have any questions about the project, or the Research Ethics Officer on 3864 2340 or [email protected] if you have concerns about the ethical conduct of the project;
• agree to participate in the project.

Name

Signature

Date      /      /


Appendix B: Statistical Analyses (6 stimuli × 4 locations) in Experiment One


Table 1. ANOVA table for RT in the Location condition (6 Stimuli × 4 Locations)

Effect                SS         df    MS         F         p
Intercept             91706261   1     91706261   378.2202  0.000000
Error                 2182211    9     242468
Stimulus              34028      5     6806       0.4490    0.811775
Error                 682151     45    15159
Location              6028682    3     2009561    28.0943   0.000000
Error                 1931289    27    71529
Stimulus × Location   170560     15    11371      0.7318    0.748822
Error                 2097652    135   15538
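The analysis summarised in Table 1 is a two-way, fully within-subjects ANOVA. As a hedged illustration only, the Python sketch below shows one way a comparable 6 Stimuli × 4 Locations repeated-measures analysis could be specified with statsmodels; the data file and column names are hypothetical, and no sphericity correction is applied.

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Long-format data: one reaction time (or cell mean) per subject, stimulus and location.
    # The file name and column names are hypothetical, not those used in the thesis.
    df = pd.read_csv("exp1_location_rt.csv")  # columns: subject, stimulus, location, rt

    anova = AnovaRM(df, depvar="rt", subject="subject",
                    within=["stimulus", "location"], aggregate_func="mean").fit()
    print(anova)  # reports F and p for Stimulus, Location and Stimulus x Location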

Table 2. Initial Rotation Direction for each stimulus in the Location condition (6 Stimuli × 4 Locations)

Effect                SS         df    MS         F         p
Intercept             53.49963   1     53.49963   899.2449  0.000000
Error                 0.53545    9     0.05949
Stimulus              0.08875    5     0.01775    0.5910    0.706827
Error                 1.35157    45    0.03003
Location              30.36618   3     10.12206   102.5998  0.000000
Error                 2.66370    27    0.09866
Stimulus × Location   0.26739    15    0.01783    0.6044    0.867373
Error                 3.98184    135   0.02950

Table 3. Rotation Direction at Peak Velocity for each stimulus in the Location condition (6 Stimuli × 4 Locations)

Effect                SS         df    MS         F         p
Intercept             53.49963   1     53.49963   899.2449  0.000000
Error                 0.53545    9     0.05949
Stimulus              0.08875    5     0.01775    0.5910    0.706827
Error                 1.35157    45    0.03003
Location              30.36618   3     10.12206   102.5998  0.000000
Error                 2.66370    27    0.09866
Stimulus × Location   0.26739    15    0.01783    0.6044    0.867373
Error                 3.98184    135   0.02950


Table 4. Movement Time for each stimulus in the Location condition (6 Stimuli × 4 Locations)

Effect                SS          df    MS          F         p
Intercept             275828531   1     275828531   137.6831  0.000001
Error                 18030228    9     2003359
Stimulus              118523      5     23705       0.6518    0.661592
Error                 1636503     45    36367
Location              1795355     3     598452      2.3822    0.091511
Error                 6782959     27    251221
Stimulus × Location   319190      15    21279       0.6490    0.829312
Error                 4426094     135   32786

Table 5. Constant Error for each stimulus in the Location condition (6 Stimuli × 4 Locations)

Effect                SS         df    MS         F         p
Intercept             4280.996   1     4280.996   7.757701  0.021217
Error                 4966.544   9     551.838
Stimulus              52.766     5     10.553     0.615614  0.688452
Error                 771.413    45    17.143
Location              834.265    3     278.088    1.793036  0.172284
Error                 4187.524   27    155.093
Stimulus × Location   534.700    15    35.647     1.497460  0.114384
Error                 3213.643   135   23.805

Table 6. Reliability for each stimulus in the Location condition (6 Stimuli × 4 Locations)

Effect                SS        df    MS        F         p
Intercept             197.2835  1     197.2835  1673.207  0.000000
Error                 1.0612    9     0.1179
Stimulus              0.0438    5     0.0088    0.378     0.860851
Error                 1.0419    45    0.0232
Location              2.6327    3     0.8776    13.654    0.000013
Error                 1.7353    27    0.0643
Stimulus × Location   0.2151    15    0.0143    0.726     0.755215
Error                 2.6682    135   0.0198


Appendix C: Means and Standard Deviation Tables in Experiment One


Table 1. Reaction Time (ms) for each stimulus in the Location condition

Stimulus   Location:  Left     Right    Front    Back
"Left"     Mean       434.74   431.79   831.53   743.69
           SD         30.42    30.69    75.87    92.63
"Right"    Mean       431.44   418.50   817.88   769.18
           SD         27.33    28.69    87.40    92.26
"Front"    Mean       490.49   474.80   821.62   781.88
           SD         49.38    43.56    71.34    69.50
"Back"     Mean       439.33   506.36   794.64   721.87
           SD         33.96    55.52    51.83    42.08
"Yes"      Mean       443.92   502.21   761.06   725.40
           SD         24.47    59.94    88.97    45.62
Tone       Mean       462.31   489.03   744.50   797.49
           SD         26.94    37.80    42.44    39.70

Table 2. Reaction Time (ms) for each stimulus in the Location and Word conditions

Condition   Stimulus:  "Left"   "Right"  "Front"  "Back"
Location    Mean       610.43   609.25   642.20   615.55
            SD         73.48    92.48    70.38    60.01
Word        Mean       469.18   487.18   695.05   505.69
            SD         57.28    72.17    85.93    77.36

Table 3. Reaction Time (ms) for each location in the Location and Word conditions

Condition   Location:  Left     Right    Front    Back
Location    Mean       449.00   457.86   816.42   754.15
            SD         57.50    70.52    123.53   126.19
Word        Mean       549.75   523.98   548.81   534.55
            SD         63.89    79.17    64.07    68.64

Table 4. Reaction Time (ms) for the congruent and incongruent dimensions (Left and Right only)

Condition   Stimulus   Location:  Left     Right
Location    "Left"     Mean       434.74   431.79
                       SD         30.42    30.69
            "Right"    Mean       431.44   418.50
                       SD         27.33    28.69
Word        "Left"     Mean       442.30   486.51
                       SD         30.99    43.20
            "Right"    Mean       538.26   430.77
                       SD         45.96    35.27


Table 5. Initial Rotation Direction in the Location and Word conditions

Condition   Stimulus   Location:  Left   Right  Front  Back
Location    "Left"     Mean       0.00   0.98   0.59   0.42
                       SD         0.00   0.02   0.06   0.10
            "Right"    Mean       0.02   0.93   0.67   0.46
                       SD         0.02   0.05   0.10   0.10
            "Front"    Mean       0.06   0.94   0.70   0.41
                       SD         0.03   0.03   0.08   0.09
            "Back"     Mean       0.00   0.94   0.67   0.54
                       SD         0.00   0.04   0.09   0.10
Word        "Left"     Mean       0.00   0.12   0.04   0.04
                       SD         0.00   0.08   0.03   0.03
            "Right"    Mean       0.78   1.00   0.98   0.94
                       SD         0.08   0.00   0.02   0.04
            "Front"    Mean       0.54   0.64   0.56   0.60
                       SD         0.10   0.08   0.11   0.13
            "Back"     Mean       0.46   0.70   0.66   0.50
                       SD         0.08   0.10   0.11   0.11

Table 6. Rotation Direction at Peak Velocity in the Location and Word conditions

Condition   Stimulus   Location:  Left   Right  Front  Back
Location    "Left"     Mean       0.00   1.00   0.63   0.42
                       SD         0.00   0.00   0.09   0.07
            "Right"    Mean       0.00   0.98   0.51   0.32
                       SD         0.00   0.03   0.09   0.10
            "Front"    Mean       0.00   0.98   0.60   0.27
                       SD         0.00   0.02   0.08   0.05
            "Back"     Mean       0.00   0.98   0.49   0.42
                       SD         0.00   0.02   0.08   0.08
Word        "Left"     Mean       0.00   0.00   0.00   0.00
                       SD         0.00   0.00   0.00   0.00
            "Right"    Mean       0.98   1.00   1.00   1.00
                       SD         0.02   0.00   0.00   0.00
            "Front"    Mean       0.52   0.58   0.54   0.50
                       SD         0.09   0.07   0.10   0.07
            "Back"     Mean       0.50   0.62   0.64   0.44
                       SD         0.11   0.12   0.11   0.10


Table 7. Movement Time (ms) for each stimulus in the Location and Word conditions

Condition   Stimulus:  "Left"    "Right"   "Front"   "Back"
Location    Mean       1115.94   1062.08   1051.51   1080.53
            SD         215.24    178.32    178.01    210.95
Word        Mean       935.42    944.81    1012.52   1342.64
            SD         111.25    76.18     165.72    135.18

Table 8. Movement Time (ms) for each location in the Location and Word conditions

Condition   Location:  Left      Right     Front     Back
Location    Mean       992.23    1023.39   1082.43   1212.01
            SD         169.26    124.79    241.75    306.80
Word        Mean       1038.43   1078.30   1050.91   1067.74
            SD         112.29    105.04    91.40     101.81

Table 9. Constant Error (degrees) for each stimulus in the Location and Word conditions

Condition   Stimulus   Location:  Left    Right   Front   Back
Location    "Left"     Mean       6.45    4.60    1.28    4.85
                       SD         3.29    2.75    0.84    2.16
            "Right"    Mean       3.21    7.09    1.20    3.80
                       SD         2.81    2.47    1.00    2.16
            "Front"    Mean       7.35    5.11    1.71    5.98
                       SD         3.00    2.65    1.14    3.07
            "Back"     Mean       4.61    6.78    0.94    1.82
                       SD         3.53    2.26    0.88    1.18
Word        "Left"     Mean       -0.37   -0.03   -0.26   0.02
                       SD         0.57    0.89    1.44    1.09
            "Right"    Mean       2.60    2.14    6.35    5.82
                       SD         0.79    1.28    0.80    0.96
            "Front"    Mean       0.08    0.42    0.14    0.30
                       SD         0.51    0.47    1.07    1.08
            "Back"     Mean       -0.32   0.52    2.69    1.67
                       SD         0.73    0.41    0.80    1.39

Table 10. Reliability for each location in the Location and Word conditions

Condition   Location:  Left    Right   Front   Back
Location    Mean       0.99    0.96    0.92    0.73
            SD         0.00    0.04    0.09    0.14
Word        Mean       0.98    0.99    1.00    0.97
            SD         0.04    0.01    0.00    0.03


Appendix D: Statistical Analyses (2 conditions × 4 stimuli × 4 locations) in Experiment One


Table 1. ANOVA table for Reaction Time for each stimulus in the Location condition (6 stimuli: "Left", "Right", "Front", "Back", "Yes" and Tone × 4 locations: Left, Right, Front, Back)

Effect                SS         df    MS         F         p
Intercept             91706261   1     91706261   378.2202  0.000000
Error                 2182211    9     242468
Stimulus              34028      5     6806       0.4490    0.811775
Error                 682151     45    15159
Location              6028682    3     2009561    28.0943   0.000000
Error                 1931289    27    71529
Stimulus × Location   170560     15    11371      0.7318    0.748822
Error                 2097652    135   15538

Table 2. ANOVA table for Reaction Time (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS          df   MS          F         p
Intercept                         107393859   1    107393859   405.6456  0.000000
Error                             2382732     9    264748
Condition                         513074      1    513074      5.3252    0.046415
Error                             867134      9    96348
Stimulus                          868283      3    289428      15.9724   0.000004
Error                             489254      27   18121
Location                          2328284     3    776095      18.3883   0.000001
Error                             1139557     27   42206
Condition × Stimulus              481255      3    160418      17.0676   0.000002
Error                             253772      27   9399
Condition × Location              2174199     3    724733      22.0565   0.000000
Error                             887167      27   32858
Stimulus × Location               112943      9    12549       1.1134    0.363012
Error                             913000      81   11272
Condition × Stimulus × Location   44745       9    4972        0.6262    0.771502
Error                             643130      81   7940

Table 3. Planned comparison table for Reaction Time with the congruent and incongruent dimensions (Left and Right only)

Effect   Sum of Squares   df   Mean Square   F         p
M1       57528.43         1    57528.43      9.811219  0.012077
Error    52771.82         9    5863.54
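Because this planned comparison has a single degree of freedom in a within-subjects design, it is equivalent to a paired contrast on the per-subject cell means, with the reported F equal to the square of the corresponding t. The sketch below, using hypothetical file and column names, shows how such a contrast could be checked.

    import pandas as pd
    from scipy import stats

    # Hypothetical per-subject mean RTs for congruent and incongruent Left/Right trials.
    df = pd.read_csv("exp1_congruency_rt.csv")  # columns: subject, congruent_rt, incongruent_rt

    t, p = stats.ttest_rel(df["incongruent_rt"], df["congruent_rt"])
    print(f"t({len(df) - 1}) = {t:.3f}, p = {p:.4f}, F = t^2 = {t ** 2:.3f}")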

Table 4. Tukey HSD test (p values) for Reaction Time at each location in the Location condition. Error: within MS = 70049, df = 27.000

Location (mean RT)   Left       Right      Front      Back
Left (449.00)                   0.998839   0.000171   0.000261
Right (457.86)       0.998839              0.000173   0.000311
Front (816.42)       0.000171   0.000173              0.720862
Back (754.15)        0.000261   0.000311   0.720862
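For readers who want a quick follow-up test in open-source software, the sketch below runs Tukey's HSD with statsmodels on hypothetical long-format data. Note that pairwise_tukeyhsd treats the four location groups as independent samples, so it only approximates the repeated-measures Tukey HSD reported in Table 4, which is based on the within-subjects error term.

    import pandas as pd
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical long-format data: one mean RT per subject and location.
    df = pd.read_csv("exp1_rt_by_location.csv")  # columns: subject, location, rt

    # pairwise_tukeyhsd assumes independent groups; it does not use the
    # repeated-measures error term (within MS = 70049, df = 27) shown in Table 4.
    print(pairwise_tukeyhsd(endog=df["rt"], groups=df["location"], alpha=0.05))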


Table 5. ANOVA table for Initial Rotation Direction (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS         df   MS         F         p
Intercept                         89.04552   1    89.04552   636.4054  0.000000
Error                             1.25928    9    0.13992
Condition                         0.01648    1    0.01648    0.1032    0.755360
Error                             1.43754    9    0.15973
Stimulus                          8.29356    3    2.76452    29.7734   0.000000
Error                             2.50701    27   0.09285
Location                          12.76931   3    4.25644    68.8159   0.000000
Error                             1.67002    27   0.06185
Condition × Stimulus              7.35946    3    2.45315    23.8014   0.000000
Error                             2.78282    27   0.10307
Condition × Location              6.05480    3    2.01827    30.3380   0.000000
Error                             1.79620    27   0.06653
Stimulus × Location               0.16138    9    0.01793    0.6724    0.731484
Error                             2.16004    81   0.02667
Condition × Stimulus × Location   0.25373    9    0.02819    1.0490    0.409311
Error                             2.17687    81   0.02687

Table 6. Planned comparison table for IRD with the congruent and incongruent dimensions (Left and Right only)

Effect   Sum of Squares   df   Mean Square   F         p
M1       0.250880         1    0.250880      8.945092  0.015180
Error    0.252420         9    0.028047

Table 7. ANOVA table for Rotation Direction at Peak Velocity (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS         df   MS         F         p
Intercept                         79.07602   1    79.07602   1011.311  0.000000
Error                             0.70372    9    0.07819
Condition                         0.16411    1    0.16411    1.676     0.227719
Error                             0.88139    9    0.09793
Stimulus                          8.70127    3    2.90042    49.557    0.000000
Error                             1.58023    27   0.05853
Location                          11.40221   3    3.80074    72.246    0.000000
Error                             1.42043    27   0.05261
Condition × Stimulus              11.19891   3    3.73297    42.331    0.000000
Error                             2.38099    27   0.08818
Condition × Location              8.88887    3    2.96296    59.355    0.000000
Error                             1.34782    27   0.04992
Stimulus × Location               0.10104    9    0.01123    0.574     0.814471
Error                             1.58372    81   0.01955
Condition × Stimulus × Location   0.31580    9    0.03509    1.472     0.172437
Error                             1.93078    81   0.02384


Table 8. ANOVA table for Movement Time (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS          df   MS          F         p
Intercept                         365122012   1    365122012   281.5723  0.000000
Error                             11670532    9    1296726
Condition                         27874       1    27874       0.0593    0.813122
Error                             4233033     9    470337
Stimulus                          2229545     3    743182      10.1417   0.000120
Error                             1978556     27   73280
Location                          658905      3    219635      2.3773    0.091982
Error                             2494448     27   92387
Condition × Stimulus              2303368     3    767789      11.7861   0.000041
Error                             1758871     27   65143
Condition × Location              511240      3    170413      1.4720    0.244336
Error                             3125831     27   115772
Stimulus × Location               155473      9    17275       0.4740    0.888050
Error                             2951921     81   36443
Condition × Stimulus × Location   810634      9    90070       3.2228    0.002182
Error                             2263783     81   27948

Table 9. Tukey HSD test (p values) for Movement Time for each stimulus in the Word condition. Error: within MS = 103100, df = 27.000

Stimulus (mean MT)   "Left"     "Right"    "Front"    "Back"
"Left" (935.42)                 0.999249   0.708181   0.000187
"Right" (944.89)     0.999249              0.782286   0.000197
"Front" (1012.5)     0.708181   0.782286              0.000623
"Back" (1342.6)      0.000187   0.000197   0.000623

Table 10. ANOVA table for Constant Error (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS         df   MS         F         p
Intercept                         2449.173   1    2449.173   12.94016  0.005773
Error                             1703.422   9    189.269
Condition                         633.249    1    633.249    2.98561   0.118075
Error                             1908.901   9    212.100
Stimulus                          182.273    3    60.758     6.86407   0.001389
Error                             238.992    27   8.852
Location                          115.171    3    38.390     0.59740   0.622207
Error                             1735.079   27   64.262
Condition × Stimulus              343.137    3    114.379    7.21948   0.001042
Error                             427.764    27   15.843
Condition × Location              486.158    3    162.053    3.22872   0.038011
Error                             1355.156   27   50.191
Stimulus × Location               136.318    9    15.146     1.15720   0.333590
Error                             1060.199   81   13.089
Condition × Stimulus × Location   163.507    9    18.167     1.55901   0.141830
Error                             943.911    81   11.653


Table 11. ANOVA table for Reliability (2 Conditions × 4 Stimuli × 4 Locations)

Effect                            SS         df   MS         F          p
Intercept                         284.0894   1    284.0894   4738.347   0.000000
Error                             0.5396     9    0.0600
Condition                         0.5937     1    0.5937     7.923      0.020221
Error                             0.6744     9    0.0749
Stimulus                          0.0411     3    0.0137     0.782      0.514184
Error                             0.4724     27   0.0175
Location                          0.9484     3    0.3161     10.964     0.000069
Error                             0.7785     27   0.0288
Condition × Stimulus              0.0447     3    0.0149     0.741      0.536769
Error                             0.5426     27   0.0201
Condition × Location              0.7733     3    0.2578     7.408      0.000897
Error                             0.9395     27   0.0348
Stimulus × Location               0.0459     9    0.0051     0.375      0.943957
Error                             1.1012     81   0.0136
Condition × Stimulus × Location   0.1648     9    0.0183     1.316      0.241536
Error                             1.1271     81   0.0139


Appendix E: Means and Standard Deviation Tables for Yes and Tone Stimuli in Experiment One


Table 1. Reaction Time (ms) for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left     Right    Front    Back
"Yes"      Mean       443.92   502.21   761.06   725.40
           SD         24.47    59.94    88.97    45.62
Tone       Mean       462.31   489.03   744.50   797.49
           SD         26.94    37.80    42.44    39.70

Table 2. Initial Rotation Direction for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left   Right  Front  Back
"Yes"      Mean       0.00   0.96   0.75   0.42
           SD         0.00   0.03   0.09   0.11
Tone       Mean       0.04   0.96   0.66   0.30
           SD         0.03   0.03   0.10   0.10

Table 3. Rotation Direction at Peak Velocity for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left   Right  Front  Back
"Yes"      Mean       0.00   1.00   0.51   0.37
           SD         0.00   0.00   0.08   0.08
Tone       Mean       0.00   1.00   0.50   0.36
           SD         0.00   0.00   0.10   0.13

Table 4. Movement Time (ms) for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left      Right    Front     Back
"Yes"      Mean       944.28    983.86   1083.87   1192.03
           SD         76.21     67.76    145.53    152.49
Tone       Mean       1023.56   974.91   1066.58   1219.86
           SD         89.70     74.00    124.92    143.63

Table 5. Constant Error (degrees) for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left   Right  Front  Back
"Yes"      Mean       3.33   8.96   1.53   3.53
           SD         2.72   4.59   0.87   1.96
Tone       Mean       3.30   4.95   0.39   8.61
           SD         1.55   3.21   1.24   2.98

Table 6. Reliability for the "Yes" and Tone stimuli in the Location condition

Stimulus   Location:  Left   Right  Front  Back
"Yes"      Mean       0.99   0.96   1.00   0.76
           SD         0.00   0.03   0.00   0.09
Tone       Mean       0.99   0.99   0.98   0.70
           SD         0.00   0.01   0.02   0.08


Table 7. ANOVA table for Stroop score of Left and Right location spatial word stimuli and Tone stimulus

Effect                 SS         df   MS         F         p
Intercept              30836.1    1    30836.11   1.416538  0.264421
Error                  195917.8   9    21768.65
Condition              64646.3    3    21548.76   2.470319  0.083359
Error                  235522.8   27   8723.07
Location               48868.1    1    48868.15   2.185388  0.173448
Error                  201251.8   9    22361.31
Condition × Location   7301.1     3    2433.71    0.656172  0.586110
Error                  100141.7   27   3708.95

Table 8. Tukey HSD test (p values) for each condition for the spatial word stimuli and the "Yes" stimulus

Condition (mean)      Cong       Incong opp   Incong cw   Incong aw
Congruent (49.05)                0.998307     0.733202    0.095127
Incong opp (-44.05)   0.998307                0.824386    0.132501
Incong cw (-18.60)    0.733202   0.824386                 0.510056
Incong aw (22.76)     0.095127   0.132501     0.510056

Table 9. ANOVA table for Stroop score of Left and Right location spatial word stimuli and Tone stimulus

Effect                 SS         df   MS         F         p
Intercept              30836.1    1    30836.11   1.416538  0.264421
Error                  195917.8   9    21768.65
Condition              64646.3    3    21548.76   2.470319  0.083359
Error                  235522.8   27   8723.07
Location               48868.1    1    48868.15   2.185388  0.173448
Error                  201251.8   9    22361.31
Condition × Location   7301.1     3    2433.71    0.656172  0.586110
Error                  100141.7   27   3708.95

Table 10. Tukey HSD test (p values) for each condition for the spatial word stimuli and the Tone stimulus

Condition (mean)      Cong       Incong opp   Incong cw   Incong aw
Cong (41.45)                     0.998307     0.733202    0.095127
Incong opp (-41.45)   0.998307                0.824386    0.132501
Incong cw (-16.00)    0.733202   0.824386                 0.510056
Incong aw (25.36)     0.095127   0.132501     0.510056


Table 11. ANOVA table for Reaction Time of Left and Right congruence and the "Yes" stimulus in the Location condition

Effect                  SS         df   MS         F         p
Intercept               11596296   1    11596296   280.2188  0.000000
Error                   372447     9    41383
Congruence              13506      2    6753       2.1281    0.148042
Error                   57117      18   3173
Location                515        1    515        0.0313    0.863457
Error                   147869     9    16430
Congruence × Location   6405       2    3202       1.5109    0.247417
Error                   38151      18   2119

Table 12. Planned comparison table for RT of Left and Right congruence and the "Yes" stimulus in the Location condition

Effect   Sum of Squares   df   Mean Square   F         p
M1       13256.26         1    13256.26      2.602000  0.141187
Error    45851.79         9    5094.64

Table 13. ANOVA table for Reaction Time of congruent and incongruent Left and Right and the Tone stimulus in the Location condition

Effect                  SS         df   MS         F         p
Intercept               11861884   1    11861884   315.1532  0.000000
Error                   338746     9    37638
Congruence              29140      2    14570      5.2386    0.016106
Error                   50063      18   2781
Location                195        1    195        0.0232    0.882379
Error                   75868      9    8430
Congruence × Location   4694       2    2347       1.0956    0.355620
Error                   38559      18   2142

Table 14. Planned comparison table for RT of congruent and incongruent Left and Right and the Tone stimulus in the Location condition

Effect   Sum of Squares   df   Mean Square   F         p
M1       28890.48         1    28890.48      6.701727  0.029273
Error    38798.11         9    4310.90


Appendix F: Means and Standard Deviation Tables in Experiment Two


Table 1. Reaction Time (ms) for each stimulus in the Non-rotated and Rotated conditions

Condition   Rotation      Value     Left      Right     Front     Back
Location    Non-rotated   Mean      475.10    504.14    824.58    724.17
                          SD        42.38     50.38     88.83     108.04
                          -95% CL   379.24    390.19    623.63    479.76
                          +95% CL   570.96    618.10    1025.53   968.57
            Rotated       Mean      490.03    484.05    712.86    701.57
                          SD        48.35     42.76     74.97     91.89
                          -95% CL   380.66    387.31    543.28    493.70
                          +95% CL   599.41    580.78    882.44    909.43
Word        Non-rotated   Mean      409.18    416.94    561.47    475.21
                          SD        35.71     38.92     39.50     38.96
                          -95% CL   328.40    328.89    472.13    387.06
                          +95% CL   489.97    504.99    650.82    563.35
            Rotated       Mean      407.26    410.02    594.82    480.34
                          SD        29.40     32.42     42.36     32.14
                          -95% CL   340.74    336.68    499.00    407.64
                          +95% CL   473.78    483.36    690.63    553.03

Table 2. Initial Rotation Direction for each stimulus in the Non-rotated and Rotated conditions

Condition   Rotation      Value     Left    Right   Front   Back
Location    Non-rotated   Mean      0.05    0.95    0.54    0.42
                          SD        0.03    0.03    0.07    0.08
                          -95% CL   -0.01   0.87    0.38    0.24
                          +95% CL   0.11    1.03    0.71    0.60
            Rotated       Mean      0.06    0.97    0.25    0.77
                          SD        0.03    0.02    0.08    0.06
                          -95% CL   0.00    0.92    0.06    0.64
                          +95% CL   0.12    1.02    0.44    0.90
Word        Non-rotated   Mean      0.05    0.92    0.63    0.52
                          SD        0.03    0.03    0.07    0.09
                          -95% CL   -0.01   0.85    0.47    0.31
                          +95% CL   0.11    0.99    0.79    0.73
            Rotated       Mean      0.06    0.91    0.66    0.68
                          SD        0.03    0.06    0.06    0.07
                          -95% CL   0.00    0.77    0.52    0.52
                          +95% CL   0.12    1.05    0.80    0.84


Table 3. Movement Time (ms) for each stimulus in the Non-rotated and Rotated conditions

Condition   Rotation      Value     Left      Right     Front     Back
Location    Non-rotated   Mean      1101.12   1208.17   1269.63   1523.56
                          SD        77.84     91.53     157.51    144.09
                          -95% CL   925.03    1001.12   913.31    1197.60
                          +95% CL   1277.21   1415.23   1625.94   1849.52
            Rotated       Mean      1184.00   1155.87   1113.07   1448.67
                          SD        74.92     89.91     99.40     120.88
                          -95% CL   1014.52   952.47    888.21    1175.22
                          +95% CL   1353.47   1359.27   1337.93   1722.12
Word        Non-rotated   Mean      1116.20   1104.26   1090.08   1527.44
                          SD        112.32    92.63     90.20     107.00
                          -95% CL   862.11    894.71    886.03    1285.39
                          +95% CL   1370.30   1313.81   1294.13   1769.49
            Rotated       Mean      1041.81   1218.26   1165.91   1585.44
                          SD        100.69    115.05    99.38     112.87
                          -95% CL   814.05    958.00    941.10    1330.10
                          +95% CL   1269.58   1478.52   1390.72   1840.78

Table 4. Constant Error (degrees) for each stimulus in the Non-rotated and Rotated conditions

Condition   Rotation      Value     Left     Right   Front    Back
Location    Non-rotated   Mean      -5.11    -2.59   -5.43    6.77
                          SD        2.91     1.85    2.88     2.45
                          -95% CL   -11.71   -6.78   -11.95   1.23
                          +95% CL   1.48     1.60    1.08     12.31
            Rotated       Mean      7.78     6.53    -9.46    5.67
                          SD        1.71     2.02    5.97     4.89
                          -95% CL   3.92     1.95    -22.97   -5.40
                          +95% CL   11.64    11.11   4.05     16.74
Word        Non-rotated   Mean      -2.42    4.24    -0.43    2.98
                          SD        1.27     1.69    1.00     2.03
                          -95% CL   -5.30    0.41    -2.70    -1.62
                          +95% CL   0.45     8.07    1.85     7.58
            Rotated       Mean      -3.29    3.15    0.53     2.98
                          SD        1.26     1.84    1.06     1.88
                          -95% CL   -6.13    -1.00   -1.87    -1.28
                          +95% CL   -0.45    7.30    2.93     7.24

Table 5. Reliability for each stimulus-location in the Location and Word conditions

Condition   Value     Left    Right   Front   Back
Location    Mean      0.99    0.99    0.95    0.88
            SD        0.00    0.01    0.03    0.06
            -95% CL   0.98    0.97    0.88    0.74
            +95% CL   1.00    1.00    1.02    1.02
Word        Mean      1.00    1.00    1.00    1.00
            SD        0.00    0.00    0.00    0.00
            -95% CL   1.00    0.99    0.99    0.99
            +95% CL   1.00    1.00    1.00    1.00


Appendix G: Statistical Analyses (2 conditions × 2 rotations × 4 stimulus-locations) in Experiment Two


Table 1. ANOVA table for Reaction Time (2 Conditions × 2 Rotations × 4 Stimulus-Locations)

Effect                            SS         df   MS         F         p
Intercept                         46999413   1    46999413   139.3870  0.000001
Error                             3034678    9    337186
Condition                         842832     1    842832     16.7630   0.002700
Error                             452515     9    50279
Stimulus                          7542       1    7542       0.9204    0.362436
Error                             73751      9    8195
Location                          1489279    3    496426     26.4888   0.000000
Error                             506007     27   18741
Condition × Stimulus              17877      1    17877      4.5506    0.061699
Error                             35356      9    3928
Condition × Location              193317     3    64439      7.1977    0.001061
Error                             241725     27   8953
Stimulus × Location               10826      3    3609       0.4454    0.722557
Error                             218784     27   8103
Condition × Stimulus × Location   37802      3    12601      1.8536    0.161339
Error                             183547     27   6798

Table 2. Planned comparison table for RT of Left-Right versus Front-Back in the Location condition

Effect   Sum of Squares   df   Mean Square   F          p
M1       1274757          1    1274757       32.76840   0.000285
Error    350118           9    38902

Table 3. Planned comparison table for RT of Left-Right versus Front-Back in the Word condition

Effect   Sum of Squares   df   Mean Square   F          p
M1       274287.6         1    274287.6      28.08900   0.000494
Error    87884.5          9    9764.9


Table 4. ANOVA table for Initial Rotation Direction (2 Conditions × 2 Rotations × 4 Stimulus-Locations)

Effect                            SS         df   MS         F         p
Intercept                         44.54445   1    44.54445   799.5347  0.000000
Error                             0.50142    9    0.05571
Condition                         0.10909    1    0.10909    4.9543    0.053057
Error                             0.19817    9    0.02202
Stimulus                          0.04823    1    0.04823    4.0116    0.076197
Error                             0.10819    9    0.01202
Location                          15.85218   3    5.28406    65.1005   0.000000
Error                             2.19153    27   0.08117
Condition × Stimulus              0.00653    1    0.00653    0.5031    0.496108
Error                             0.11683    9    0.01298
Condition × Location              0.53087    3    0.17696    6.5520    0.001795
Error                             0.72922    27   0.02701
Stimulus × Location               0.77518    3    0.25839    12.3074   0.000029
Error                             0.56686    27   0.02099
Condition × Stimulus × Location   0.34554    3    0.11518    8.5175    0.000383
Error                             0.36511    27   0.01352

Table 5. Planned comparison table for IRD of the non-rotated Front and rotated Front in the Location condition

Effect   Sum of Squares   df   Mean Square   F          p
M1       0.426969         1    0.426969      29.44615   0.000418
Error    0.130500         9    0.014500

Table 6. Planned comparison table for IRD of the non-rotated Back and rotated Back in the Location condition

Effect   Sum of Squares   df   Mean Square   F          p
M1       0.612500         1    0.612500      16.57895   0.002792
Error    0.332500         9    0.036944


Table 7. ANOVA table for Movement Time (2 Conditions × 2 Rotations × 4 Stimulus-Locations)

Effect                            SS          df   MS          F         p
Intercept                         246350590   1    246350590   399.3251  0.000000
Error                             5552257     9    616917
Condition                         14957       1    14957       0.0566    0.817309
Error                             2379110     9    264346
Stimulus                          470         1    470         0.0077    0.932162
Error                             552539      9    61393
Location                          4277459     3    1425820     9.2448    0.000225
Error                             4164215     27   154230
Condition × Stimulus              87569       1    87569       3.8904    0.080031
Error                             202579      9    22509
Condition × Location              119327      3    39776       0.5954    0.623488
Error                             1803845     27   66809
Stimulus × Location               26232       3    8744        0.2033    0.893176
Error                             1161086     27   43003
Condition × Stimulus × Location   222568      3    74189       2.4077    0.089064
Error                             831946      27   30813

Table 8. ANOVA table for Constant Error (2 Conditions × 2 Rotations × 4 Stimulus-Locations)

Effect                            SS         df   MS         F          p
Intercept                         88.350     1    88.3505    0.886659   0.370970
Error                             896.798    9    99.6442
Condition                         8.112      1    8.1124     0.034147   0.857492
Error                             2138.161   9    237.5734
Stimulus                          157.718    1    157.7176   2.106758   0.180598
Error                             673.764    9    74.8627
Location                          1648.534   3    549.5114   7.449712   0.000868
Error                             1991.595   27   73.7628
Condition × Stimulus              199.915    1    199.9151   2.443261   0.152467
Error                             736.407    9    81.8230
Condition × Location              864.293    3    288.0977   5.908909   0.003094
Error                             1316.425   27   48.7565
Stimulus × Location               391.292    3    130.4307   3.040652   0.046042
Error                             1158.182   27   42.8956
Condition × Stimulus × Location   599.249    3    199.7498   4.584624   0.010153
Error                             1176.376   27   43.5695

Table 9. Planned comparison table for Constant Error of non-rotated Left-Right and rotated Front-Back in the Location condition

Effect   Sum of Squares   df   Mean Square   F          p
M1       818.912          1    818.9122      5.229212   0.048030
Error    1409.430         9    156.6034


Table 10. ANOVA table for Reliability (2 Conditions × 2 Rotations × 4 Stimulus-Locations)

Effect                            SS         df   MS         F          p
Intercept                         152.0003   1    152.0003   19825.37   0.000000
Error                             0.0690     9    0.0077
Condition                         0.0829     1    0.0829     9.51       0.013060
Error                             0.0785     9    0.0087
Stimulus                          0.0011     1    0.0011     0.39       0.549549
Error                             0.0264     9    0.0029
Location                          0.0764     3    0.0255     4.91       0.007495
Error                             0.1400     27   0.0052
Condition × Stimulus              0.0025     1    0.0025     0.86       0.376963
Error                             0.0256     9    0.0028
Condition × Location              0.0727     3    0.0242     4.56       0.010369
Error                             0.1435     27   0.0053
Stimulus × Location               0.0044     3    0.0015     0.76       0.528601
Error                             0.0522     27   0.0019
Condition × Stimulus × Location   0.0047     3    0.0016     0.81       0.498462
Error                             0.0516     27   0.0019

Table 11. Fisher LSD test (p values) for Reliability at each location in the Location condition. Error: within MS = 0.00531, df = 27.000

Location (mean)   Left       Right      Front      Back
Left (0.98861)               0.896740   0.124606   0.000078
Right (0.98559)   0.896740              0.157470   0.000111
Front (0.95207)   0.124606   0.157470              0.004897
Back (0.88141)    0.000078   0.000111   0.004897