
Page 1: Final Project...Final Project meet with me over the next couple of weeks to discuss possibilities READ FOR NEXT WEEK: Zhang, W., & Luck, S.J. (2008). Discrete fixed-resolution representations

Final Project: meet with me over the next couple of weeks to discuss possibilities

Page 2:

READ FOR NEXT WEEK: Zhang, W., & Luck, S.J. (2008). Discrete fixed-resolution representations in visual working memory. Nature, 453, 233-235. Along with the Lewandowsky & Farrell chapters

Page 3:

Using Models to Test Hypotheses

Page 4:

Prototype Model Exemplar Model

Mixture Model

Page 5:

Prototype Model Exemplar Model

Mixture Model

compare nonnested models

compare nested models

compare nested models

Page 6:

Prototype Model Exemplar Model

Mixture Model

compare nonnested models

compare nested models

compare nested models

Saturated Model

Page 7:

Prototype Model Exemplar Model

Mixture Model

compare nonnested models

compare nested models

compare nested models

Saturated Model

Null Model

Page 8:

Some issues regarding fit measures

•  speed of computation

•  consistency
   – do they allow you to recover the true parameters?
   – all we’ve discussed are consistent

•  efficiency
   – minimum variance of parameter estimates?
   – SSE and %Var are inefficient
   – ln L, χ², weighted SSE are efficient

•  permit statistical tests

Page 9:

Prototype Model Exemplar Model

Mixture Model

compare nonnested models

compare nested models

compare nested models

Saturated Model

Null Model

Page 10:

Exemplar Model

Mixture Model

compare nested models

Page 11:

Exemplar Model

Mixture Model

compare nested models

Null Model

Saturated Model: this has to be as good as any model can do …

Page 12:

Exemplar Model

Mixture Model

compare nested models

Null Model

Saturated Model: this has to be as good as any model can do …

One parameter for each free data point

Page 13:

Exemplar Model

Mixture Model

compare nested models

Null Model

Saturated Model: this has to be as good as any model can do …

this is not as bad as any model could do, but it’s a good floor

Page 14:

Exemplar Model

Mixture Model

compare nested models

Null Model

Saturated Model: this has to be as good as any model can do …

this is not as bad as any model could do, but it’s a good floor

clearly, this model needs to fit better than the Null Model

Page 15:

Exemplar Model

Mixture Model

compare nested models

Null Model

Saturated Model: this has to be as good as any model can do …

this is not as bad as any model could do, but it’s a good floor

clearly, this model needs to fit better than the Null Model

logically, this model MUST fit better than the exemplar model … does the exemplar model fit significantly worse?

Page 16:

Does a model have to account for all the variability in the observed data?

Page 17:

Does a model have to account for all the variability in the observed data? Does it need to fit as well as the saturated model? Well, that’s perhaps the ultimate goal (and having no free parameters to boot). But accounting for some of the variability is what it means to have a theory. You explain some of the variability. Better models explain MORE of the variability. And some of the variability could simply be noise (of various sorts).

see Dell, G.S., Schwartz, M.F., Martin, N., Saffran, E.M., & Gagnon, D.A. (2000). The role of computational models in neuropsychological investigations of language: Reply to Ruml and Caramazza (2000). Psychological Review, 107, 635-645.

Page 18:
Page 19:

fit the exemplar model

fit the mixed model

does the exemplar model fit significantly worse than the mixed model?

tests the hypothesis of whether people need to abstract a prototype on top of remembering specific exemplars

Page 20:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

Page 21:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

We will MAXIMIZE likelihood “Maximum Likelihood Parameter Estimation”

Page 22:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

ln L_F : ln L of the full model (e.g., mixed model)

ln L_R : ln L of the restricted model (e.g., exemplar model)

Page 23:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

ln L_F : ln L of the full model (e.g., mixed model)

ln L_R : ln L of the restricted model (e.g., exemplar model)

ln L_F − ln L_R : difference between fit to full versus restricted model

Page 24:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

ln L_F : ln L of the full model (e.g., mixed model)

ln L_R : ln L of the restricted model (e.g., exemplar model)

ln L_F − ln L_R : difference between fit to full versus restricted model

2 × [ln L_F − ln L_R] : need to multiply by 2 (because God said so)

Page 25:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

ln L_F : ln L of the full model (e.g., mixed model)

ln L_R : ln L of the restricted model (e.g., exemplar model)

ln L_F − ln L_R : difference between fit to full versus restricted model

2 × [ln L_F − ln L_R] : need to multiply by 2 (because God said so)

G² = 2 × [ln L_F − ln L_R] : log Likelihood ratio statistic, distributed as χ² with df = NparmsF − NparmsR

Page 26:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

G² = 2 × [ln L_F − ln L_R]   log Likelihood ratio statistic

distributed as χ² with df = NparmsF − NparmsR

If that statistic exceeds the critical χ2 with the specified df (at selected alpha level), then the restricted model fits significantly worse than the general model

Page 27:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

G² = 2 × [ln L_F − ln L_R]   log Likelihood ratio statistic

distributed as χ² with df = NparmsF − NparmsR

EXAMPLE

ln L_F = −263.12    ln L_R = −293.82

df = 240 − 45 = 195

Page 28:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

G² = 2 × [ln L_F − ln L_R]   log Likelihood ratio statistic

distributed as χ² with df = NparmsF − NparmsR

EXAMPLE

ln L_F = −263.12    ln L_R = −293.82

G² = 2 × [−263.12 − (−293.82)] = 61.40

df = 240 − 45 = 195

χ²C (df = 195, α = .05) = 228.6

Page 29:
Page 30:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

G² = 2 × [ln L_F − ln L_R]   log Likelihood ratio statistic

distributed as χ² with df = NparmsF − NparmsR

EXAMPLE

ln L_F = −263.12    ln L_R = −293.82

G² = 2 × [−263.12 − (−293.82)] = 61.40

df = 240 − 45 = 195

χ²C (df = 195, α = .05) = 228.6

the restricted model DOES NOT fit significantly worse …
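The arithmetic in this example is easy to check in code. The course’s worked examples use Matlab, but here is a minimal Python sketch; the log likelihoods, df, and the critical value χ²C = 228.6 are all taken from the slides, so no statistics library is needed.

```python
# log likelihoods from the slides' example
lnL_full = -263.12        # full (mixed) model
lnL_restricted = -293.82  # restricted (exemplar) model

# G^2 = 2 x [ln L_F - ln L_R]
G2 = 2 * (lnL_full - lnL_restricted)

# df = difference in number of free parameters (240 - 45 on the slides)
df = 240 - 45

chi2_crit = 228.6  # critical chi-square for df = 195, alpha = .05 (from the slides)

print(f"G2 = {G2:.2f}, df = {df}")
if G2 > chi2_crit:
    print("restricted model fits significantly worse")
else:
    print("restricted model does NOT fit significantly worse")
```

Since G² = 61.40 is far below 228.6, the script reproduces the slide’s conclusion.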

Page 31:

Likelihood (L) log Likelihood (ln L)

instead of SSE or r2

G² = 2 × [ln L_F − ln L_R]   log Likelihood ratio statistic

distributed as χ² with df = NparmsF − NparmsR

Likelihood Ratio Testing

G² = 2 × ln( L_F / L_R )

Page 32:

Likelihood (L) log Likelihood (ln L)

What is Likelihood? I’ll start with just giving you the equations … more later (so you can do the homework)

Page 33:

ln L We will stay concrete for now. There are n stimuli and m responses. For identification, each stimulus has a unique response. For categorization, groups of stimuli can have the same (category) response.

Page 34:

ln L We will stay concrete for now. There are n stimuli and m responses. For identification, each stimulus has a unique response. For categorization, groups of stimuli can have the same (category) response.

NOTE: I will be giving you a maximum likelihood equation that can be used with this kind of choice data. Maximum likelihood methods are extremely general (and can get rather complicated). Maximum Likelihood techniques are not only used for evaluating computational models, but are also used widely in statistics.

Page 35:

ln L We will stay concrete for now. There are n stimuli and m responses. For identification, each stimulus has a unique response. For categorization, groups of stimuli can have the same (category) response. In order to use the following form of maximum likelihood statistic, we need to have data in the form of response frequencies, not response probabilities (that limitation is only true for this example).

Page 36:

ln L We will stay concrete for now. There are n stimuli and m responses. For identification, each stimulus has a unique response. For categorization, groups of stimuli can have the same (category) response. In order to use the following form of maximum likelihood statistic, we need to have data in the form of response frequencies, not response probabilities.

f_ij : observed frequency with which stimulus i is given response j

N_i : number of presentations of stimulus i

P(R_j | S_i) : predicted probability with which stimulus i is given response j

Page 37:

ln L

f_ij : observed frequency with which stimulus i is given response j

N_i : number of presentations of stimulus i

P(R_j | S_i) : predicted probability with which stimulus i is given response j

ln L = Σ_i ln N_i! − Σ_i Σ_j ln f_ij! + Σ_i Σ_j f_ij ln P(R_j | S_i)

ln ∏_i N_i ! = Σ_i ln N_i !

Page 38:

ln L

f_ij : observed frequency with which stimulus i is given response j

N_i : number of presentations of stimulus i

P(R_j | S_i) : predicted probability with which stimulus i is given response j

ln L = Σ_i ln N_i! − Σ_i Σ_j ln f_ij! + Σ_i Σ_j f_ij ln P(R_j | S_i)

L = ∏_i [ N_i! / (f_i1! f_i2! ··· f_im!) ] P(R_1 | S_i)^f_i1 P(R_2 | S_i)^f_i2 ··· P(R_m | S_i)^f_im

Page 39:

ln L = Σ_i ln N_i! − Σ_i Σ_j ln f_ij! + Σ_i Σ_j f_ij ln P(R_j | S_i)

L = ∏_i [ N_i! / (f_i1! f_i2! ··· f_im!) ] P(R_1 | S_i)^f_i1 P(R_2 | S_i)^f_i2 ··· P(R_m | S_i)^f_im

Why are we taking logs?

Page 40:

Why are we taking logs?

log(a × b) = log(a) + log(b)

log(a / b) = log(a) − log(b)

log(a^p) = p × log(a)

e^ln(f(x)) = f(x)

Page 41:

Why are we taking logs?

log(f(x)) is a “monotonic function” so max[f(x)] is the same as max[log(f(x))]

Page 42:

ln L

f_ij : observed frequency with which stimulus i is given response j

N_i : number of presentations of stimulus i

P(R_j | S_i) : predicted probability with which stimulus i is given response j

ln L = Σ_i ln N_i! − Σ_i Σ_j ln f_ij! + Σ_i Σ_j f_ij ln P(R_j | S_i)

You want to MAXIMIZE the lnL … which is the same as MINIMIZING the –lnL …
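The ln L formula above translates directly into code. Here is a sketch in Python rather than the course’s Matlab; `math.lgamma(n + 1)` computes ln n!, and the function name and argument layout are my own.

```python
import math

def multinomial_lnL(freqs, probs):
    """ln L = sum_i ln N_i! - sum_ij ln f_ij! + sum_ij f_ij ln P(R_j|S_i).

    freqs[i][j]: observed frequency of response j to stimulus i
    probs[i][j]: model-predicted probability of response j to stimulus i
    """
    lnL = 0.0
    for f_i, p_i in zip(freqs, probs):
        N_i = sum(f_i)                  # presentations of stimulus i
        lnL += math.lgamma(N_i + 1)     # + ln N_i!
        for f, p in zip(f_i, p_i):
            lnL -= math.lgamma(f + 1)   # - ln f_ij!
            if f > 0:
                lnL += f * math.log(p)  # + f_ij ln P(R_j | S_i)
    return lnL
```

With one stimulus and two responses this reduces to the ln of a binomial probability, which is a handy sanity check; fitting a model then means searching for the parameter values (which determine the probs) that maximize this quantity, or minimize −ln L.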

Page 43:
Page 44:

First, let’s talk about probability

Prob(data|parm) probability of some data given the parameters of some model

knowing parameters → predict some outcome

Page 45:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

obviously, the probability of getting a head is .6

Page 46:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting two heads on two flips?

coin flips are independent

Page 47:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting two heads on two flips?

coin flips are independent

p(event A AND event B) = p(event A) x p(event B) if A and B are INDEPENDENT

p(head) x p(head) .6 x .6 .36

Page 48:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting one head and one tail on two flips?

Page 49:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting one head and one tail on two flips?

p(head) x p(tail) .6 x .4 .24

+ p(tail) x p(head) .4 x .6 .24

.48

Page 50:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting x heads on N flips?

Page 51:

imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4

what is the probability of getting x heads on N flips?

Prob(x | p) = C(N, x) p^x (1 − p)^(N−x) = [ N! / (x! (N − x)!) ] p^x (1 − p)^(N−x)

Binomial Distribution f(x; p) gives the probability of observing x “successes” for a Bernoulli process with probability p

Page 52:
Page 53:

Matlab Example
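The Matlab example itself did not survive extraction; here is a Python sketch of the same binomial computation (the function name is mine), using the unfair coin from the earlier slides.

```python
from math import comb

def binom_prob(x, N, p):
    """Prob(x | p) = C(N, x) * p**x * (1 - p)**(N - x)"""
    return comb(N, x) * p ** x * (1 - p) ** (N - x)

# unfair coin: p(head) = .6
print(binom_prob(2, 2, 0.6))  # two heads on two flips: .6 x .6 = .36
print(binom_prob(1, 2, 0.6))  # one head and one tail, in either order: .48
```

The binomial coefficient counts the orderings, which is why the second call reproduces the .24 + .24 = .48 worked out by hand above.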

Page 54:

coin flips have only two outcomes (heads or tails); that is, a flip of the coin can have only two mutually exclusive events … what about a situation with more than just two possible outcomes? what if an event has three possible outcomes? or four possible outcomes?

Page 55:

Multinomial Distribution

probability of outcome 1 is p1, probability of outcome 2 is p2, probability of outcome 3 is p3

we want to know the probability of observing x1 events with outcome 1, x2 events with outcome 2, and x3 events with outcome 3

Prob(x1, x2, x3 | p1, p2, p3) = [ N! / (x1! x2! x3!) ] p1^x1 p2^x2 p3^x3

Page 56:

Multinomial Distribution

probability of outcome 1 is p1, probability of outcome 2 is p2, probability of outcome 3 is p3

we want to know the probability of observing x1 events with outcome 1, x2 events with outcome 2, and x3 events with outcome 3

Prob(x1, x2, ..., xm | p1, p2, ..., pm) = [ N! / (x1! x2! ··· xm!) ] p1^x1 p2^x2 ··· pm^xm

Page 57:

Matlab Example
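Again, the Matlab code is not in the extracted text; a Python sketch of the multinomial formula (function name and example numbers are mine):

```python
from math import factorial

def multinomial_prob(xs, ps):
    """Prob(x1..xm | p1..pm) = N! / (x1! x2! ... xm!) * p1**x1 * ... * pm**xm"""
    N = sum(xs)
    coeff = factorial(N)
    prob = 1.0
    for x, p in zip(xs, ps):
        coeff //= factorial(x)  # divide N! by each x_i!
        prob *= p ** x
    return coeff * prob

# hypothetical three-outcome event: counts (2, 1, 1), probabilities (.5, .3, .2)
print(multinomial_prob([2, 1, 1], [0.5, 0.3, 0.2]))  # 12 x .015 = .18
```

With m = 2 outcomes this reduces to the binomial distribution from the previous slides.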

Page 58:

What is Likelihood?

prob(data|parm) probability of some data given the known parameters of some model

L(data|parm) likelihood of known data given particular candidate parameters of the model

knowing parameters → predict some outcome

observing some data → estimate the parameters that maximize the likelihood of the data

Page 59:

imagine we flip a coin 10 times and get 4 heads (N=10, x=4) what is the maximum likelihood estimate of p?

L(x | p) = Prob(x | p) = C(N, x) p^x (1 − p)^(N−x) = [ N! / (x! (N − x)!) ] p^x (1 − p)^(N−x)

now, x (and N) are fixed we want to find the value of p that maximizes the likelihood L of the data
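Before the calculus on the next slides, the maximum can also be found by brute force. A Python sketch (the grid search is my illustration, not from the slides):

```python
from math import comb

def binom_L(x, N, p):
    # likelihood of x heads in N flips for a candidate value of p
    return comb(N, x) * p ** x * (1 - p) ** (N - x)

N, x = 10, 4
# evaluate the likelihood over a fine grid of candidate p values
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: binom_L(x, N, p))
print(p_hat)  # the likelihood peaks at p = x/N = 0.4
```

The grid search already suggests the answer the derivation below proves: the maximum likelihood estimate is p = x/N.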

Page 60:
Page 61:
Page 62:

imagine we flip a coin 10 times and get 4 heads (N=10, x=4) what is the maximum likelihood estimate of p? USING CALCULUS

L(x | p) = Prob(x | p) = C(N, x) p^x (1 − p)^(N−x) = [ N! / (x! (N − x)!) ] p^x (1 − p)^(N−x)

ln L = ln C(N, x) + x ln p + (N − x) ln(1 − p)

d ln L / dp = 0 + x (1/p) + (N − x) (1/(1 − p)) (−1) = 0

x(1 − p) − (N − x) p = 0

x − xp − Np + xp = 0

p = x / N
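A quick numeric check of the result (a sketch; the helper below drops the constant ln C(N, x) term, whose derivative with respect to p is zero):

```python
def dlnL_dp(x, N, p):
    # d lnL/dp = x/p - (N - x)/(1 - p)
    return x / p - (N - x) / (1 - p)

N, x = 10, 4
p_hat = x / N  # = 0.4
print(dlnL_dp(x, N, p_hat))  # approximately 0: the slope vanishes at the MLE
print(dlnL_dp(x, N, 0.3))    # positive: likelihood still rising below x/N
print(dlnL_dp(x, N, 0.5))    # negative: likelihood falling above x/N
```

The sign pattern (positive below x/N, zero at x/N, negative above) confirms that p = x/N is a maximum, not just a stationary point.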
