MASSACHUSETTS UNIV AMHERST LAB OF PSYCHOMETRIC AND EVALUATIVE RESEARCH  F/G 12/1
BAYESIAN ESTIMATION IN THE ONE-PARAMETER LATENT TRAIT MODEL. (U)
MAR 80  H. SWAMINATHAN, J. A. GIFFORD  N00014-79-C-0039
UNCLASSIFIED  LR-106

BAYESIAN ESTIMATION IN THE ONE-PARAMETER

LATENT TRAIT MODEL

HARIHARAN SWAMINATHAN

AND

JANICE A. GIFFORD

Research Report 80-1
March 1980

Laboratory of Psychometric and Evaluative Research (LR 106)
School of Education
University of Massachusetts
Amherst, MA 01003

Prepared under Contract No. N00014-79-C-0039, NR 150-427,
with the Personnel and Training Research Programs,
Psychological Sciences Division, Office of Naval Research

Approved for public release; distribution unlimited.
Reproduction in whole or in part is permitted for
any purpose of the United States Government.

Unclassified
SECURITY CLASSIFICATION OF THIS PAGE (When Data Entered)

REPORT DOCUMENTATION PAGE

Report Number: Technical Report 80-1
Title: Bayesian Estimation in the One-Parameter Latent Trait Model
Performing Org. Report Number: Lab. Report 106
Authors: Hariharan Swaminathan and Janice A. Gifford
Contract or Grant Number: N00014-79-C-0039
Performing Organization: Laboratory of Psychometric and Evaluative Research, School of Education, University of Massachusetts, Amherst, MA 01003 (NR 150-427)
Controlling Office: Personnel and Training Research Programs, Office of Naval Research (Code 458), Arlington, VA 22217
Report Date: March 1980
Security Classification (of this report): Unclassified
Distribution Statement: Approved for public release; distribution unlimited.
Supplementary Notes: This research was jointly supported by the Air Force Human Resources Laboratory and by the Office of Naval Research.
Key Words: latent trait theory; Bayesian estimation

Abstract: When several parameters are to be estimated simultaneously, and when both structural and incidental parameters have to be estimated, a Bayesian solution to the estimation problem may be appropriate. This is the case in latent trait models, where the "structural parameters" are item parameters, while the "incidental parameters" are ability parameters, since these increase without bound as the number of examinees is increased to provide stable estimates of the item parameters. Bayesian estimates for the parameters in the one-parameter latent trait model were obtained for two cases: (1) conditional estimation of ability (for those situations where items are previously calibrated), and (2) joint estimation of item and ability parameters. For each of the two cases, a simulation study was carried out to study the efficacy of the two Bayesian procedures described and to compare the Bayesian estimates with the comparable maximum likelihood estimates. The Bayesian and maximum likelihood estimates were compared with respect to: (a) the mean value of the estimates, as compared with the mean value of the true values; (b) the mean squared error difference between true values and estimated values; and (c) the regression of the true value on the estimated value. Overall, the results favored the Bayesian estimates; the means of the estimates are closer to the means of the true values; the slopes and intercepts are in general closer to one and zero, respectively; and the mean square deviations are dramatically smaller (in some cases, one-tenth the size of those for ML estimates).

TABLE OF CONTENTS

INTRODUCTION
Bayesian Procedures
Bayesian Estimation in the One-Parameter Logistic Model
    The Model
    Conditional Estimation of Ability
    Joint Estimation of Item and Ability Parameters
    Large Sample Properties of the Posterior Distribution
COMPARISON STUDIES
DISCUSSION
References

Bayesian Estimation in the One-Parameter Latent Trait Model¹,²

INTRODUCTION

In recent years there has been considerable interest among measurement

theorists and practitioners in latent trait theory since it offers the

potential for improving educational and psychological measurement practices.

However, before latent trait theory can be successfully applied to solve

existing measurement problems, the problem of estimating parameters in

latent trait models has to be addressed.

The literature in latent trait theory abounds with procedures for the

estimation of parameters. The estimation procedures that have been devel-

oped over the past thirty years range from heuristic procedures such as

those given by Urry (1974) and Jensema (1976) to conditional as well as

unconditional maximum likelihood procedures (Andersen, 1970, 1972, 1973a,

1973b; Bock, 1972; Lord, 1968, 1974; Samejima, 1969, 1972; Wright &

Panchapakesan, 1969; Wright & Douglas, 1977). With the exception of the

"conditional" maximum likelihood procedure provided by Andersen (1970)

for the one-parameter model, the maximum likelihood estimators of the

parameters in the latent trait models are less than optimal as a result

¹The research reported here was performed pursuant to Grant No. N00014-79-C-0039 with the Office of Naval Research and to Grant No. FQ 7624-79-0014 with the Air Force Human Resources Laboratory. The opinions expressed here, however, do not reflect the positions or policies of these agencies.

²The authors are grateful for the encouragement and support provided by Dr. Malcolm Ree of the Air Force Human Resources Laboratory and by Dr. Charles Davis of the Office of Naval Research.


of the problem of estimating "structural parameters" in the presence of

"incidental parameters" (Andersen, 1970; Zellner, 1971, pp. 114-154).

The "structural parameters" in latent trait models are the item parameters

while the "incidental parameters" are the ability parameters since these

increase without bound as the number of examinees is increased to pro-

vide stable estimates of the parameters. Furthermore, as Novick, Lewis,

and Jackson (1973) have remarked, "in the estimation of many parameters

some, by chance, can be expected to be substantially overestimated and the

others substantially underestimated."

When several parameters have to be estimated simultaneously, and

when, as in the present case, both structural and incidental parameters

have to be estimated, a Bayesian solution to the estimation problem may

be appropriate (Zellner, 1971, pp. 114-119). This is particularly true

if prior information or belief about the parameters is available, since

in this case, the incorporation of this information will certainly increase

the "accuracy" or the meaningfulness of the estimates. An example of

this was encountered by Lord (1968), where in order to prevent estimates of

the item discrimination parameter from drifting out of bounds, it was

necessary to impose limits on the range of values the parameter could

take. Although the estimation procedure employed by Lord (1968) was not

Bayesian, this illustrates the role of prior information in obtaining

meaningful estimates. A further argument that can be advanced in

favor of a Bayesian approach is that the logic of the Bayesian infer-

ential procedure is more appealing than the classical, sampling theoretic,

inferential procedure. As Zellner (1971, p. 362) has pointed out,

"...there is no need to justify inference procedures in terms of their


behavior in repeated, as yet unobserved,¹ samples as is usually done in the

sampling theory approach." Consequently, it is possible to make proba-

bilistic statements about the parameters themselves, based on the infor-

mation that is available.

Bayesian Procedures

It may be instructive to review briefly the Bayesian estimation

procedure. Let p(y, θ) denote the joint probability density function (pdf)

for a random observation vector y and a parameter vector θ, also random.

Then,

p(y, θ) = p(y | θ) p(θ) = p(θ | y) p(y),

so that

p(θ | y) = p(θ) p(y | θ)/p(y),

or

[1]  p(θ | y) ∝ p(θ) p(y | θ),

since p(y) ≠ 0 is a constant. Equation [1] is the essence of Bayes'

Theorem and is of primary importance in the estimation of parameters

and for drawing inferences concerning the parameters. The probability

density function p(θ | y) is the posterior pdf for the parameter vector θ,

given the sample information or data, and p(θ) is the prior pdf for the

vector θ. The quantity p(y | θ) is a proper pdf as long as y is a random

variable. However, the moment the vector y is realized, p(y | θ) ceases

¹The italics have been provided by the authors.

-4-

to have the interpretation as a pdf. In this case, p(y | θ) is strictly a

mathematical function of θ, well known as the likelihood function.

Since the notation p(y | θ) can be mistaken for a pdf, the likelihood

function is often written as L(y | θ), and sometimes, to emphasize the

fact that it is a function of θ, as L(θ | y). Thus, the expression given

in [1] can be written as

[2]  p(θ | y) ∝ L(θ | y) p(θ).

It is interesting to note that if p(θ) is assumed to be a constant,

i.e., the prior belief about θ is summarized via a uniform distribution,

the posterior pdf of θ is proportional to the likelihood function. In a

sense, this interpretation constitutes a Bayesian justification of the

maximum likelihood principle.
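As a numerical illustration of [1] and [2] (not part of the original report), the sketch below discretizes Bayes' Theorem for a Bernoulli likelihood with 7 successes in 10 trials; with a uniform prior the posterior mode coincides with the maximum likelihood estimate, 0.7.

```python
import math

def posterior_on_grid(log_lik, prior, grid):
    """Discretized Bayes' Theorem: p(theta|y) proportional to p(theta)*L(theta|y)."""
    unnorm = [prior(t) * math.exp(log_lik(t)) for t in grid]
    z = sum(unnorm)  # plays the role of p(y); a constant in theta
    return [u / z for u in unnorm]

# Likelihood of 7 successes in 10 Bernoulli trials.
def log_lik(t):
    return 7 * math.log(t) + 3 * math.log(1 - t)

grid = [i / 1000 for i in range(1, 1000)]
post = posterior_on_grid(log_lik, lambda t: 1.0, grid)  # uniform prior

# With a uniform prior the posterior mode equals the MLE, 7/10.
mode = grid[max(range(len(post)), key=post.__getitem__)]
```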

Once the prior belief about the parameter θ is specified, the joint posterior pdf of the vector θ given the data can be readily obtained. The posterior pdf of θ contains all the information necessary for drawing inferences concerning θ (jointly or individually) and for obtaining estimates of θ once a "loss function" is prescribed. For instance, if a squared-error loss function is deemed appropriate, then the mean of the posterior pdf of θ can be taken as the estimator of θ. On the other hand, if a zero-one loss function is appropriate, then the mode of the posterior pdf of θ is the estimator of θ. Similarly, for the absolute-deviation loss function, the median of the posterior pdf of θ is the appropriate estimator.
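The correspondence between loss functions and point estimates can be checked numerically. The following sketch is illustrative only (the Beta-like posterior is an arbitrary choice, not from the report): minimizing each expected loss over a grid recovers the posterior mean, mode, and median respectively.

```python
# Discretized posterior proportional to t^4 (1-t)^2 on a grid.
grid = [i / 200 for i in range(1, 200)]
w = [t ** 4 * (1 - t) ** 2 for t in grid]
z = sum(w)
post = [u / z for u in w]

def expected_loss(est, loss):
    return sum(p * loss(est, t) for p, t in zip(post, grid))

def argmin_over_grid(loss):
    return min(grid, key=lambda e: expected_loss(e, loss))

sq_est  = argmin_over_grid(lambda e, t: (e - t) ** 2)        # posterior mean
abs_est = argmin_over_grid(lambda e, t: abs(e - t))          # posterior median
mode    = grid[max(range(len(post)), key=post.__getitem__)]  # zero-one loss

mean = sum(p * t for p, t in zip(post, grid))
```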

The Bayesian procedure described above has been successfully applied

in a variety of situations. For a sampling of these applications the

reader is referred to Novick and Jackson (1974), and Zellner (1971).


However, Bayesian methods have found only a limited application in the

area of latent trait theory. Birnbaum (1969) obtained Bayes estimates of

the ability parameter in the one- and two-parameter logistic models under

the assumption that the item parameters were known. He chose, for mathematical tractability, the prior pdf of θi, the ability of the ith examinee, to be the logistic density function, i.e.,

p(θi) = exp(−Dθi)/[1 + exp(−Dθi)]²,

where D = 1.7 is a scaling factor. Owen (1975), in applying the latent trait model in an adaptive testing context, obtained Bayes estimates of ability, θi, under the assumption that the prior pdf of θi was normal with mean, zero, and variance, unity.

The Bayesian procedures suggested by Birnbaum (1969) and Owen (1975)

require rather exact specification of prior belief.¹ An alternative and

a more powerful procedure has been suggested by Lindley (1971). He has

shown that if the information that is available can be considered exchangeable,

then a hierarchical Bayesian model can be effectively employed for

the estimation of parameters.

In order to illustrate the hierarchical model, let us consider the problem of estimating, say, the ability θi of an individual (i = 1, ..., N). If it can be assumed, a priori, that exchangeability holds, i.e., the information about θi is no different from the information about any other θj, observed or yet to be observed, then the θi can be assumed to be a random sample from some distribution, p(θ). For convenience, if p(θ) is taken

¹Meredith and Kearns (1973) and Sanathanan and Blumenthal (1978) have obtained empirical Bayes estimators of the ability parameters for the one-parameter model. In these procedures the prior pdf is estimated from the data.


to be normal with mean μ and variance φ, then this would constitute the specification of the first stage of the hierarchical model. Since μ and φ are unknown, specifying prior beliefs on these "hyperparameters" would constitute the second stage of the hierarchical model. Usually, the hyperparameter distributions are specified in such a way that they depend upon constants which can be determined from the prior belief the investigator has about the parameters, and hence the hierarchical model terminates at the second stage. With this two-stage model, it is possible to estimate θi (i = 1, ..., N) without any reference to the nuisance parameters, μ and φ.

Novick (1971) has described this hierarchical model as an analog of

the empirical Bayes procedure advocated by Robbins (1955) and the simul-

taneous estimation procedure provided by Stein (1962). Furthermore, as

Novick, Lewis, and Jackson (1973) have pointed out, this procedure not

only employs the direct information gained through the observation of

an individual, but also the collateral information contained in observations

from other individuals. They further note that, "In effect, this collateral

information is used to provide 'prior' information for the estimation....

Thus to some extent, the problem of selecting prior distributions for

Bayesian analyses is neutralized, and this is effected from a strictly

Bayesian approach."

The hierarchical Bayesian model has been successfully employed by

Lindley and Smith (1972), Novick et al. (1973), and Zellner (1971), to

name a few. However, this approach has not been employed for estimating

parameters in latent trait models. The purpose of this paper, hence, is

to provide a Bayesian estimation procedure, in the sense of Lindley, for

estimating parameters in the one-parameter latent trait model.


Bayesian Estimation in the One-Parameter Logistic Model

The Model

Let Xij denote a random variable that represents the binary response of examinee i (i = 1, ..., N) on item j (j = 1, ..., n). If the examinee responds correctly to the item, Xij = 1, while for an incorrect response, Xij = 0. We assume that the complete latent space is unidimensional, and that the probability, P[Xij = 1], that an individual with ability parameter θi will correctly respond to an item with difficulty parameter, bj, is given by the logistic model,

[3]  P[Xij = 1 | θi] = exp(θi − bj)/{1 + exp(θi − bj)}.

On the other hand, the probability that the individual will respond incorrectly is given by

[4]  P[Xij = 0 | θi] = 1 − P[Xij = 1 | θi] = 1/{1 + exp(θi − bj)}.

The probabilities given in Equations [3] and [4] can be combined to yield

[5]  P[Xij = xij | θi] = exp{xij(θi − bj)}/{1 + exp(θi − bj)},

where xij = 1 for a correct response and xij = 0 for an incorrect response.

The above model, since it depends only on one item parameter,

difficulty, is commonly known as the Rasch model or the one-parameter

logistic model. For a detailed description of this model and its prop-

erties, the reader is referred to Wright (1977).
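Equations [3]–[5] translate directly into code. A minimal illustrative sketch (the function names are ours, not the report's):

```python
import math

def rasch_prob(theta, b):
    """Equation [3]: P[X=1 | theta, b] under the one-parameter logistic model."""
    return math.exp(theta - b) / (1.0 + math.exp(theta - b))

def rasch_pmf(x, theta, b):
    """Equation [5]: combined form, x = 1 (correct) or x = 0 (incorrect)."""
    return math.exp(x * (theta - b)) / (1.0 + math.exp(theta - b))
```

At θ = bj the probability of a correct response is exactly one-half, which is what makes bj interpretable as a difficulty on the ability scale.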


Conditional Estimation of Ability

In some situations it may be of interest to estimate the ability θi

of an examinee who takes a test which has been calibrated, i.e., the

difficulty parameters are known. Moreover, since the problem of esti-

mating ability when the item parameters are known is simpler to deal

with and provides an illustration of the basic ideas involved, this case

will be dealt with in detail first.

The model given by Equation [5] should, in the strict sense, be expressed as

[6]  P[Xij = xij | θi, bj] = exp{xij(θi − bj)}/{1 + exp(θi − bj)}.

Although there are several ways to write the model, the expression given by [6] is the most convenient for the present situation.

It follows, from the principle of local independence, that the

joint probability of responses of the N examinees on n items is given by

[7]  P[X11 = x11, X12 = x12, ..., Xij = xij, ..., XNn = xNn | θ1, θ2, ..., θN; b1, b2, ..., bn]
        = ∏i ∏j exp{xij(θi − bj)}/{1 + exp(θi − bj)}.

Once the responses of the N examinees on the n items are observed, the above expression ceases to have the probability interpretation and becomes the likelihood function, L(X = x | θ, b). Upon simplification,

[8]  L(x | θ, b) = exp{Σi Σj xij(θi − bj)}/∏i ∏j {1 + exp(θi − bj)}
               = exp{Σi riθi − Σj qjbj}/∏i ∏j {1 + exp(θi − bj)},

where ri = Σj xij and qj = Σi xij. Since the item parameters are

known constants, the likelihood function is strictly a function of θ and, hence, can be expressed as

[9]  L(θ | x, b) ∝ exp{Σi riθi}/∏i ∏j {1 + exp(θi − bj)}.

Returning to Equation [1], we see that in order to obtain the posterior density function of θ given the observations and the item parameters, it is necessary to specify the prior distribution of θ. To this end, in the first stage of the hierarchical model, we assume that, a priori, the ability parameters θi are independently and identically normally distributed, i.e.,

[10]  θi | μ, φ ~ N(μ, φ).   (i = 1, ..., N)

The assumption that the thetas are independently and identically distributed follows from the assumption of exchangeable prior information about the thetas. The assumption of normality also appears to be reasonable and has been made by numerous authors, e.g., Lord and Novick (1968).

In order to complete the hierarchical Bayesian model, we have to specify prior distributions for μ and φ. This is the second stage. At this level, we assume that, a priori, μ and φ are independently distributed, and that μ has the uniform distribution. Thus,

[11]  p(μ, φ) ∝ p(φ).

The uniform distribution is not a proper distribution, although this choice can be justified to some extent (Zellner, 1971, pp. 41-43). It may, however, be more appropriate to specify a "non-diffuse" prior, and this possibility will be explored further in a later paper.


It now remains to specify the form of p(φ). Since φ is the variance of θi, φ can be assumed to have the inverse chi-square, χ⁻², distribution, i.e.,

[12]  p(φ | ν, s²) ∝ φ^−(ν/2+1) exp(−νs²/2φ).

The quantities ν and s² are parameters of the inverse chi-square distribution, and have to be specified a priori. The inverse chi-square distribution can be expressed in different ways. Novick and Jackson (1974) prefer the form

p(φ | ν, λ) ∝ φ^−(ν/2+1) exp(−λ/2φ).

For this form, the mean of the distribution is λ/(ν − 2) and the mode is λ/(ν + 2). For the form given by Equation [12] the mean is s²ν/(ν − 2) and the mode is s²ν/(ν + 2), with both mean and mode approaching s² as ν increases. These two forms are clearly equivalent, but the form given by Equation [12] is employed in the sequel because it provides a direct interpretation of the parameters ν and s². The quantity s² thus represents the investigator's belief about the "typical" value of the parameter φ, while ν represents his/her degree of confidence in that belief.
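The mean s²ν/(ν − 2) of the prior in [12] can be checked by Monte Carlo, using the fact that if X ~ χ²(ν) then φ = νs²/X has the scaled inverse chi-square density of [12]. An illustrative sketch (ν = 10 and s² = 1 are arbitrary choices, not values from the report):

```python
import random

random.seed(1)
nu, s2 = 10, 1.0
total = 0.0
m = 200_000
for _ in range(m):
    x = random.gammavariate(nu / 2.0, 2.0)  # a chi-square(nu) variate
    total += nu * s2 / x                    # a draw of phi from [12]

mc_mean = total / m
# Theoretical mean: s2 * nu / (nu - 2) = 1.25 for nu = 10, s2 = 1.
```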

The joint posterior distribution of θ′ = [θ1, θ2, ..., θN], given b and the item responses, is obtained from

[13]  p(θ, μ, φ | x, b) ∝ L(θ | x, b) p(θ | μ, φ) p(μ, φ).

The likelihood function L(θ | x, b) is given by Equation [9], p(μ, φ) ∝ p(φ) by Equations [11] and [12], and


[14]  p(θ | μ, φ) = ∏i (2πφ)^−1/2 exp{−(θi − μ)²/2φ} ∝ φ^−N/2 exp{−Σi (θi − μ)²/2φ}.

Combining these expressions, we have

[15]  p(θ, μ, φ | x, b, ν, s²) ∝ [exp{Σi riθi}/∏i ∏j {1 + exp(θi − bj)}] [φ^−N/2 exp{−Σi (θi − μ)²/2φ}] [φ^−(ν/2+1) exp(−νs²/2φ)].

The above expression depends upon the "nuisance" parameters μ and φ and hence these have to be integrated out. Since Σi (θi − μ)² = Σi (θi − θ̄)² + N(θ̄ − μ)², where θ̄ = Σi θi/N, and

∫ exp{−N(θ̄ − μ)²/2φ} dμ ∝ φ^1/2,

integration with respect to μ yields

[16]  p(θ, φ | x, b, ν, s²) ∝ L(θ | x, b) φ^−(N+ν+1)/2 exp[−{νs² + Σi (θi − θ̄)²}/2φ].

Noting that

∫ φ^−(a+1) exp(−k/φ) dφ ∝ k^−a,

and integrating with respect to φ, we obtain

[17]  p(θ | x, b, ν, s²) ∝ L(θ | x, b) {νs² + Σi (θi − θ̄)²}^−(N+ν−1)/2

[18]       = [exp{Σi riθi}/∏i ∏j {1 + exp(θi − bj)}] {νs² + Σi (θi − θ̄)²}^−(N+ν−1)/2.


The joint posterior modes are obtained by differentiating log p(θ | x, b, ν, s²) with respect to θ, setting these derivatives equal to zero, and solving the resulting equations:

[19]  Σj Pij = ri − (θi − θ̄)/σ²,   (i = 1, ..., N)

where

Pij = exp(θi − bj)/{1 + exp(θi − bj)}

and

σ² = {νs² + Σi (θi − θ̄)²}/(ν + N − 1).

Since this system of equations is non-linear, numerical procedures have to be employed. The Newton-Raphson iterative procedure is ideally suited for this situation. Let

[20]  f(θi) = Σj Pij + (θi − θ̄)/σ² − ri.

Then

[21]  f′(θi) = Σj Pij(1 − Pij) + {σ²(1 − 1/N) − 2(θi − θ̄)²/(ν + N − 1)}/(σ²)².

If θi(k) is the value of θi at the kth iteration, then θi(k+1) is given by

[22]  θi(k+1) = θi(k) − f(θi(k))/f′(θi(k)),

with the starting value θi(0) being given by (Wright & Douglas, 1977)

[23]  θi(0) = b̄ + {1 + sb²/2.89}^1/2 log{ri/(n − ri)},

where

b̄ = Σj bj/n  and  sb² = Σj (bj − b̄)²/(n − 1).


Although the iterative scheme given in [22] is for estimating the ability θi of each individual, in reality only the ability corresponding to each raw score r (r = 1, ..., n−1) need be estimated. The abilities corresponding to the raw scores r = 0 and r = n cannot be estimated by virtue of [23]. Hence, individuals who obtain a perfect score or a zero score are eliminated from the analysis. It should also be pointed out that the Newton-Raphson scheme given above is not the vector version of the procedure, since for that procedure the matrix of derivatives {∂f(θi)/∂θl} has to be computed and inverted. The procedure described here worked sufficiently well, converging in as few as three to four iterations.
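The scheme of Equations [19]–[23] can be sketched as follows. This is an illustrative reimplementation, not the authors' program: the prior constants ν = 8 and s² = 1 are arbitrary choices, examinees with zero or perfect scores are assumed to have been removed beforehand, and for clarity the sketch iterates over examinees directly rather than grouping them by raw score.

```python
import math

def rasch_p(theta, b):
    return math.exp(theta - b) / (1.0 + math.exp(theta - b))

def conditional_bayes_ability(r, b, nu=8, s2=1.0, iters=60):
    """Posterior modes of the abilities ([19]-[22]) with known difficulties b.

    r[i] is examinee i's raw score, 0 < r[i] < n.  nu and s2 are
    illustrative prior constants, not values taken from the report."""
    n = len(b)
    b_bar = sum(b) / n
    s2_b = sum((bj - b_bar) ** 2 for bj in b) / (n - 1)
    # Starting values, Equation [23] (Wright & Douglas, 1977).
    theta = [b_bar + math.sqrt(1 + s2_b / 2.89) * math.log(ri / (n - ri))
             for ri in r]
    N = len(r)
    for _ in range(iters):
        t_bar = sum(theta) / N
        sigma2 = (nu * s2 + sum((t - t_bar) ** 2 for t in theta)) / (nu + N - 1)
        for i in range(N):
            P = [rasch_p(theta[i], bj) for bj in b]
            f = sum(P) + (theta[i] - t_bar) / sigma2 - r[i]          # Eq. [20]
            fp = (sum(p * (1 - p) for p in P)                        # Eq. [21]
                  + (sigma2 * (1 - 1 / N)
                     - 2 * (theta[i] - t_bar) ** 2 / (nu + N - 1)) / sigma2 ** 2)
            theta[i] -= f / fp                                       # Eq. [22]
    return theta
```

Examinees with equal raw scores receive equal estimates, as the report notes, since under this model the likelihood depends on the responses only through the raw score.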

Joint Estimation of Item and Ability Parameters

The case considered above, where the item parameters were assumed

to be known, provides the necessary background for the Bayesian estima-

tion procedure. However, this situation may not be realistic and,

hence, it is necessary to develop a procedure for the joint estimation

of the item and ability parameters.

We proceed in the manner indicated for the case of known item parameters. Hence, in addition to making the assumptions about the ability parameters, we have to make assumptions regarding the item parameters. Again, as in the previous case, we specify prior beliefs about the parameters in two stages. In the first stage, for the model given in [5], we assume:

[24a]  θi | μθ, φθ ~ N(μθ, φθ),   (i = 1, ..., N)

[24b]  bj | μb, φb ~ N(μb, φb).   (j = 1, ..., n)


In addition, we assume that, a priori, θi and bj are independent, θk and θl (k ≠ l) are independent, and bk and bl (k ≠ l) are independent.

As for the ability parameters, the specification of prior belief about bj seems reasonable, especially if an item bank is available. This assumption has been made by several authors (Lord & Novick, 1968; Wright & Douglas, 1977). Furthermore, as a result of the hierarchical Bayesian model, departures from this assumption appear to have a negligible effect on the estimates of bj.

For the second stage, we assume that

[25a]  p(μθ, φθ) ∝ p(φθ) ∝ φθ^−(νθ/2+1) exp(−sθ²νθ/2φθ)

and

[25b]  p(μb, φb) ∝ p(φb) ∝ φb^−(νb/2+1) exp(−sb²νb/2φb).

We have thus assumed that, a priori, the hyperparameters are independent, and that the prior information about the location parameters, μθ and μb, is "vague".

The joint posterior pdf of θ and b is given by

[26]  p(θ, b, μθ, φθ, μb, φb | x, νθ, sθ², νb, sb²) ∝ L(θ, b | x) {∏i p(θi)} {∏j p(bj)} p(φθ) p(φb),

where L(θ, b | x) is the likelihood function given by [8]. Now

[27]  {∏i p(θi)} p(φθ) ∝ φθ^−(N+νθ+2)/2 exp(−νθsθ²/2φθ) exp{−Σi (θi − μθ)²/2φθ}.

Upon integrating with respect to φθ and μθ, we have, from [17],

[28]  ∫∫ {∏i p(θi)} p(φθ) dμθ dφθ ∝ [νθsθ² + Σi (θi − θ̄)²]^−(N+νθ−1)/2.

Similarly,

[29]  ∫∫ {∏j p(bj)} p(φb) dμb dφb ∝ [νbsb² + Σj (bj − b̄)²]^−(n+νb−1)/2.

Combining [26], [28] and [29], we obtain the joint posterior density of θ and b:

[30]  p(θ, b | x, νθ, sθ², νb, sb²) ∝ [{exp(Σi riθi)}{νθsθ² + Σi (θi − θ̄)²}^−(N+νθ−1)/2] [{exp(−Σj qjbj)}{νbsb² + Σj (bj − b̄)²}^−(n+νb−1)/2] / [∏i ∏j {1 + exp(θi − bj)}].

The quantity given as L(θ, b | x),

exp(Σi riθi − Σj qjbj)/∏i ∏j {1 + exp(θi − bj)} = ∏i ∏j exp{xij(θi − bj)}/{1 + exp(θi − bj)},

is a product of the probabilities given in [5] and, hence, is bounded. In fact, L(θ, b | x) ≤ 1. Therefore, it follows that

∫ p(θ, b | x, ...) dθ db ≤ ∫ [νθsθ² + Σi (θi − θ̄)²]^−(N+νθ−1)/2 dθ · ∫ [νbsb² + Σj (bj − b̄)²]^−(n+νb−1)/2 db.

The integrals on the right of the inequality clearly exist since the kernels are those of multivariate t densities. Hence, the posterior pdf, p(θ, b | x, νθ, sθ², νb, sb²), is a proper pdf although the normalizing constant cannot be evaluated explicitly.

The joint posterior modes may be taken as estimates of θi and bj (i = 1, ..., N; j = 1, ..., n). These are obtained by setting equal to zero the derivatives of log p(θ, b | x, ...), and solving the resulting equations:

[31]  Σj Pij = ri − (θi − θ̄)/σθ²,   (i = 1, ..., N)

[32]  Σi Pij = qj + (bj − b̄)/σb²,   (j = 1, ..., n)

where

Pij = exp(θi − bj)/{1 + exp(θi − bj)},

ri = Σj xij,   qj = Σi xij,

σθ² = {νθsθ² + Σi (θi − θ̄)²}/(νθ + N − 1),

and

σb² = {νbsb² + Σj (bj − b̄)²}/(νb + n − 1).

Since the system of equations is non-linear, the Newton-Raphson procedure is employed to solve the equations iteratively. In order to accomplish this, we let

[33]  f(θi) = Σj Pij + (θi − θ̄)/σθ² − ri

and

[34]  h(bj) = Σi Pij − (bj − b̄)/σb² − qj.

Then

[35]  f′(θi) = Σj Pij(1 − Pij) + {σθ²(1 − 1/N) − 2(θi − θ̄)²/(νθ + N − 1)}/(σθ²)²

and

[36]  h′(bj) = −Σi Pij(1 − Pij) − {σb²(1 − 1/n) − 2(bj − b̄)²/(νb + n − 1)}/(σb²)².

As before, if θi(k) and bj(k) denote the values of θi and bj at the kth iteration, then

[37]  θi(k+1) = θi(k) − f(θi(k))/f′(θi(k))

and

bj(k+1) = bj(k) − h(bj(k))/h′(bj(k)).

Starting with initial values θi(0) (i = 1, ..., N) and bj(0) (j = 1, ..., n), where θi(0) is given by [23] and

bj(0) = log[(N − qj)/qj],

θ is estimated. These values of θ are then used to obtain revised estimates of b. This process is repeated, with the revised estimates of b being used to obtain revised estimates of θ. The process is terminated when the convergence criterion is reached. This procedure is not the full Newton-Raphson procedure and, in this case, is preferred to the full Newton-Raphson procedure since the latter requires obtaining an inverse of the matrix of second derivatives at each stage of the iteration. In practice, the procedure outlined here converges rather rapidly.
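The alternating scheme of Equations [31]–[37] can be sketched in the same way. Again this is illustrative, not the authors' program: the prior constants are arbitrary choices, the starting values for θ use a simplified version of [23] (with b̄ and sb taken as zero), and the response matrix is assumed to contain no zero or perfect row or column margins so that the starting values exist.

```python
import math

def rasch_p(theta, b):
    return math.exp(theta - b) / (1.0 + math.exp(theta - b))

def joint_bayes(x, nu_t=8, s2_t=1.0, nu_b=8, s2_b=1.0, sweeps=200):
    """Alternating Newton-Raphson for the joint posterior modes, Eqs. [31]-[37].

    x is an N-by-n 0/1 response matrix with no zero or perfect margins."""
    N, n = len(x), len(x[0])
    r = [sum(row) for row in x]                       # examinee raw scores
    q = [sum(row[j] for row in x) for j in range(n)]  # item scores
    theta = [math.log(ri / (n - ri)) for ri in r]     # simplified [23]
    b = [math.log((N - qj) / qj) for qj in q]         # b_j(0) = log[(N-q_j)/q_j]
    for _ in range(sweeps):
        t_bar = sum(theta) / N
        st2 = (nu_t * s2_t + sum((t - t_bar) ** 2 for t in theta)) / (nu_t + N - 1)
        for i in range(N):
            P = [rasch_p(theta[i], bj) for bj in b]
            f = sum(P) + (theta[i] - t_bar) / st2 - r[i]                  # [33]
            fp = (sum(p * (1 - p) for p in P)                             # [35]
                  + (st2 * (1 - 1 / N)
                     - 2 * (theta[i] - t_bar) ** 2 / (nu_t + N - 1)) / st2 ** 2)
            theta[i] -= f / fp                                            # [37]
        b_bar = sum(b) / n
        sb2 = (nu_b * s2_b + sum((bj - b_bar) ** 2 for bj in b)) / (nu_b + n - 1)
        for j in range(n):
            P = [rasch_p(theta[i], b[j]) for i in range(N)]
            h = sum(P) - (b[j] - b_bar) / sb2 - q[j]                      # [34]
            hp = -(sum(p * (1 - p) for p in P)                            # [36]
                   + (sb2 * (1 - 1 / n)
                      - 2 * (b[j] - b_bar) ** 2 / (nu_b + n - 1)) / sb2 ** 2)
            b[j] -= h / hp                                                # [37]
    return theta, b
```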

As pointed out earlier, although the equations provided are for estimating θi (i = 1, ..., N), only θr (r = 1, ..., n−1) need be estimated. In order to carry this out, the quantities given in Equations [34] and [36] have to be computed as follows:

Σi Pij = Σr Nr Pj(θr)

and

Σi Pij(1 − Pij) = Σr Nr Pj(θr){1 − Pj(θr)},

where Nr denotes the number of examinees who obtained raw score r and

Pj(θr) = exp(θr − bj)/{1 + exp(θr − bj)}.

Large Sample Properties of the Posterior Distribution

The posterior pdf, p(θ, b | x, νθ, sθ², νb, sb²), given by Equation [30], is the product of the likelihood function and a multivariate "double-t" distribution. The "double-t" distribution is a product of two multivariate t densities (Tiao & Zellner, 1964; Zellner, 1971, p. 101). As a result of its complex form, properties of the posterior pdf cannot be obtained in closed form. However, it is possible to obtain the asymptotic properties of the posterior pdf, and this will suffice, in most cases, for inferences to be drawn regarding the parameters.

Let t be a vector of parameters, and y a vector of observations. Then, the posterior pdf of t, p(t | y), is

[38]  p(t | y) ∝ p(t) L(y | t),

where p(t) is the prior distribution of t and L(y | t), the likelihood function. Then, for large samples,

[39]  p(t | y) ≈ L(y | t),

and, in turn, L(y | t) is approximately multivariate normal centered at t̂, the maximum likelihood estimate, with dispersion matrix

[40]  Σ = [−∂² log L(y | t)/∂ti ∂tj]⁻¹.

Thus, for large samples, the posterior pdf of t is approximately N(t̂, Σ). For a detailed discussion of this result we refer the reader to Jeffreys (1961, p. 193) and Zellner (1971, p. 32).

This result clearly applies in the present situation when both n and N, the number of items and the number of examinees, are large. Denoting the [(n+N)×1] vector of parameters as t′ = [θ′, b′], we have

[41]  t | x ~ N(t̂, Σ).

In order to evaluate Σ, we write

[42]  Σ⁻¹ = | Gθ   Gθb |
            | Gbθ  Gb  |

where

[43]  Gθ = {−∂² log L(x | θ, b)/∂θi ∂θl} = {Σj Pij(1 − Pij) δil},

where δil is the Kronecker delta,

[44]  Gb = {−∂² log L(x | θ, b)/∂bj ∂bm} = {Σi Pij(1 − Pij) δjm},

and

[45]  Gθb = {−∂² log L(x | θ, b)/∂θi ∂bj} = {−Pij(1 − Pij)}.

Thus, the marginal posterior distribution of θi has mean θ̂i, the maximum likelihood estimate of θi, and variance, σθi², given by the ith diagonal element of Σ, i.e.,

[46]  σθi² = [{Gθ − Gθb Gb⁻¹ Gbθ}⁻¹]ii.

Similarly, the marginal distribution of bj has mean b̂j, the maximum likelihood estimate of bj, and variance, σbj², given by the jth diagonal element of Σ, i.e.,

[47]  σbj² = [{Gb − Gbθ Gθ⁻¹ Gθb}⁻¹]jj.

This approximation to the posterior pdf of θ and b can be improved upon if we take into account the "double-t" distribution (see Equation [30]). For a sufficiently large sample, the multivariate t density approaches the normal density. Thus, in the expression

[νθsθ² + Σi (θi − θ̄)²]^−(νθ+N−1)/2,

if we write νθ = Nkθ, where 0 < kθ < 1, for large N we obtain

[48]  [Nkθsθ² + Σi (θi − θ̄)²]^−{N(kθ+1)/2 − 1/2} ≈ exp{−(kθ+1) Σi (θi − θ̄)²/2kθsθ²} = exp{−½ θ′ A11 θ},

where

A11 = {(kθ+1)/kθsθ²}[I_N − (1/N) 1_N 1_N′],

with I_N being the identity matrix and 1_N′ = [1, 1, ..., 1]. Similarly, writing νb = nkb with 0 < kb < 1, for large n,

[49]  [νbsb² + Σj (bj − b̄)²]^−(νb+n−1)/2 = [nkbsb² + Σj (bj − b̄)²]^−{n(kb+1)/2 − 1/2} ≈ exp{−(kb+1) Σj (bj − b̄)²/2kbsb²} = exp{−½ b′ A22 b},

where

A22 = {(kb+1)/kbsb²}[I_n − (1/n) 1_n 1_n′].

[50] [vs 8 2 2 (v b+n-l)I2 2 N - 2-(BNI/

b y (b -b. )2 vese + (e i )2

exp f --1(9A 1 t+ b' A2 2

[51] exp f-t A ti

where

0 ' b'

and

A = A l l 0

0 A2 2

Combining [411 and [51], we have

[52] P_-h~~v~ s , 22b

[53] 2x {(t-.t)' E(t-t) + t' A t)

[54] -exp{- -1 (t -)'T (t - T)

where

[55] T E -

and

[561 T - (E+A)- (Et)*

*This result follows from the fact that

(x-A)'A(x-a) + (xi-bTE(x-b) -(xi-t)'T(x-t) + constant,

where

T -A+B,

andt- (A+B)1' {Aa +Bb)

If the off-diagonal matrix Gθb in [42] can be ignored, then

[57]  σθi² ≈ [Σj Pij(1 − Pij) + (N − 1)(kθ + 1)/Nkθsθ²]⁻¹

and

[58]  σbj² ≈ [Σi Pij(1 − Pij) + (n − 1)(kb + 1)/nkbsb²]⁻¹.

The expression [57] is useful when the item parameters are considered known. Similarly, [58] is applicable when the ability parameters are known. In general, however, when the item and ability parameters are estimated simultaneously, the off-diagonal matrix, Gθb, cannot be ignored, and hence, in this case, the complete expression given by either [46] or [55] should be employed. With these results it is possible to construct "credibility intervals" (Novick & Jackson, 1974) for the parameters of interest.
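A sketch of such a credibility interval for a single ability under the diagonal approximation of [57] (illustrative only: the constants kθ and sθ² are arbitrary choices, and the prior contribution is assumed to add to the posterior precision, consistent with [55]):

```python
import math

def ability_credibility_interval(theta_hat, b, N, k_theta=0.2, s2_theta=1.0):
    """95% credibility interval for one ability, diagonal approximation [57].

    Posterior precision = sum_j P_ij(1 - P_ij) plus the prior term
    (N-1)(k+1)/(N k s^2); k_theta and s2_theta are illustrative constants."""
    P = [math.exp(theta_hat - bj) / (1.0 + math.exp(theta_hat - bj)) for bj in b]
    precision = (sum(p * (1 - p) for p in P)
                 + (N - 1) * (k_theta + 1) / (N * k_theta * s2_theta))
    sd = 1.0 / math.sqrt(precision)
    return theta_hat - 1.96 * sd, theta_hat + 1.96 * sd
```

Because the prior term adds to the precision, the resulting interval is narrower than the likelihood-only (maximum likelihood) interval, reflecting the collateral information from the other examinees.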

COMPARISON STUDIES

In order to study the efficacy of the Bayesian procedure described

above and to compare the Bayesian estimates with the maximum likelihood

estimates, a simulation study was carried out. Although simulation studies

may not be realistic in some situations, they can be justified in the

present context since only through a simulation study can one estimation

procedure be compared with another.

Artificial data, representing the responses of N individuals on n

items, were generated using DATGEN (Hambleton & Rovinelli, 1973) accord-

ing to the one-parameter logistic model. In generating the values of


θ_i and b_j (i = 1, …, N; j = 1, …, n), it was assumed that θ_i and b_j were independently and identically normally distributed with mean zero and variance unity (we shall return to a discussion of this issue later).
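A minimal stand-in for this generation step might look as follows. DATGEN itself is a FORTRAN program whose code is not reproduced here; this Python sketch simply draws θ_i and b_j from N(0, 1) and samples responses under the one-parameter logistic model:

```python
import numpy as np

# Illustrative stand-in for the DATGEN step (not DATGEN's actual code):
# abilities and difficulties drawn i.i.d. N(0, 1), responses sampled from
# the one-parameter logistic model P_ij = 1 / (1 + exp(-(theta_i - b_j))).
# (Whether a D = 1.7 scaling constant was applied is not stated here.)
rng = np.random.default_rng(42)
N, n = 20, 15
theta = rng.normal(0.0, 1.0, size=N)   # abilities
b = rng.normal(0.0, 1.0, size=n)       # item difficulties

P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
U = (rng.uniform(size=(N, n)) < P).astype(int)   # simulated 0/1 responses
print(U.shape, U.mean())
```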

The design of the comparison study was conceptualized in terms of the

following, completely crossed, factors: estimation procedure (Bayesian,

maximum likelihood); number of examinees, N (20, 50); number of items,

n (15, 25, 40, 50). This design was carried out for (i) conditional estimation of θ, and (ii) joint estimation of θ and b.

The size of the examinee population, N, and the test length, n,

were chosen to facilitate comparison of the maximum likelihood and the

Bayesian estimates for small sample sizes and short tests, since the large

sample behavior of the maximum likelihood estimates has been studied by

Swaminathan and Gifford (1979). These authors have found that maximum

likelihood estimates of ei and bj approach the true values for N as large

as 200 and n as large as 100. Since for these values of N and n, Bayesian

estimates can be expected to be the same as maximum likelihood estimates,

the study was focused on small values of N and n.

The Bayesian estimates and the maximum likelihood estimates were

compared with respect to accuracy. The two sets of estimates were com-

pared with respect to: (a) the mean value of the estimates, as compared

with the mean value of the true values; (b) the mean squared error

difference between the true values and the estimated values; and, (c) the

regression of the true value on the estimated value.
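The three criteria can be sketched as follows (illustrative code, not the report's analysis scripts; the input arrays here are arbitrary):

```python
import numpy as np

# Criteria (a)-(c): difference of means, mean squared error, and the
# regression of true values on estimates (intercept and slope).
def compare(true_vals, estimates):
    true_vals = np.asarray(true_vals, dtype=float)
    estimates = np.asarray(estimates, dtype=float)
    mean_diff = estimates.mean() - true_vals.mean()          # criterion (a)
    mse = np.mean((estimates - true_vals) ** 2)              # criterion (b)
    slope, intercept = np.polyfit(estimates, true_vals, 1)   # (c): T on E
    return mean_diff, mse, intercept, slope

# Arbitrary illustration: noisy "estimates" of simulated true abilities.
rng = np.random.default_rng(3)
theta_true = rng.normal(size=50)
theta_hat = theta_true + rng.normal(scale=0.3, size=50)
print(compare(theta_true, theta_hat))
```

Unbiased estimation corresponds to an intercept near zero and a slope near one, which is how the regression criterion is read below.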

It may be argued that since the joint modes of the posterior distri-

bution were taken as estimates of the parameters, the criterion employed

to determine the accuracy of the estimates is incompatible with the loss


function employed to arrive at the estimates. This is a valid argument.

However, we are primarily interested in comparing the Bayesian estimates

with the maximum likelihood estimates. Since, in one sense, the maximum

likelihood estimates can be thought of as the modes of the posterior

distribution derived under the assumption that the prior information is

vague, comparison of two modal estimates using a criterion other than that involved in deriving the estimates may be justifiable;

particularly since this will not provide an "unfair" advantage to one

set of estimates.

Comparison of the two estimation procedures in terms of the regres-

sion of true values on the estimates needs some explanation. If T is

the true value of the parameter and E the estimate, then E(T | E = e) = β₀ + β₁e. If β₀ = 0 and β₁ = 1, then it can be concluded that the estimates are unbiased, and hence the departure of β₀ and β₁ from these expected values can be

taken as an indicator of bias. It should be pointed out here that the

classical notion of bias is not critical in Bayesian analyses. Neverthe-

less, comparison of the regression lines will provide a further assessment

of the accuracy of the two procedures.

The comparison of the maximum likelihood (ML) procedure and the

Bayesian procedure for the conditional estimation of ability 8 is provided

in Table 1. The first column contains the means of the true values of θ, the ML estimates, and the Bayesian estimates. The second column provides an assessment of accuracy in terms of the mean squared deviation between the estimate θ̂ and the true value θ_t. The correlations between θ_t and θ̂ for each estimation procedure are displayed in column four, while the regression of θ_t on θ̂ is given in column five.

[Table 1 appears here in the original: means, mean squared deviations, correlations, and regressions of true θ on estimated θ for the ML and Bayesian procedures at each combination of N and n. The numeric entries are not recoverable from this scan.]

An examination of the correlations between the true values and esti-

mates reveals that, in general, the difference between ML and Bayes

estimates is negligible for relatively large values of N and n. However,

for small values of N and/or n, the Bayes estimates correlate better

with true values than the ML estimates.

The correlation coefficient, by itself, is not a sufficient

indicator of the accuracy of estimation. Clear differences between the

Bayesian and ML procedures emerge when we examine the other criteria.

In general, the means of the Bayesian estimates, in comparison with

the ML estimates, are closer to the means of the true values. This result

can be anticipated if we examine the estimating equations [19]. The

estimating equations for ML estimates are:

\sum_{j=1}^{n} P_{ij} = r_i \qquad (i = 1, \ldots, N).

The additional term in the Bayesian estimating equations, (\theta_i - \bar\theta)/\sigma_\theta^2,

contributes to the regression of the estimates towards the mean, and

hence, the Bayesian estimates are closer to the means of the true values.

The only exception occurs with N = 20 and n = 15, 25. At this point, there is no explanation for this anomalous result. Further replications are

clearly necessary to establish this point conclusively.
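The pull toward the mean can be illustrated by solving both estimating equations for a single examinee. The sketch below uses hypothetical item difficulties and prior precision, and a simple Newton iteration that is not necessarily the report's algorithm; it shows the Bayesian solution shrinking toward the prior mean of zero:

```python
import numpy as np

# Contrast the ML estimating equation sum_j P_ij = r_i with the Bayesian
# version, which adds a term (theta_i - theta_bar)/sigma^2 pulling the
# solution toward the mean. Solved by Newton steps; values are illustrative.
def solve_theta(r_i, b, prior_prec=0.0, theta_bar=0.0, iters=25):
    theta = 0.0
    for _ in range(iters):
        P = 1.0 / (1.0 + np.exp(-(theta - b)))
        g = r_i - P.sum() - prior_prec * (theta - theta_bar)   # score
        h = -(P * (1 - P)).sum() - prior_prec                  # derivative
        theta -= g / h
    return theta

b = np.linspace(-1.5, 1.5, 15)                 # hypothetical difficulties
ml = solve_theta(r_i=12, b=b)                  # ML: no prior term
bayes = solve_theta(r_i=12, b=b, prior_prec=1.0)
assert abs(bayes) < abs(ml)                    # shrinkage toward the mean
print(ml, bayes)
```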

The most dramatic difference between the Bayesian estimates and the

ML estimates is with respect to the mean squared deviations of the esti-

mates from the true values. In general, the mean squared deviations are

much smaller for the Bayesian estimates than for the ML estimates. The

difference is particularly noticeable with small N and n. In these cases, the mean squared deviation for the ML estimates is almost four times as


large as that for the Bayesian estimates. This finding can again be

explained by the fact that prior information is most helpful in these cases.

This, together with the regression effect described previously, results

in an increase in the accuracy of the estimation procedure.

An examination of the regressions of true values on estimated

values also provides some interesting results. In general, the intercepts

and the slopes of the Bayesian regressions are closer to zero and one

respectively, than the ML regressions. The trend for the intercepts is

reversed for large n. In these cases, the intercepts for the ML regressions

are closer to zero than the intercepts for the Bayesian regressions. This

latter result is interpretable, since the maximum likelihood estimates of

6, for large N and n, approach the true values. However, the trend for

small n and N is rather surprising since, as a result of regression towards

the mean, the Bayesian estimates can be expected to be "biased." The only

explanation for this finding is that the ML procedure is severely biased

for small n and N, even more so than the Bayesian procedure.

The above findings, for conditional estimation of 8, appear to be

valid for the joint estimation of 6 and b (Tables 2 and 3). In fact, the

results for the joint estimation of e and b favor the Bayesian estimates

on all counts for both 0 and b: the means of the estimates are closer

to the means of the true values; the mean squared deviations are much

smaller (in some cases, one-tenth the size of those for ML estimates);

the slopes and intercepts are closer to one and zero respectively (the

only exception occurs for large N and n, in which case, the intercepts

of the ML regression are closer to zero).


[Tables 2 and 3 appear here in the original: comparisons of the ML and Bayesian procedures for the joint estimation of θ and b (means, mean squared deviations, correlations, and regressions of true on estimated values). The numeric entries are not recoverable from this scan.]

DISCUSSION

The Bayesian procedure for estimating parameters in the one-parameter

latent trait model is an attractive alternative to the maximum likelihood

procedure. Bayesian procedures are conceptually more appealing since

direct interpretations of probability statements involving the parameters

are possible. Empirically, as the results of the comparison study indi-

cate, the Bayesian estimates of the parameters are superior to the maximum

likelihood estimates in terms of their accuracy.

Although the empirical results demonstrate the effectiveness of the

Bayesian procedure, it may be argued, and correctly, that the simulation of

the data favored the Bayesian procedure. The data were generated to meet

the strong distributional assumptions required by the Bayesian procedure.

In addition, in specifying prior beliefs about the distributions of θ and b, s_θ² and s_b² were set equal to one, with the corresponding ν_θ and ν_b specified as 15. In the simulation the variances of θ and b were set at one, and hence the specification of s_θ² and s_b², together with the relatively large values for ν_θ and ν_b, reproduced the true state of affairs. It is not surprising, therefore,

that the Bayesian procedure proved to be superior to the maximum likeli-

hood procedure.

In fairness to the study, it should be pointed out that the simula-

tion and the accurate specification of prior belief were deliberate in

order to determine the applicability of the Bayesian procedure, at least, under

ideal conditions. Preliminary investigations with non-normal data and

also with poor specification of priors indicate that the Bayesian proce-

dure, being based on a hierarchical model, is relatively robust and is

superior to the maximum likelihood procedure. A detailed study of the

effects of poor specification of priors and departures from underlying


assumptions is currently under way and we expect to report these results

in the near future.

Despite the encouraging results obtained, a theoretical problem

still remains with the estimation procedure. The procedure described in

this paper requires the joint estimation of n structural parameters and

N incidental parameters. If N → ∞ while n remains fixed, the joint posterior

pdf may not become concentrated about the estimated values. This trend

is evident from Tables 2 and 3; with increasing N, the intercept and

slope do not tend to zero and one respectively. This problem is similar

to the one that exists with maximum likelihood estimates. Although

from a Bayesian point of view asymptotic properties, such as consistency,

are not critical, the lack of them, at least to some degree, is discom-

forting. It appears that this situation can be remedied if, when estimating the n structural, or item, parameters, the ability parameters are considered nuisance parameters and integrated out to yield the

marginal posterior pdf of b. The marginal posterior pdf is currently

not available as a result of the exceedingly complex form of the joint

posterior pdf. Approximations, such as the one indicated (Equation [15]),

may be employed to simplify the joint posterior pdf. Initial investiga-

tions reveal that this approximation is reasonably good, but further

research in this area is clearly needed.

In summary, we note that the Bayesian procedure developed in this

paper is relatively simple to implement, and computationally as efficient

as the maximum likelihood procedure. Despite the issues raised above,

the Bayesian procedure has the potential for greatly improving the accuracy

of the estimates. Moreover, the maximum improvement in accuracy occurs


for small values of N and n, a result that can be expected, and this

makes the Bayesian procedure more attractive than the maximum likelihood

procedure.


References

Andersen, E. B. Asymptotic properties of conditional maximum likelihood estimates. The Journal of the Royal Statistical Society, Series B, 1970, 32, 283-301.

Andersen, E. B. The numerical solution of a set of conditional equations. The Journal of the Royal Statistical Society, Series B, 1972, 34, 42-54.

Andersen, E. B. Conditional inference in multiple choice questionnaires. British Journal of Mathematical and Statistical Psychology, 1973, 26, 31-44. (a)

Andersen, E. B. A goodness of fit test for the Rasch model. Psychometrika, 1973, 38, 123-140. (b)

Birnbaum, A. Statistical theory for logistic mental test models with a prior distribution of ability. Journal of Mathematical Psychology, 1969, 6, 250-276.

Bock, R. D. Estimating item parameters and latent ability when responses are scored in two or more nominal categories. Psychometrika, 1972, 37, 29-51.

Bock, R. D., & Lieberman, M. Fitting a response model for n dichotomously scored items. Psychometrika, 1970, 35, 179-197.

Hambleton, R. K., & Rovinelli, R. A FORTRAN IV program for generating examinee response data from logistic test models. Behavioral Science, 1973, 18, 74.

Hambleton, R. K., Swaminathan, H., Cook, L., Eignor, D., & Gifford, J. A. Developments in latent trait theory: Models, technical issues, and applications. Review of Educational Research, 1978, 48, 467-510.

Jeffreys, H. Theory of probability (3rd ed.). Oxford: Clarendon, 1961.

Jensema, C. J. A simple technique for estimating latent trait mental test parameters. Educational and Psychological Measurement, 1976, 36, 705-715.

Lindley, D. V. The estimation of many parameters. In V. P. Godambe & D. A. Sprott (Eds.), Foundations of statistical inference. Toronto: Holt, Rinehart, and Winston, 1971, 435-455.

Lindley, D. V., & Smith, A. F. M. Bayes estimates for the linear model. Journal of the Royal Statistical Society, Series B, 1972, 34, 1-41.

Lord, F. M. An analysis of the verbal Scholastic Aptitude Test using Birnbaum's three-parameter logistic model. Educational and Psychological Measurement, 1968, 28, 989-1020.

Lord, F. M. Estimation of latent ability and item parameters when there are omitted responses. Psychometrika, 1974, 39, 247-264.

Lord, F. M., & Novick, M. R. Statistical theories of mental test scores. Reading, MA: Addison-Wesley, 1968.

Meredith, W., & Kearns, J. Empirical Bayes point estimates of latent trait scores without the knowledge of the trait distribution. Psychometrika, 1973, 38, 533-554.

Novick, M. R. Bayesian considerations in educational information systems. Proceedings of the 1970 Invitational Conference on Testing Problems. Princeton, NJ: Educational Testing Service, 1971.

Novick, M. R., & Jackson, P. H. Statistical methods for educational and psychological research. New York: McGraw-Hill, 1974.

Novick, M. R., Lewis, C., & Jackson, P. H. The estimation of proportions in m groups. Psychometrika, 1973, 38, 19-46.

Owen, R. J. A Bayesian sequential procedure for quantal response in the context of adaptive mental testing. Journal of the American Statistical Association, 1975, 70, 351-356.

Robbins, H. An empirical Bayes approach to statistics. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 1, 157. University of California Press, 1955.

Samejima, F. Estimation of latent ability using a response pattern of graded scores. Psychometric Monograph, 1969, No. 17.

Samejima, F. A general model for free-response data. Psychometric Monograph, 1972, No. 18.

Sanathanan, L., & Blumenthal, S. The logistic model and estimation of latent structure. Journal of the American Statistical Association, 1978, 73, 794-799.

Stein, C. M. Confidence sets for the mean of a multivariate normal distribution. Journal of the Royal Statistical Society, Series B, 1962, 24, 265-296.

Swaminathan, H., & Gifford, J. A. Precision of item parameter estimation. Paper presented at the 1979 Computerized Adaptive Testing Conference, Minneapolis, June 1979.

Tiao, G. C., & Zellner, A. Bayes' theorem and the use of prior knowledge in regression analysis. Biometrika, 1964, 51, 219-230.

Urry, V. W. Approximations to item parameters of mental test models and their uses. Educational and Psychological Measurement, 1976, 36, 253-269.

Wright, B. D. Solving measurement problems with the Rasch model. Journal of Educational Measurement, 1977, 14, 97-116.

Wright, B. D., & Douglas, G. A. Best procedures for sample-free item analysis. Applied Psychological Measurement, 1977, 1, 281-295.

Wright, B. D., & Panchapakesan, N. A procedure for sample-free item analysis. Educational and Psychological Measurement, 1969, 29, 23-48.

Zellner, A. An introduction to Bayesian inference in econometrics. New York: Wiley, 1971.

IA

Page 1

DISTRIBUTION LIST

Navy Navy

Dr. Jack R. Borsting 1 Dr. Patrick R. HarrisonProvost & Academic Dean Psychology Course DirectorU.S. Naval Postgraduate School LEADERSHIP & LAW DEPT. (7b)Monterey, CA 93940 DIV. OF PROFESSIONAL DEVELOPMMENT

U.S. NAVAL ACADEMYDr. Robert Breaux ANNAPOLIS, MD 21402Code N-711NAVTRAEQUIPCEN 1 Dr. Norman J. KerrOrlando, FL 32813 Chief of Naval Technical Training

Naval Air Station Memphis (75)Chief of Naval Education and Training Millington, TN 38054

Liason OfficeAir Force Human Resource Laboratory 1 Dr. William L. MaloyFlying Training Division Principal Civilian Advisor forWILLIAMS AFB, AZ 85224 Education and Training

Naval Training Command, Code OOACOMNAVMILPERSCOH (N-6C) Pensacola, FL 32508Dept. of NavyW;ashington, DC '0370 1 Dr. Kneale Marshall

Scientific Advisor to DCNO(MPT)Deputy Assistant Secretary of the Navy OPOT

(Manpower) Washington DC 20370Office of the Assistant Secretary of

the Navy (1Wanpower, Reserve Affairs,1 CAPT Richard L. Martin, USNand Logistics) Prospective Commanding Officer

Washington. DC 20350 USS Carl Vinson (CVN-70)Newport News Shipbuilding and Drydock Co

Dr. Richard Elster Newport News, VA 23607Department of Administrative SciencesNaval Postgraduate School 1 Dr. James McBrideMonterey, CA 93940 Navy Personnel R&D Center

San Diego, CA 92152DR. PAT FEDERICONAVY PERSONNEL R&D CENTER 1 Dr. George MoellerSAN DIEGO, CA 92152 Head, Human Factors Dept.

Naval Submarine Medical Research Lab'Ir. Paul Foley Groton, CN 06340lavy Personnel R&D CenterSan Diego, CA 92152 1 Library

Naval Health Research Center

Dr. John Ford P. 0. Box 85122lavy Personnel R&D Center San Diego, CA 92138San Diego, CA 92152

1 Naval Medical R&D CommandCode 44National Naval Medical CenterBethesda, MD 20014

{I

Page 2

Navy Navy

Ted M. I. Yellen 1 Office of the Chief of Naval OperationsTechnical Information Office, Code 201 Research, Developnent, and Studies BrancNAVY PERSONNEL R&D CENTER (OP-102)SAN DIEGO, CA 92152 Washington, DC 20350

Library, Code P201L 1 Head, Manpower Training and ReservesNavy Personnel R&D Center Section (Op-964D)San Diego, CA 92152 Room 4A478, The Pentagon

Washington, DC 20350Technical DirectorNavy Personnel R&D Center 1 Captain Donald F. Parker, USNSan Diego, CA 92152 Commanding Officer

Navy Personnel R&D Center6 Commanding Officer San Diego, CA 92152

Naval Research LaboratoryCode 2627 1 LT Frank C. Petho, MSC, USN (Ph.D)Washington, DC 20390 Code L51

Naval Aerospace Medical Research LaboratPsychologist Pensacola, FL 32508ONR Branch OfficePldg 114, Section D 1 The Principal Deputy Assistant666 Summer Street Secretary of the Navy (MRA&L)Boston, MA 02211 4E780, The Pentagon

Washington, DC 20350PsychologistONR Branch Office 1 Director, Research & Analysis Division536 S. Clark Street Plans and Policy DepartmentChicago, IL 606.05 Navy Recruiting Command

4015 Wilson Boulevard')ffice of aval Research Arlington, VA 22203Code 437300 N. Quincy SStreet I Dr. Bernard Rimland (03B)Arlington, VA 22217 Navy Personnel R&D Center

San Diego, CA 921525 Personnel '. Training Research Programs

(Code 458) 1 Mr. Arnold RubensteinOffice of Naval Research Naval Personnel Support TechnologyArlington, VA 22217 Naval Material Command (08T244)

Room 104, Crystal Plaza #5Psychologist 2221 Jefferson Davis HighwayONR Branch Office Arlington, VA 203601030 East Green StreetPasadena, CA 91101 1 Dr. Worth Scanland

Chief of Naval Education and TrainingSpecial Asst. for Education and Code N-5

Training (OP-OIE) NAS, Pensacola, FL 32508Rm. 2705 Arlington AnnexWashington, DC 20370

Page 3

Navy Army

Dr. Fobert G. 3mith I Technical DirectorOffice of Chief of Naval Operations U. S. Army Research Institute for theOP-987H Behavioral and Social SciencesWashington, DC 20350 5001 Eisenhower Avenue

Alexandria, VA 22333Dr. Alfred F. SmodeTraining Analysis & Evaluation Group 1 HQ USAREUE & 7th Army

(TAEG) ODCSOPSDept. of the Navy USAAREUE Director of GEDOrlando, FL 32813 APO New York 09403

Dr. Richard Sorensen I DR. RALPH DUSEKNavy Personnel R&D Center U.S. ARMY RESEARCH INSTITUTESan Diego. CA 92152 5001 EISENHOWER AVENUE

ALEXANDRIA, VA 22333W. Gary ThomsonNaval Ocean Systems Center I Dr. Myron FischlCode 7132 U.S. Army Research Institute for theSan Diego, CA 92152 Social and Behavioral Sciences

5001 Eisenhower AvenueDr. Ronald Weitzman Alexandria, VA 22333Code 54 WZDepartment of Administrative Sciences 1 Dr. Michael KaplanU. S. Naval Postgraduate School U.S. ARMY RESEARCH INSTITUTEMonterey, CA 93940 5001 EISENHOWER AVENUE

ALEXANDRIA, VA 22333DR. MARTIN F. WISKOFFNAVY PERSONNEL R& D CENTER 1 Dr. Milton S. KatzSAN DIEGO, CA 92152 Training Technical Area

U.S. Army Research InstituteMr John H. Wolfe 5001 Eisenhower AvenueCode P310 Alexandria, VA 22333U. S. Navy Personnel Research and

Development Center 1 Dr. Harold F. O'Neil, Jr.San Diego, CA 92152 Attn: PERI-OK

Army Research Institute5001 Eisenhower AvenueAlexandria, VA 22333

1 DR. JAMES L. RANEY

U.S. ARMY RESEARCH INSTITUTE5001 EISENHOWER AVENUEALEXANDRIA, VA 22333

1 Mr. Robert RossU.S. Army Research Institute for the

Social and Behavioral Sciences5001 Eisenhower AvenueAlexandria, VA 22333

I

Page 4

Army Air Force

Dr. Robert Sasnor 1 Air Force Human Resources LabU. S. Army Research Institute for the AFHRL/MPD

Behavioral and Social Sciences Brooks AFB, TX 782355001 Eisenhower AvenueAlexandria, VA 22333 1 U.S. Air Force Office of Scientific

ResearchCommandant Life Sciences Directorate, NLU3 Army Institute of Administration Bolling Air Force BaseAttn: Dr. Sherrill Washington, DC 20332FT Benjamin Harrison, IN 46256 -

1 Air University LibraryDr. Frederick Steinheiser AUL/LSE 76/443U. S. Army Reserch Institute Maxwell AFB, AL 361125001 Eisenhower AvenueAlexandria, VA 22333 1 Dr. Earl A. Alluisi

HQ, AFHRL (AFSC)Dr. Joseph Ward Brooks AFB, TX 78235U.S. Army Research Institute5001 Eiserhower Avenue 1 Dr. Genevieve HaddadAlexandriE:, VA 22333 Program Manager

Life Sciences DirectorateAFOSRBolling AFB, DC 20332

1 Dr. Ross L. Morgan (AFHRL/LR)Wright -Patterson AFBOhio 45433

1 Research and Measurment DivisionResearch Branch, AFMPC/MPCYPRRandolph AFB, TX 78148

1 Dr. Malcolm ReeAFHRL/MP

Brooks AFB, TX 78235

1 Dr. Marty Rockway (AFHRL/TT)

Lowry AFBColorado 80230

1 Dr. Frank SchufletowskiU.S. Air ForceATC/XPTD

Randolph AFB, TX 78148

1 Jack A. Thorpe, Maj., USAF

Naval War CollegeProvidence, RI 02846

Page 5

Air Force Marines

Dr. Joe Ward, Jr. 1 H. William GreenupAFHRL/1PMD Education Advisor (E031)Drooks AFB, TX 78235 Education Center, MCDEC

Quantico, VA 22134

1 Director, Office of Manpower UtilizationHO, Marine Corps (MPU)BCB, Bldg. 2009Quantico, VA 22134

1 Headquarters, U. S. Marine CorpsCode MPI-20Washington, DC 20380

1 Major Michael L. Patrow, USMCHeadquarters, Marine Corps(Code MPI-20)Washington, DC 20380

1 DR. A.L. SLAFKOSKYSCIENTIFIC ADVISOR (CODE RD-1)HO, U.S. MARINE CCRPSWASHINGTON, DC 20380

Page 6

CoastGu ard Other DoD

Chief, Psychological Reserch Branch 12 Defense Documentation CenterU. S. Coast Guard (G-P-1/2/TP42) Cameron Station, Bldg. 5Washington, DC 20593 Alexandria, VA 22314

Attn: TCMr. Thomas A. WarmU. S. Coast Guard Institute 1 Dr. Dexter FletcherP. 0. Substation 18 ADVANCED RESEARCH PROJECTS AGENCYOklahoma City, OK 73169 1400 WILSON BLVD.

ARLINGTON, VA 22209

1 Dr. William GrahamTesting DirectorateMEPCa4/MEPCT-PFt. Sheridan, IL 60037

I Director, Research and DataOASD(MRA&L)3B919, The PentagonWashington, DC 20301

1 Military Assistant for Training andPersonnel Technology

Office of the Under Secretary of Defensefor Research & Engineering

Room 3D129, The PentagonWashington, DC 20301

1 MAJOR Wayne Sellman, USAFOffice of the Assistant Secretary

of Defense (MRA&L)3B930 The PentagonWashington, DC 20301

Page 7

Civil Govt Non Govt

Dr. Susan Chipman 1 Dr. Erling B. Andersen

Learning and Development Department of StatisticsNational Institute of Education Studiestraede 61200 19th Street NW 1455 CopenhagenWashington, DC 20208 DENMARK

Dr. Lorraine D. Eyde 1 1 psychological research unitPersonnel R&D Center Dept. of Defense (Army Office)Office of Personnel Management of USA Campbell Park Offices1900 EStreet !.,W Canberra ACT 2600, AustraliaWashington, D.C. 20415

1 Dr. Alan BaddeleyJerry Lehnus Medical Research CouncilREGIONAL PSYCHOLOGIST Applied Psychology UnitU.S. Office of Personnel Management 15 Chaucer Road230 S. DEAR.ORN STREET Cambridge CB2 2EFCHICAGO, IL 60604 ENGLAND

Dr. Andrew R. Molnar 1 Dr. Isaac BejarScience Education Dev. Educational Testing Service

and Research Princeton, NJ 08450National Science FoundationWashington, DC 20550 1 Dr. Werner Birke

DezWPs im StreitkraefteamtPersonnel R&P Center Postfach 20 50 03Office of Personnel Managment D-5300 Bonn 21900 F Street NW WEST GERMANYWashington, DC 201115

1 Dr. R. Darrel BockDr. H. Wallace Sinaiko Department of EducationProgram Director University of ChicagoManpower Research and Advisory Services Chicago, IL 60637Smithsonian Institution301 North Pitt Street 1 Dr. Nicholas A. BondAlexandria, VA 22314 Dept. of Psychology

Sacramento State CollegeDr. Vern W. Urry 600 Jay StreetPersonnel R&D Center Sacramento, CA 95819Office of Personnel Management1900 E Street NW 1 Dr. Robert BrennanWashington, DC 20415 American College Testing Programs

P. 0. Box 168Dr. Joseph L. Young, Director Iowa City, IA 52240Memory & Cognitive ProcessesNational Science Foundation 1 DR. C. VICTOR BUNDERSONWashington, DC 20550 WICAT INC.

UNIVERSITY PLAZA, SUITE 101160 SO. STATE ST.OREM, UT 84057

Page 8

Non Govt Non Govt

Dr. John P. Carroll 1 Dr. A. J. EschenbrennerPsychometric Lab Dept. E422, Bldg. 81Univ. of No. Carolina McDonnell Douglas Astronautics Co.Davie Hall 013A P.O.Box 516Chapel Hill, NC 27514 St. Louis, MO 63166

Charles Myers Library 1 Dr. Leonard FeldtLivingstone House Lindquist Center for MeasurmentLivingstone Road University of Iowa

Stratford Iowa City, IA 52242

London E15 2LJENGLAND 1 Dr. Richard L. Ferguson

The American College Testing Program

Dr. Kenneth E. Clark P.O. Box 168

College of Arts & Sciences Iowa City, IA 52240

University of RochesterRiver Campus Station 1 Dr. Victor FieldsRochester, NY 14 27 Dept. of Psychology

Montgomery CollegeDr. Plorman Cliff Rockville, MD 20850Dept. of PsychologyUniv. of So. California 1 Univ. Prof. Dr. Gerhard Fischer

Univ,rsity Park Liebiggasse 5/3

Los Angeles, CA 90007 A 1010 ViennaAUSTRIA

Dr. William E. CoffmanDirector, Iowa Testing Programs 1 Professor Donald Fitzgerald334 Lindquist Center University of New EnglandUniversity of Iowa Armidale, New South Wales 2351Iowa City, IA 52242 AUSTRALIA

Dr. Meredith P. Crawford 1 Dr. Edwin A. FleishmanAmerican Psychological Association Advanced Research Resources Organ.

1200 17th Street, N.W. Suite 900Washington, DC 2C036 4330 East West Highway

Washington, DC 20014Dr. Hans CrombagEducation Research Center 1 Dr. John R. Frederiksen

University of LeyJen Bolt Beranek & NewnanFoerhaaveloan 2 50 Moulton Street2334 EN Leyden Cambridge, MA 02138The NETHERLANDS

1 DR. ROBERT GLASER

LCOL J. C. Eggenberger LRDCDIRECTORATE OF PERSONNEL APPLIED RESEARC UNIVERSITY OF PITTSBURGH

NATIONAL DEFENJCE 11Q 3939 O'HARA STREET

101 COLONEL BY DRIVE PITTSBURGH, PA 15213

OTTAWA, CANADA ',IA OK2

...

Page 9

Non Govt Non Govt

Dr. Daniel Gopher 1 Dr. Douglas H. JonesTndustrial & Management Engineering Rn T-255Technion-Israel Institute of Technology Educational Testing ServiceHaifa Princeton, NJ 08450ISRAEL

3 Journal Supplement Abstract ServiceDr. Ron Hambleton American Psychological AssociationSchool of Education 1200 17th Street N.W.University of Massechusetts Washington, DC 20036Amherst, MA 01002

1 Professor John A. KeatsDr. Chester Harris University of NewcastleSchool of Education AUSTRALIA 2308University of CaliforniaSanta Parbara, CA 93106 1 Dr. Stephen Kosslyn

Harvard UniversityGlenda Greenwald, Ed. Department of Psychology"Humrn Intelligence Newsletter" 33 Kirkland StreetF. 0. Box 1163 Cambridge, MA 02138F~irmingham, rMI '8012

1 Mr. Marlin Kroger

Dr. Lloyd Humphreys 1117 Via GoletaDepartment of Psychology Palos Verdes Estates, CA 90274University of IllinoisChampaign, IL 61E20 1 Dr. Alan Lesgold

Learning R&D CenterLibrary University of PittsburghPumRRO/Western Division Pittsburgh, PA 1526027157 Berwick DriveCarmel, CA 93921 1 Dr. Michael Leviner.te210 Education Building

Dr. Steven Hunka University of IllinoisDepartment of Education Champaign, IL 61820University of AlbertaEdmonton, Alberta 1 Dr. Charles LewisCANADA Faculteit Sociale Wetenschappen

Rijksuniversiteit GroningenDr. Earl Hunt Oude BoteringestraatDept. of Psychology GroningenUniversity of kishington NETHERLANDSSeattle, WA 98105

1 Dr. Robert LinnDr. Huynh Huynh College of EducationCollege of Education University of IllinoisUniversity of South Carolina Urbana, IL 61801Columbia, SC 29208

1 Dr. Frederick M. LordEducational Testing ServicePrinceton, NJ 08540

I.._

Page 10

Non Govt No n G vt

Dr. James Lumsden
Department of Psychology
University of Western Australia
Nedlands W.A. 6009
AUSTRALIA

Dr. Gary Marco
Educational Testing Service
Princeton, NJ 08540

Dr. Scott Maxwell
Department of Psychology
University of Houston
Houston, TX 77004

Dr. Samuel T. Mayo
Loyola University of Chicago
820 North Michigan Avenue
Chicago, IL 60611

Dr. Mark Miller
Computer Science Laboratory
Texas Instruments, Inc.
Mail Station 371, P.O. Box 225936
Dallas, TX 75265

Professor Jason Millman
Department of Education
Stone Hall
Cornell University
Ithaca, NY 14853

Dr. Allen Munro
Behavioral Technology Laboratories
1845 Elena Ave., Fourth Floor
Redondo Beach, CA 90277

Dr. Melvin R. Novick
356 Lindquist Center for Measurement
University of Iowa
Iowa City, IA 52242

Dr. Jesse Orlansky
Institute for Defense Analyses
400 Army Navy Drive
Arlington, VA 22202

1 Dr. James A. Paulson
Portland State University
P.O. Box 751
Portland, OR 97207

1 MR. LUIGI PETRULLO
2431 N. EDGEWOOD STREET
ARLINGTON, VA 22207

1 DR. DIANE M. RAMSEY-KLEE
R-K RESEARCH & SYSTEM DESIGN
3947 RIDGEMONT DRIVE
MALIBU, CA 90265

1 MINRAT M. L. RAUCH
P II 11
BUNDESMINISTERIUM DER VERTEIDIGUNG
POSTFACH 1328
D-53 BONN 1, GERMANY

1 Dr. Mark D. Reckase
Educational Psychology Dept.
University of Missouri-Columbia
4 Hill Hall
Columbia, MO 65211

1 Dr. Andrew M. Rose
American Institutes for Research
1055 Thomas Jefferson St. NW
Washington, DC 20007

1 Dr. Leonard L. Rosenbaum, Chairman
Department of Psychology
Montgomery College
Rockville, MD 20850

1 Dr. Lawrence Rudner
403 Elm Avenue
Takoma Park, MD 20012

1 Dr. J. Ryan
Department of Education
University of South Carolina
Columbia, SC 29208

1 PROF. FUMIKO SAMEJIMA
DEPT. OF PSYCHOLOGY
UNIVERSITY OF TENNESSEE
KNOXVILLE, TN 37916

Page 11


DR. ROBERT J. SEIDEL
INSTRUCTIONAL TECHNOLOGY GROUP
HUMRRO
300 N. WASHINGTON ST.
ALEXANDRIA, VA 22314

Dr. Kazuo Shigemasu
University of Tohoku
Department of Educational Psychology
Kawauchi, Sendai 980
JAPAN

Dr. Edwin Shirkey
Department of Psychology
University of Central Florida
Orlando, FL 32816

Dr. Richard Snow
School of Education
Stanford University
Stanford, CA 94305

Dr. Kathryn T. Spoehr
Department of Psychology
Brown University
Providence, RI 02912

Dr. Robert Sternberg
Dept. of Psychology
Yale University
Box 11A, Yale Station
New Haven, CT 06520

DR. PATRICK SUPPES
INSTITUTE FOR MATHEMATICAL STUDIES IN
THE SOCIAL SCIENCES
STANFORD UNIVERSITY
STANFORD, CA 94305

Dr. Hariharan Swaminathan
Laboratory of Psychometric and
Evaluative Research
School of Education
University of Massachusetts
Amherst, MA 01003

1 Dr. Brad Sympson
Psychometric Research Group
Educational Testing Service
Princeton, NJ 08541

1 Dr. Kikumi Tatsuoka
Computer Based Education Research Laboratory
252 Engineering Research Laboratory
University of Illinois
Urbana, IL 61801

1 Dr. David Thissen
Department of Psychology
University of Kansas
Lawrence, KS 66044

1 Dr. J. Uhlaner
Perceptronics, Inc.
6271 Variel Avenue
Woodland Hills, CA 91364

1 Dr. Howard Wainer
Bureau of Social Science Research
1990 M Street, N.W.
Washington, DC 20036

1 Dr. David J. Weiss
N660 Elliott Hall
University of Minnesota
75 E. River Road
Minneapolis, MN 55455

1 DR. GERSHON WELTMAN
PERCEPTRONICS INC.
6271 VARIEL AVE.
WOODLAND HILLS, CA 91367

1 DR. SUSAN E. WHITELY
PSYCHOLOGY DEPARTMENT
UNIVERSITY OF KANSAS
LAWRENCE, KANSAS 66044

1 Wolfgang Wildgrube
Streitkraefteamt
Box 20 50 03
D-5300 Bonn 2
WEST GERMANY

Page 12


1 Dr. J. Arthur Woodward
Department of Psychology
University of California