DOCUMENT RESUME

ED 348 399                                   TM 018 884

AUTHOR        Junker, Brian W.
TITLE         A Note on Recovering the Ability Distribution from Test Scores.
INSTITUTION   Carnegie Mellon Univ., Pittsburgh, PA. Dept. of Statistics.
SPONS AGENCY  Office of Naval Research, Arlington, VA. Cognitive and Neural Sciences Div.
REPORT NO     ONR/CS-92-1
PUB DATE      May 92
CONTRACT      N00014-91-J-1208
NOTE          36p.
PUB TYPE      Reports - Evaluative/Feasibility (142)
EDRS PRICE    MF01/PC02 Plus Postage.
DESCRIPTORS   Computer Simulation; *Equations (Mathematics); *Estimation (Mathematics); Item Bias; Item Response Theory; *Mathematical Models; *Scores; *Statistical Distributions; Test Length
IDENTIFIERS   *Ability Estimates; Local Independence (Tests); Population Parameters; Smoothing Methods; *Theta Estimates

ABSTRACT
A simple scheme is proposed for smoothly approximating the ability distribution for relatively long tests, assuming that the item characteristic curves (ICCs) are known or well estimated. The scheme works for a general class of ICCs and is guaranteed to completely recover the theta distribution as the test length increases. The proposed method of estimating the ability distribution is robust to some violations of local independence. After an initial function inversion, the scheme can be inexpensively used to recover the theta distribution in each of several different administrations of the same test or several subpopulations in one test administration. Moreover, this approach could be used to recover the distribution of a dominant ability dimension when local independence fails. The scheme provides a starting place for diagnostics concerning assumptions about the shape of the theta distribution or ICCs of a particular test. Work is currently under way to further examine and refine these methods using essentially unidimensional simulation data and to apply the estimator to real tests. Kernel smoothing is also considered. A 16-item list of references, 10 tables, 6 graphs, and 2 appendixes that provide details of the simulation and proofs are included. (RLC)




A Note on Recovering the Ability Distribution
from Test Scores

by

    Brian W. Junker

Department of Statistics
Carnegie Mellon University

    Pittsburgh, PA 15213

    May 1992

Technical Report ONR/CS-92-1

Prepared for the Model-Based Measurement Program, Cognitive and Neural Sciences Division, Office of Naval Research, under grant number N00014-91-J-1208, R&T 4421-560. Approved for public release; distribution unlimited. Reproduction in whole or in part is permitted for any purpose of the United States Government.


REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 0704-0188)

2. REPORT DATE: 20 May 1992
3. REPORT TYPE AND DATES COVERED: Technical
4. TITLE AND SUBTITLE: A Note on Recovering the Ability Distribution from Test Scores
5. FUNDING NUMBERS: G: N00014-91-J-1208; PE: 61153N
6. AUTHOR(S): Brian W. Junker
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Department of Statistics, Carnegie Mellon University, Pittsburgh, PA 15213
8. PERFORMING ORGANIZATION REPORT NUMBER: ONR/CS-92-1
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES): Cognitive Science Program, Office of Naval Research (Code 1142CS), 800 N. Quincy Street, Arlington, VA 22217-5000
11. SUPPLEMENTARY NOTES: Submitted for publication
12a. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for public release; distribution unlimited.
13. ABSTRACT: We propose a simple scheme for smoothly approximating the ability distribution for relatively long tests, assuming that the ICC's are known or well estimated. The scheme works for quite a general class of item characteristic curves (ICC's) and is guaranteed to completely recover the theta distribution as the test length, J, grows. After an initial function inversion, the scheme can be inexpensively used to recover the theta distribution in each of several different administrations of the same test (or subpopulations in one test administration). Moreover, this approach could be used to recover the distribution of a dominant ability dimension when local independence fails. Finally, the scheme provides a starting place for diagnostics concerning assumptions about the shape of the theta distribution or ICC's of a particular test. Work is currently underway to further examine and refine these methods using essentially unidimensional simulation data, and to apply the estimators to real tests.
14. SUBJECT TERMS: Item response theory, kernel smoothing, latent trait distribution, population assessment.
15. NUMBER OF PAGES: 27
17.-19. SECURITY CLASSIFICATION (report, page, abstract): UNCLASSIFIED
20. LIMITATION OF ABSTRACT: UL

NSN 7540-01-280-5500    Standard Form 298

Abstract

We propose a simple scheme for smoothly approximating the ability distribution for relatively long tests, assuming that the ICC's are known or well estimated. The scheme works for quite a general class of item characteristic curves (ICC's) and is guaranteed to completely recover the θ distribution as the test length, J, grows. After an initial function inversion, the scheme can be inexpensively used to recover the θ distribution in each of several different administrations of the same test (or subpopulations in one test administration). Moreover, this approach could be used to recover the distribution of a dominant ability dimension when local independence fails. Finally, the scheme provides a starting place for diagnostics concerning assumptions about the shape of the θ distribution or ICC's of a particular test. Work is currently underway to further examine and refine these methods using essentially unidimensional simulation data, and to apply the estimators to real tests.

Keywords: Item response theory, kernel smoothing, latent trait distribution, population assessment.

The work reported here was initiated under the direction of Paul Holland, while Junker was a participant in the Educational Testing Service Summer Predoctoral Research Program. Initial computer simulations were performed by Dorothy Thayer at ETS; the simulations reported here were performed by Junker at the University of Illinois and Carnegie Mellon University.

1 The basic estimator

    A principal application of educational testing is inferring the distribution of abilities in

    various populations. This task is important for both users of these tests (in, say, comparing

various subpopulations) and researchers and test developers (in, say, developing or using item calibration (ICC parameter estimation) procedures within the IRT framework).

    Inference about the ability distribution from item response data goes back at least to

    Lord (1953) who gives an interesting qualitative account of the possible distortions induced

    by the traditional IRT model. With the rise in popularity of item response theory, IRT,

    many methods for estimating the latent distribution have been developed.

Samejima and Livingston (1979) fit polynomials to latent densities using the method of moments. Samejima (1984) also fits θ densities, given the MLE $\hat\theta$, using specific parametric families, by matching two or more moments. Levine (1984, 1985) projects the (unknown)

    latent distribution onto a convenient function space in the span of the test's conditional

    likelihood functions and estimates the projection by maximum likelihood. Mislevy (1984)

    assumes that the ability distribution is well approximated by a collection of masses centered

at points placed a priori along the θ axis and estimates the sizes of the masses at each

    point. More generally, hierarchical and/or empirical Bayes techniques may be used to esti-

    mate parameters of the latent trait distribution if it belongs to a tractable family of priors.

    These methods all rely upon local independence for their validity; moreover they tend to be

    expensive in terms of computation and storage.

    We will examine a simpler method of estimating the ability distribution which, in addi-

tion, is robust to some violations of local independence. Consider a set of J binary items that may be embedded in a longer sequence or pool of items $(X_1, X_2, X_3, \ldots)$. Let θ be the latent trait of interest, let $P_1(\theta), P_2(\theta), \ldots, P_J(\theta)$ be the item characteristic curves (ICC's)

with respect to θ, and denote averages of items as $\bar X_J = \frac{1}{J}\sum_{i=1}^{J} X_i$, and similarly for averages $\bar P_J(\theta)$ of ICC's. Under the usual local independence (LI) and monotonicity (M) conditions of item response theory (e.g. Hambleton, 1989), or more generally under Stout's (1990) formulation of essential independence (EI) and local asymptotic discrimination (LAD), we know that $\hat\theta_J(\bar X_J) = \bar P_J^{-1}(\bar X_J)$ is a plausible point estimate of θ: $\hat\theta_J(\bar X_J)$ is a consistent estimator of θ under either set of assumptions. It immediately follows that the distribution of $\hat\theta_J(\bar X_J)$,
$$F_J(t) = P[\hat\theta_J(\bar X_J) \le t],$$
converges to that of θ as well (e.g. Serfling, 1980, p. 19). Now consider administering the test $X_J$ to N examinees, obtaining N response vectors $X_{1J}, \ldots, X_{NJ}$ and corresponding θ estimates $\hat\theta_J(\bar X_{1J}), \ldots, \hat\theta_J(\bar X_{NJ})$; a natural estimator of the θ distribution is the "empirical" distribution of these $\hat\theta_J$'s,
$$\hat F_{N,J}(t) = \frac{1}{N}\sum_{n=1}^{N} 1\{\hat\theta_J(\bar X_{nJ}) \le t\} = \{\text{fraction of } \hat\theta_J(\bar X_{nJ})\text{'s} \le t\}, \tag{1}$$
where the "indicator function" $1_S$ takes the value 1 if S is true and 0 if S is false.

Theorem 1. Suppose $(X_1, X_2, \ldots)$ is a sequence of items and θ is a latent trait such that EI and LAD hold. Define $\hat\theta_J(\bar X_J)$ as above. If the distribution function
$$F(t) = P[\theta \le t]$$
is continuous, the empirical distribution function $\hat F_{N,J}(t)$ defined in (1) converges in probability to F at each t as both $J \to \infty$ and $N \to \infty$.

    As with the work of Stout (1990) and Junker (1991), the embedding in an infinite-length

    item pool is partly a conceptual tool. In practice, one might check the EI condition using

    Stout's (1987) test, and check the LAD condition by verifying that the average ICC for a

    particular test was an invertible function.
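As a concrete illustration, the following minimal sketch (in Python, with hypothetical 2PL item parameters, sample sizes, and helper names of our own choosing, not taken from the report) inverts the average ICC numerically at the J + 1 possible score values and forms the empirical distribution of equation (1).

```python
import numpy as np

def average_icc(a, b):
    """Average ICC P-bar_J(theta) for a bank of 2PL items (vectorized over theta)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    def p_bar(theta):
        t = np.atleast_1d(theta).astype(float)[:, None]
        return (1.0 / (1.0 + np.exp(-a * (t - b)))).mean(axis=1)
    return p_bar

def invert(p_bar, y, lo=-10.0, hi=10.0, iters=80):
    """Invert the monotone average ICC at each value in y by bisection."""
    y = np.atleast_1d(y).astype(float)
    lo = np.full_like(y, lo); hi = np.full_like(y, hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        below = p_bar(mid) < y
        lo = np.where(below, mid, lo)
        hi = np.where(below, hi, mid)
    return 0.5 * (lo + hi)

# Hypothetical 2PL test with J = 30 items (parameters are illustrative only).
rng = np.random.default_rng(0)
J, N = 30, 5000
a = rng.uniform(0.5, 1.5, J)
b = np.linspace(-2.0, 2.0, J)
p_bar = average_icc(a, b)

# Simulate N examinees under local independence and score them.
theta = rng.normal(0.0, 1.0, N)
probs = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
x_bar = (rng.uniform(size=(N, J)) < probs).mean(axis=1)

# theta_hat = P-bar_J^{-1}(X-bar_J): invert once at the J + 1 possible scores and look up.
# Scores of exactly 0 or 1 are clipped so the inversion stays finite (a boundary
# issue discussed in Section 2 of the report).
scores = np.clip(np.arange(J + 1) / J, 0.5 / J, 1 - 0.5 / J)
table = invert(p_bar, scores)
theta_hat = table[np.rint(x_bar * J).astype(int)]

def F_hat(t):
    """Empirical distribution F-hat_{N,J}(t) of equation (1)."""
    return np.mean(theta_hat[:, None] <= np.atleast_1d(t), axis=0)

print(F_hat([-1.0, 0.0, 1.0]))  # roughly Phi(-1), Phi(0), Phi(1) for this setup
```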

In fact, the full strength of the LAD condition is not needed here. A weaker condition that also gives the theorem is that, for all $t_2 > t_1$ there exists $\epsilon(t_1, t_2) > 0$ such that
$$\liminf_{J\to\infty}\; \bar P_J(t_2) - \bar P_J(t_1) \;\ge\; \epsilon(t_1, t_2). \tag{2}$$
Similarly, the full strength of the EI condition is not needed. It suffices to have, for all t,
$$\lim_{J\to\infty} \operatorname{Var}(\bar X_J \mid \theta = t) = 0. \tag{3}$$
Under the weaker conditions (2) and (3), the consistency of $\bar P_J^{-1}(\bar X_J)$ as a point estimate for θ may fail, but Theorem 1 still goes through. The proof of Theorem 1 is based on a well-known exponential bound due to Dvoretzky, Kiefer and Wolfowitz (Serfling, 1980, p. 59) on the error made in approximating $F_J(t)$ with $\hat F_{N,J}(t)$. See Appendix B for some details.
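For instance, under LI the conditional variance in (3) equals $J^{-2}\sum_i P_i(t)[1 - P_i(t)] \le 1/(4J)$, so (3) holds automatically. A minimal numerical check (with hypothetical 2PL parameters of our own choosing) is sketched below.

```python
import numpy as np

def conditional_var_xbar(t, a, b):
    """Var(X-bar_J | theta = t) under local independence: J^{-2} * sum P_i(t)(1 - P_i(t))."""
    p = 1.0 / (1.0 + np.exp(-np.asarray(a, float) * (t - np.asarray(b, float))))
    return np.sum(p * (1.0 - p)) / len(p) ** 2

# Hypothetical 2PL banks of growing length: the conditional variance is O(1/J),
# so condition (3) holds automatically under LI.
for J in (10, 30, 100, 300):
    a = np.full(J, 1.0)
    b = np.linspace(-2.0, 2.0, J)
    print(J, conditional_var_xbar(0.0, a, b))
```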

    2 Two practical considerations

Note that the theorem does not in any way require that the ICC's have 0 and 1 as lower and upper asymptotes. For example, if $\bar P_J$ has a lower asymptote c, i.e.
$$\liminf_{t\to-\infty} \bar P_J(t) \ge c > 0,$$
there certainly could be positive probability that some examinees have $\bar X_J < c$. The only reasonable thing for $\bar P_J^{-1}$ to do with such an $\bar X_J$ is send it to $-\infty$, which ruins the estimate of F.

But for any fixed $\theta_0$, if $c < \liminf_{J\to\infty} \bar P_J(\theta_0)$,
$$\limsup_{J\to\infty} P[\bar X_J \le c] = \limsup_{J\to\infty} \int_{-\infty}^{\infty} P[\bar X_J \le c \mid \theta = t]\, dF(t) \le \limsup_{J\to\infty} \int_{-\infty}^{\infty} P[\bar X_J \le \bar P_J(\theta_0) \mid \theta = t]\, dF(t) \le F(\theta_0),$$
after observing that $P[\bar X_J \le \bar P_J(\theta_0) \mid \theta = t] \to 1\{t \le \theta_0\}$ as $J \to \infty$.

Theorem 2. Suppose $X_1, X_2, \ldots$ and θ are as in Theorem 1 with ICC's $P_1(t), P_2(t), \ldots$, with average $\bar P_J(t)$ as before, and suppose
$$R_1(t), R_2(t), \ldots$$
are another set of ICC's, with average $\bar R_J(t)$. Let $\bar P_J^{-1}$ and $\bar R_J^{-1}$ be the corresponding inverses, and let
$$\hat\theta_J(\bar X_J) = \bar R_J^{-1}(\bar X_J).$$
Fix θ such that $\bar P_J^{-1}\bar R_J(\theta)$ has a finite limit $r(\theta)$. Then
$$F_J(\theta) = P[\hat\theta_J(\bar X_J) \le \theta] \to F(r(\theta))$$
(where F is the distribution of θ). If these hypotheses hold for every θ and if r and F are continuous functions, then the convergence is uniform in θ.

The existence of the limit $r(\theta)$ is a technical requirement that, like LAD, is innocuous in the context of real, finite length tests. The most useful interpretation of Theorem 2 is that
$$|F_J(\theta) - F(\bar P_J^{-1}\bar R_J(\theta))| \to 0$$
as $J \to \infty$, i.e., the distribution of θ is estimated with a distortion $\bar P_J^{-1}\bar R_J$. This follows from the theorem if F is continuous at $r(\theta)$.

The proof of Theorem 2 expands on the technique used to prove convergence of $F_J(\theta)$ to $F(\theta)$; see Appendix B. Just as in Theorem 1, it is also possible to show that the empirical distributions
$$\hat F_{N,J}(\theta) = \frac{1}{N}\sum_{n=1}^{N} 1\{\hat\theta_J(\bar X_{nJ}) \le \theta\}$$
converge to $F(r(\theta))$.

The value of Theorem 2 is that if the function $\bar P_J^{-1}(\bar R_J(\theta))$ can be (partially) identified, then the distribution of $\hat\theta_J$ can still tell us a lot about the underlying θ distribution. For

example, if the "true ICC's" are $P_j(\theta)$ and the θ distribution is recovered with "estimated ICC's" $R_j(\theta)$, with the estimated ICC's satisfying
$$|\bar P_J(\theta) - \bar R_J(\theta)| \to 0$$
as $J \to \infty$, then the estimated distributions $F_J$ will converge to the true distribution F of θ, as long as the derivative $\bar P_J'(\theta)$ is bounded away from zero at each θ as $J \to \infty$ (this is guaranteed by LAD, for example).

Some knowledge of the underlying θ distribution may even be available when the "true ICC's" $P_j(\theta)$ and the "recovery ICC's" $R_j(\theta)$ do not match up asymptotically. For example, it is easy to check numerically that for "typical" parameter values, averages of logistic ICC's are themselves approximately logistic (with parameters approximately the averages of the discrimination and difficulty parameters of the individual ICC's). Thus, for example, if the $P_j(\theta)$ are Rasch (one-parameter logistic) and the estimation method for the "difficulty parameters" $b_j$ is known, on average, to bias the $b_j$ by some fixed but unknown additive bias parameter $\Delta$ (so that $\operatorname{logit} R_j(\theta) \approx \operatorname{logit} P_j(\theta) - \Delta$), then roughly $\bar P_J^{-1}(\bar R_J(\theta)) \approx a\theta - \Delta$, with $a$ near 1, so that the location of the θ distribution will be estimated wrongly but the (shape) family to which it belongs may still be identified. Similar considerations apply when the $P_j(\theta)$ are 3PL and the $R_j(\theta)$ are 2PL: over the relevant domain of θ, $\bar P_J^{-1}(\bar R_J(\theta))$ is approximately linear.
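The sketch below (with hypothetical difficulties and a hypothetical bias value; none of it from the report's own code) tabulates the distortion function $\bar P_J^{-1}(\bar R_J(\theta))$ for Rasch ICC's whose recovery difficulties are shifted by a constant, and confirms the near-linear, slope-near-one behaviour described above.

```python
import numpy as np

def average_rasch(b):
    """Average Rasch ICC for difficulties b (vectorized over theta)."""
    b = np.asarray(b, float)
    return lambda theta: (1.0 / (1.0 + np.exp(-(np.atleast_1d(theta).astype(float)[:, None] - b)))).mean(axis=1)

def invert(p_bar, y, lo=-12.0, hi=12.0, iters=80):
    """Bisection inverse of a monotone average ICC, vectorized over y."""
    y = np.atleast_1d(y).astype(float)
    lo = np.full_like(y, lo); hi = np.full_like(y, hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        below = p_bar(mid) < y
        lo = np.where(below, mid, lo); hi = np.where(below, hi, mid)
    return 0.5 * (lo + hi)

# "True" Rasch difficulties and "recovery" difficulties shifted by a constant bias.
J, delta = 30, 0.4                      # delta is a hypothetical additive bias
b_true = np.linspace(-2.0, 2.0, J)
p_bar = average_rasch(b_true)           # P-bar_J
r_bar = average_rasch(b_true + delta)   # R-bar_J

theta_grid = np.linspace(-2.5, 2.5, 11)
distortion = invert(p_bar, r_bar(theta_grid))   # P-bar_J^{-1}(R-bar_J(theta))
print(np.polyfit(theta_grid, distortion, 1)[0])  # slope close to 1
print(distortion - theta_grid)                   # close to -delta everywhere
```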

    3 Kernel smoothing

The basic estimator proposed in (1) is the "empirical distribution" function
$$\hat F_{N,J}(t) = \frac{1}{N}\sum_{n=1}^{N} 1\{\bar P_J^{-1}(\bar X_{nJ}) \le t\} = \sum_{j=0}^{J} \hat P_N[\bar X_J = j/J]\; 1\{\bar P_J^{-1}(j/J) \le t\}, \tag{4}$$
where
$$\hat P_N[\bar X_J = j/J] = \frac{1}{N}\sum_{n=1}^{N} 1\{\bar X_{nJ} = j/J\}$$
is the natural estimator of the (discrete) distribution of $\bar X_J$ based on N observations. The indicator function on the far right in (4) may be written $1\{(t - \bar P_J^{-1}(j/J))/h \ge 0\}$ for any $h > 0$; replacing this indicator with a smooth kernel distribution function K evaluated at the same argument yields the kernel-smoothed estimator
$$\hat F_{N,J,h}(t) = \sum_{j=0}^{J} \hat P_N[\bar X_J = j/J]\; K\!\left(\frac{t - \bar P_J^{-1}(j/J)}{h}\right). \tag{5}$$

Third, the observations $\bar X_{1J}, \ldots, \bar X_{NJ}$ must be transformed by the nonlinear transformation $\bar P_J^{-1}$. This means that the granularity changes over the range of θ and $\bar X_J$; this complicates practical calculations such as those leading to optimal rates for N, J and h.

We now show that the weighted root mean square error (RMS) between this estimator and the true θ distribution goes to zero as $N, J \to \infty$. The theorem below is analogous to Theorem 1.

Theorem 3. Suppose $X_1, X_2, \ldots$ and θ are as in Theorem 1 with ICC's $P_1(\theta), P_2(\theta), \ldots$. Define $\hat F_{N,J,h}(t)$ as in (5), for a fixed kernel distribution function K. Then if the distribution function F of θ is continuous, and K has a finite first absolute moment,
$$\mathrm{RMS} = \left\{ E \int_{-\infty}^{\infty} [\hat F_{N,J,h}(t) - F(t)]^2 g(t)\,dt \right\}^{1/2} \to 0 \tag{6}$$
as $N \to \infty$, $J \to \infty$ and $h \to 0$, for any density $g(t)$.

Unlike most nonparametric density estimation results, there is no restriction on the rates at which $h \to 0$, $N \to \infty$ or $J \to \infty$. This is partly because a distribution function is smoother than, and therefore easier to estimate than, a density. The corresponding technique for estimation of the θ density would require $h_J$ to tend to zero more slowly than $\operatorname{Var}(\bar X_J \mid \theta = t)$ in (3), for example, as well as further conditions on the rates at which N and J tend to ∞. Despite the fact that there are no rates in the theorem, devising h as a function of N and J to produce the "right" amount of smoothing is an important issue to which we shall return below.

The proof of Theorem 3 (see Appendix B) is based on decomposing the RMS in (6) as
$$\mathrm{RMS}^2 = \int_{-\infty}^{\infty}\left\{P[\bar P_J^{-1}(\bar X_J) + hY \le t] - P[\theta \le t]\right\}^2 g(t)\,dt + \frac{1}{N}\int_{-\infty}^{\infty} \operatorname{Var}\!\left[K\!\left(\frac{t - \bar P_J^{-1}(\bar X_J)}{h}\right)\right] g(t)\,dt, \tag{7}$$

where Y is a random variable with distribution K, independent of θ and all item responses. This technique can be modified to show that $E|\hat F_{N,J,h}(t) - F(t)|^2 \to 0$ for any t, and hence $\hat F_{N,J,h}(t) \to F(t)$ in probability, for each continuity point t of F. For example, this provides another proof that our original estimator $\hat F_{N,J}$ converges in probability to F. It would also be clear from the proof that the same smoothing could be applied with any consistent estimator $\hat\theta_J$ in place of $\bar P_J^{-1}(\bar X_J)$.

From the decomposition of RMS in (7) into squared-bias and variance terms it seems that the optimal h should be more sensitive to J than to N. Indeed, when J is small and N is relatively large, the coarse granularity inherent in $\bar P_J^{-1}(\bar X_J)$ should predominate over the finer granularity inherent in observing N examinees. A workable approach to setting h is to make a quick, crude estimate of the variance of θ by assuming that $\bar X_J$ is uniformly distributed on the interval defined by the lower asymptote c and the upper asymptote d of $\bar P_J(\theta)$, and then applying the formula
$$h = C \cdot (\widehat{\operatorname{Var}}\,\theta)^{1/2}, \tag{8}$$
which seems appropriate when K has a variance (Silverman, 1986, pp. 45-48; Reiss, 1981). Our crude estimate of Var θ is obtained by tabulating values of $\hat\theta^{(j)} = \bar P_J^{-1}((j+1)/(J+2))$ for all j such that $c < (j+1)/(J+2) < d$, and calculating
$$(\widehat{\operatorname{Var}}\,\theta)^{1/2} \approx (0.7413)(\text{interquartile range})$$
(following the relationship between the interquartile range and the standard deviation for the Normal distribution). Preliminary trials with C = 1, 1/2, 1/3, 1/4 in (8) indicated that C = 1/3 produced the best RMS results. There is reason to believe the choice of K should not be very influential on the RMS in (6) (Silverman, 1986, pp. 42-43). The K used in our simulations was

$$K(t) = \int_{-1}^{t} \tfrac{3}{4}(1 - u^2)\,du = \begin{cases} 0, & t < -1, \\ \tfrac{1}{4}(3t - t^3 + 2), & |t| \le 1, \\ 1, & t > 1. \end{cases} \tag{9}$$
This choice is conservative about the tails of the θ distribution.
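To make (8) and (9) concrete, here is a minimal sketch (the function names are ours; the constant 0.7413 and C = 1/3 are from the text above, and the theta table passed to the bandwidth rule is only an illustrative stand-in):

```python
import numpy as np

def K(t):
    """Kernel distribution function of equation (9): the integrated Epanechnikov kernel."""
    t = np.asarray(t, float)
    return np.where(t < -1.0, 0.0, np.where(t > 1.0, 1.0, 0.25 * (3.0 * t - t ** 3 + 2.0)))

def bandwidth_rule(theta_table, C=1.0 / 3.0):
    """Ad hoc bandwidth (8): C times a crude IQR-based estimate of sd(theta).

    theta_table should hold the tabulated values P-bar_J^{-1}((j + 1)/(J + 2)) for the
    admissible score values, as described in the text; 0.7413 converts an
    interquartile range into a Normal-theory standard deviation."""
    q75, q25 = np.percentile(np.asarray(theta_table, float), [75, 25])
    return C * 0.7413 * (q75 - q25)

# Quick check that K is a distribution function: increasing from 0 to 1.
print(K([-1.5, -0.5, 0.0, 0.5, 1.5]))        # 0.0, ~0.156, 0.5, ~0.844, 1.0
# A made-up theta table just to exercise the bandwidth rule (values are illustrative).
print(bandwidth_rule(np.linspace(-2.5, 2.5, 29)))
```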

    4 Computer simulation


The estimators proposed in Theorems 1 through 3 are less complicated than distribution estimators currently in use in IRT. To help evaluate these estimators a pilot simulation study was performed. In this simulation, item response data were generated using various parametric models, and we attempted to recover the ability distribution using both the smoothed and unsmoothed estimators.

Monte Carlo trials:      M = 100
Examinee sample size:    N = 5,000
Ability distribution:    Normal N(0, 1)
                         Bimodal mixture (1/2) N(-1.5, 1) + (1/2) N(1.5, 1)
                         Discontinuous chi-square, χ²₁ − 1
Test length:             J = 10, 30, 60, 100
ICC type:                Rasch: b_i's equally spaced from -2 to 2
                         3PL: b_i's equally spaced from -2 to 2,
                              a_i's cycling through 0.5, 1.0, 1.5,
                              c_i's all set to 0.2
                         'Estimated': generated with the 3PL ICC's above;
                              estimated with ICC parameters drawn as
                              estimated difficulty ~ N(b_i, 1/J),
                              estimated discrimination ~ N(a_i, 0.25),
                              estimated guessing ~ max{N(0.2, 0.1), 0}
                              (all independent).

Table 1: Monte Carlo simulation parameters.

The parameters of the pilot simulation are indicated in Table 1. All possible combinations

    of these parameters were investigated. The choice of ability distributions was intended to

examine two "typical" and one "worst case" target distribution. While the standard normal distribution is extremely smooth and has a bounded positive density, the distribution of the shifted chi-squared random variable χ²₁ − 1 puts no mass below θ = −1 and the density jumps from 0 to +∞ at θ = −1. (This choice is not intended to be terribly realistic, but allows us to explore the performance of our distribution estimator under adverse circumstances.)

    Although the means of these distributions are both 0, the chi-squared distribution has twice

    the variance of the normal. The bimodal mixture was chosen to represent a situation where

    two radically different types of examinee take the test. Its standard deviation is also greater

    than 1 (roughly 1.8).

The ICC's used were all subfamilies of the three-parameter logistic (3PL) curves:
$$P_i(t) = c_i + (1 - c_i)\left[1 + \exp\{-a_i(t - b_i)\}\right]^{-1}.$$
In the case labelled 'Rasch', $a_i \equiv 1$, $c_i \equiv 0$ and the $b_i$ are as indicated. The same ICC's were used to recover F as to generate the data. Indeed $\bar P_J^{-1}(\bar X_J)$ is exactly the MLE for θ under the Rasch model with known item parameters. Similarly for the 3PL case, where all the parameters were allowed to vary as indicated above; now $\bar P_J^{-1}(\bar X_J)$ is a somewhat inefficient estimator of θ. In the case labelled 'Estimated', the 3PL ICC's were used to generate the data (the $P_j(\theta)$'s in Theorem 2) but then their item parameters were deliberately contaminated with noise to produce the "recovery ICC's" (the $R_j(\theta)$'s in Theorem 2) used to estimate F, to roughly approximate the practical situation in which item parameters themselves must be estimated from data. Thus the cases Rasch, 3PL, and 'Estimated' represent increasingly hostile situations for the distribution estimator to work in.
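A sketch of how the 'Estimated' condition of Table 1 might be set up follows; the random seed, the reading of the second arguments of N(·, ·) as variances, and the guard on negative discriminations are our own assumptions, not the report's.

```python
import numpy as np

rng = np.random.default_rng(92)          # seed is our choice, not the report's

# Generating 3PL parameters as in Table 1.
J, N = 60, 5000
b = np.linspace(-2.0, 2.0, J)
a = np.tile([0.5, 1.0, 1.5], J // 3)
c = np.full(J, 0.2)

def icc_3pl(theta, a, b, c):
    """3PL item characteristic curves, vectorized over examinees and items."""
    theta = np.atleast_1d(theta).astype(float)[:, None]
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Simulate item responses for N examinees with N(0, 1) abilities.
theta = rng.normal(0.0, 1.0, N)
responses = (rng.uniform(size=(N, J)) < icc_3pl(theta, a, b, c)).astype(int)
x_bar = responses.mean(axis=1)

# Contaminated "recovery" parameters for the 'Estimated' condition.  We read the
# second arguments in Table 1 as variances; the scanned table does not say.
b_hat = rng.normal(b, np.sqrt(1.0 / J))
a_hat = np.abs(rng.normal(a, np.sqrt(0.25)))      # guard (ours): keep a_hat positive
c_hat = np.maximum(rng.normal(0.2, np.sqrt(0.1), J), 0.0)

def r_bar(t):
    """Average recovery ICC (the R-bar_J of Theorem 2), built from the contaminated parameters."""
    return icc_3pl(t, a_hat, b_hat, c_hat).mean(axis=1)

print(x_bar[:5], r_bar([-1.0, 0.0, 1.0]))
```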

Finally, the choice of N = 5,000 examinees was somewhat arbitrary. In preliminary runs, N = 1,000 and N = 10,000 yielded measures of fit of the estimated ability distribution to the true distribution quite comparable to those reported here. The main difference was in the variance of our estimated measures of fit. N = 5,000 was chosen because at that level the variance is much better than at N = 1,000 and not much worse than at N = 10,000.

The basic estimators used to compare recovery of F from case to case were the empirical distribution function (EDF)
$$\hat F_{N,J}(t) = \frac{1}{N}\sum_{n=1}^{N} 1\{\hat\theta_J^{(n)} \le t\}$$
and the kernel distribution estimator (KDE)
$$\hat F_{N,J,h}(t) = \frac{1}{N}\sum_{n=1}^{N} K\!\left(\frac{t - \hat\theta_J^{(n)}}{h}\right),$$
where
$$\hat\theta_J^{(n)} = \bar P_J^{-1}\!\left(\frac{J\bar X_{nJ} + 1}{J + 2}\right)$$
(and K and h are as described in (8) and (9) above). Each of these distribution estimators is consistent for the true θ distribution, by application of Theorems 1 through 3.
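A compact sketch of these two estimators as just defined is given below; the logistic stand-in for $\bar P_J^{-1}$, the fixed bandwidth, and the helper names are purely illustrative assumptions of ours.

```python
import numpy as np

def K(t):
    """Kernel distribution function (9)."""
    t = np.asarray(t, float)
    return np.where(t < -1, 0.0, np.where(t > 1, 1.0, 0.25 * (3 * t - t ** 3 + 2)))

def estimators(x_bar, J, p_bar_inv, h):
    """Return (EDF, KDE) functions built from the rescaled scores (J*x_bar + 1)/(J + 2)."""
    theta_hat = p_bar_inv((J * np.asarray(x_bar, float) + 1.0) / (J + 2.0))
    def edf(t):
        return np.mean(theta_hat[:, None] <= np.atleast_1d(t), axis=0)
    def kde(t):
        return np.mean(K((np.atleast_1d(t)[None, :] - theta_hat[:, None]) / h), axis=0)
    return edf, kde

# Illustration only: a logistic curve standing in for the average-ICC inverse.
rng = np.random.default_rng(2)
J = 30
x_bar = rng.binomial(J, 1.0 / (1.0 + np.exp(-rng.normal(size=4000)))) / J
edf, kde = estimators(x_bar, J, lambda y: np.log(y / (1.0 - y)), h=0.3)
print(edf([0.0]), kde([0.0]))   # the KDE is a smoothed version of the EDF
```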

For each simulated data set, sample means and standard deviations for estimates of
$$\mathrm{RMS} = \left\{E\int_{-\infty}^{\infty} [F_{est}(t) - F(t)]^2 g(t)\,dt\right\}^{1/2}$$
are reported. In addition, mean estimates of
$$\mathrm{MAX} = E\left[\sup\{|F_{est}(t) - F(t)| : -\infty < t < \infty\}\right]$$
and the average value LOC = $t_{max}$ at which MAX is attained are reported. (Note: $F_{est}$ stands for either of the distribution estimators above.) In general, the weighting function g should be chosen to reflect our interests in the θ distribution F: g should give more weight to areas of F that should be well estimated and less weight to areas of F for which we are willing to tolerate less good estimation. In these simulations, the weighting function g was taken to be the standard normal density: some weight is given to estimating F well at all θ's, but more weight is given to estimating F well near θ = 0. More details about these distances and the methods of calculation can be found in Appendix A below.


Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.04655     0.00002     0.11021     0.37694
              KDE         0.02318     0.00003     0.03812     0.89134
30            EDF         0.01692     0.00001     0.04032     0.09754
              KDE         0.00887     0.00002     0.01447     0.23184
60            EDF         0.00984     0.00002     0.02510     0.07844
              KDE         0.00652     0.00002     0.01076     0.05334
100           EDF         0.00731     0.00002     0.01895    -0.02856
              KDE         0.00577     0.00002     0.00965    -0.07616

Table 2: θ ~ N(0, 1), Rasch

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.07015     0.00002     0.15724    -1.00076
              KDE         0.05158     0.00003     0.09368    -1.23646
30            EDF         0.02794     0.00002     0.06418    -0.77476
              KDE         0.02176     0.00002     0.03755    -1.26626
60            EDF         0.01521     0.00002     0.03527    -0.46316
              KDE         0.01251     0.00002     0.02109    -1.05756
100           EDF         0.01035     0.00002     0.02463    -0.33196
              KDE         0.00907     0.00003     0.01532    -0.80926

Table 3: θ ~ N(0, 1), 3PL

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.09665     0.00004     0.22175    -0.74996
              KDE         0.08412     0.00004     0.13431    -1.21956
30            EDF         0.05695     0.00004     0.11573    -0.67436
              KDE         0.05439     0.00004     0.08258    -0.89616
60            EDF         0.01835     0.00002     0.04188    -0.70396
              KDE         0.01645     0.00003     0.02802    -1.10236
100           EDF         0.01823     0.00003     0.03782    -0.49826
              KDE         0.01767     0.00004     0.02668    -0.79636

Table 4: θ ~ N(0, 1), Estimated

From Tables 2, 3 and 4, it is clear that smoothing in the KDE is helping, especially with short tests. In comparing Tables 2 and 3 it is clear that the presence of the nonzero lower asymptote c is degrading the fits. This can be seen both in the larger RMS values and in the movement of LOC, the location of the maximum deviation between $F_{est}$ and F, toward negative values. Finally, comparison of Tables 3 and 4 indicates that using "noisy" ICC's somewhat degrades the recovery of F.

Figure 1 illustrates the performance of the estimators in Table 3. The first three panels are probability-probability (p-p) plots of the estimated θ distribution (vertical axis) against the true θ distribution (horizontal axis), for 10, 30 and 60 items. Each panel depicts one of the 100 Monte Carlo trials for the corresponding line of Table 3. The step functions represent the EDF estimator and the smooth curve represents the KDE estimator. The closer each is to the solid diagonal line, the better the true probabilities of the θ distribution are estimated. In particular, for 30 or 60 items, estimated probabilities are quite close to the true probabilities. The story is very similar for the performance of the estimators in Tables 2, 5 and 6 (see also Figure 3). The fourth panel in Figure 1 compares the density derived from the KDE estimator in panel three with the true θ density (some excessive "bumpiness" in the estimated density is attributable to the fact that the "window width" h was chosen to make a good distribution estimate rather than to make a good density estimate).

[Figure 1: panels "Theta - Normal, 3PL, 10 Items", "Theta - Normal, 3PL, 30 Items", "Theta - Normal, 3PL, 60 Items" (p-p plots against the true theta distribution) and a density comparison for 60 items.]

Figure 1: p-p and density plots of EDF and KDE estimators. EDF is represented by step function, KDE by curve. In the last panel, the true density is the dashed curve and the KDE-based density estimate is the solid curve.

[Figure 2: panels "Theta - Normal, Estimated, 30 Items" (p-p plot against the true theta distribution) and the corresponding density comparison.]

Figure 2: p-p and density plots of EDF and KDE estimators. EDF is represented by step function, KDE by curve. In the second panel, the true density is the dashed curve and the KDE-based density estimate is the solid curve.

Figure 2 illustrates the performance of the estimators in Table 4. The left panel is a p-p plot of the EDF (step function) and KDE (smooth curve) estimators for 30 items, and the right panel compares the corresponding KDE-based density with the true θ density. In the Monte Carlo trial illustrated, contamination in the parameters of the "recovery" ICC's caused some bias and scale distortion in the estimated distribution, but the estimate still correctly suggests that θ has a Normal or bell-shaped distribution.

In Tables 5, 6 and 7, in which θ is bimodal, the KDE estimator is still doing better than the EDF. It is encouraging to see that the orders of magnitude of the RMS and MAX measures of fit are the same as in the N(0, 1) case above. It is a little surprising that the fits can actually be better for the bimodal cases than the normal, but perhaps the greater variability is working in our favor here: we are getting more extreme-ability examinees with which to form $\hat F$, and thus to estimate the tails of F. Finally, note that there is much less

    difference in the fits of the 3PL and 'Estimated' 3PL cases.

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.04769     0.00003     0.12379    -1.36996
              KDE         0.03678     0.00003     0.06299    -1.25226
30            EDF         0.01820     0.00003     0.04668    -0.61856
              KDE         0.01547     0.00003     0.02502    -0.42646
60            EDF         0.01107     0.00003     0.02710    -0.25206
              KDE         0.00995     0.00003     0.01622    -0.17576
100           EDF         0.00870     0.00003     0.01923    -0.03886
              KDE         0.00817     0.00003     0.01290    -0.13216

Table 5: θ ~ Bimodal, Rasch

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.05268     0.00003     0.12160     1.08084
              KDE         0.03612     0.00003     0.09342    -4.44996
30            EDF         0.02268     0.00002     0.05616    -0.66696
              KDE         0.01877     0.00002     0.04229    -3.68386
60            EDF         0.01353     0.00003     0.03496    -1.24996
              KDE         0.01205     0.00003     0.02561    -2.75386
100           EDF         0.00998     0.00003     0.02457    -1.22086
              KDE         0.00924     0.00003     0.01860    -2.64946

Table 6: θ ~ Bimodal, 3PL

Figure 3 illustrates the performance of the estimators in Table 6, for 60 items. Again, the left panel is a p-p plot of the EDF (step function) and KDE (smooth curve) estimators and the right panel depicts the KDE-based density estimate. Once again the estimated distribution provides good estimates of probabilities under the true distribution, and the corresponding density estimate tracks the two modes of the true θ distribution reasonably well.


Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         --          0.00005     0.14624     0.78714
              KDE         0.05101     0.00005     0.09497    -4.97589
30            EDF         0.03203     0.00005     0.08038    -2.37405
              KDE         0.02958     0.00005     0.06457    -3.38695
60            EDF         0.01386     0.00003     0.03747    -1.11546
              KDE         0.01245     0.00003     0.02796    -2.63776
100           EDF         0.01120     0.00004     0.02776    -1.42786
              KDE         0.01055     0.00004     0.02134    -2.29616

Table 7: θ ~ Bimodal, Estimated

[Figure 3: panels "Theta - Bimodal, 3PL, 60 Items" (p-p plot against the true theta distribution) and the corresponding density comparison.]

Figure 3: p-p and density plots of EDF and KDE estimators. EDF is represented by step function, KDE by curve. In the second panel, the true density is the dashed curve and the KDE-based density estimate is the solid curve.

In Tables 8, 9 and 10, note how gradual the decrease in MAX is; this can be attributed partly to the fact that $\hat\theta_J$ "doesn't know" that F assigns no mass to the interval $(-\infty, -1)$ and thus freely places $\hat\theta_J$'s there, so that $F_{est}$ is grossly overestimating F for θ < −1. This certainly explains why LOC is near −1 in all but one case. It seems remarkable that the RMS should drop as much as it does, considering the fact that the Normal weighting function g assigns significant weight to the region near or below θ = −1. Once again there is little difference between the 3PL and 'Estimated' 3PL cases. Finally, note that the EDF estimator is doing better than the KDE estimator in many cases here. Our ad hoc choice of h is probably failing us here by being too large to track the "sharp upturn" in F at −1.

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.09922     0.00004     0.23352    -0.26996
              KDE         0.09241     0.00003     0.20600    -1.00996
30            EDF         0.05404     0.00003     0.14608    -0.91796
              KDE         0.05508     0.00113     0.17924    -1.00996
60            EDF         0.03812     0.00003     0.15993    -1.00996
              KDE         0.04010     0.00003     0.16010    -1.00316
100           EDF         0.02944     0.00003     0.15246    -0.99996
              KDE         0.03215     0.00003     0.14717    -0.99996

Table 8: θ ~ χ²₁ − 1, Rasch

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.11871     0.00004     0.30689    -1.00996
              KDE         0.10699     0.00004     0.28934    -1.00996
30            EDF         --          --          0.22700    -1.00996
              KDE         0.071       0.00004     0.23167    -1.00996
60            EDF         0.05291     0.00003     0.20477    -1.00996
              KDE         0.05408     0.00003     0.20211    -1.00996
100           EDF         0.04153     0.00003     0.19628    -0.99996
              KDE         0.04365     0.00003     0.18294    -1.00976

Table 9: θ ~ χ²₁ − 1, 3PL

Test Length   Estimator   RMS (ave)   RMS (SD)    MAX        LOC
10            EDF         0.11387     0.00005     0.30689    -1.00996
              KDE         0.10600     0.00005     0.33073    -1.00996
30            EDF         0.08264     0.00005     0.32359    -1.00996
              KDE         0.08161     0.00005     0.30244    -1.00996
60            EDF         0.05322     0.00003     0.20477    -1.00996
              KDE         0.05466     0.00004     0.21590    -1.00996
100           EDF         0.04303     0.00004     0.20150    -1.00996
              KDE         0.04491     0.00004     0.20859    -1.00646

Table 10: θ ~ χ²₁ − 1, Estimated

5 Discussion

To implement this scheme in practice, one must numerically invert the average ICC $\bar P_J$ for the test in question at or near the J+1 possible values of $\bar X_J$. After this, a table constructed from the inversion can be used simply and cheaply to estimate θ distributions for each of several administrations of the same test, or each of several subpopulations in a single administration. For shorter test lengths the basic statistic $\bar X_J$ may need to be rescaled, as we have done with $(J\bar X_J + 1)/(J + 2)$, to effectively estimate F. Kernel smoothing of the estimated distribution (KDE) is also quite helpful. Work is currently underway (Nandakumar and Junker, 1992) to further examine and refine these methods using essentially unidimensional simulation data, and to apply the estimators to real tests.
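A sketch of this invert-once, reuse-everywhere pattern follows; the helper names, the logistic stand-in for the average-ICC inverse, and the made-up score frequency tables are illustrative assumptions of ours.

```python
import numpy as np

def theta_table(p_bar_inverse, J):
    """Tabulate theta-hat once, at the rescaled scores (j + 1)/(J + 2), j = 0, ..., J
    (equivalently (J*Xbar + 1)/(J + 2) at the J + 1 possible values of Xbar)."""
    scores = (np.arange(J + 1) + 1.0) / (J + 2.0)
    return np.asarray(p_bar_inverse(scores), float)

def group_distribution(table, counts_by_score, t):
    """F-hat(t) for one administration or subpopulation, given its number-correct
    frequency table (counts_by_score[j] = number of examinees with score j)."""
    w = np.asarray(counts_by_score, float)
    w = w / w.sum()
    return np.sum(w[:, None] * (table[:, None] <= np.atleast_1d(t)), axis=0)

# Illustration: a logistic curve stands in for the true average-ICC inverse, and two
# made-up score frequency tables stand in for two subpopulations of examinees.
J = 30
table = theta_table(lambda y: np.log(y / (1.0 - y)), J)
rng = np.random.default_rng(4)
group_a = rng.multinomial(2000, np.full(J + 1, 1.0 / (J + 1)))
group_b = rng.multinomial(1500, np.full(J + 1, 1.0 / (J + 1)))
for counts in (group_a, group_b):
    print(group_distribution(table, counts, [-1.0, 0.0, 1.0]))
```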

Because it is fast, this scheme could also be used for some diagnostic purposes. For example, if ICC's were estimated under the assumption of a Normal underlying θ distribution and a 3PL model, the KDE estimate of the θ distribution could be plotted on a Normal probability plot to examine (jointly) the assumptions about distribution and ICC forms. Or the θ distribution estimates under two ICC estimation techniques could be compared to see how well they agree: quite different ICC forms or parameter sets could in principle lead to very similar θ distributions; if so, then for many purposes it would be a matter of indifference which ICC's were used, so considerations such as cost of ICC estimation, etc., could come into play. Finally, it may be possible to estimate the θ distribution sufficiently accurately with, say, Rasch ICC's (for which item parameters can be estimated independently of the θ distribution), and then use that estimate as part of a marginal maximum likelihood approach to estimating item parameters in a 3PL model which more accurately models the item response behavior.
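One possible implementation of the Normal probability plot diagnostic mentioned above is sketched here; it is our own construction (it uses the unsmoothed estimate for brevity, and the input values are made up).

```python
import numpy as np
from statistics import NormalDist

def normal_probability_points(theta_hat, grid):
    """Pairs (t, z) for a Normal probability plot of the estimated theta distribution.

    z is the standard Normal quantile of F-hat(t); if the estimated distribution is
    roughly Normal, the points lie near a line whose slope and intercept recover
    1/sd and -mean/sd of theta."""
    theta_hat = np.sort(np.asarray(theta_hat, float))
    F_hat = np.searchsorted(theta_hat, grid, side="right") / len(theta_hat)
    F_hat = np.clip(F_hat, 1e-4, 1 - 1e-4)          # keep inv_cdf finite at the tails
    z = np.array([NormalDist().inv_cdf(p) for p in F_hat])
    return np.asarray(grid, float), z

# Illustration with made-up theta-hat values standing in for the estimates of Section 4.
rng = np.random.default_rng(3)
t, z = normal_probability_points(rng.normal(0.1, 1.2, 5000), np.linspace(-2.0, 2.0, 21))
print(np.polyfit(t, z, 1))   # slope near 1/1.2 and intercept near -0.1/1.2
```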

    References

Ash, R. (1972). Real Analysis and Probability. New York: Academic Press.

    Birnbaum, A. (1968). Some latent trait models and their use in inferring an examinee's

    ability, in Lord, F. and Novick, M. (1968). Statistical Theories of Mental Test Scores.

    Reading, Mass: Addison-Wesley.

    Hambleton, R. K. (1989). Principles and selected applications of item response theory. In

Linn, R. L. (ed.), Educational Measurement, Third Edition. New York: MacMillan, pp. 147-200.

    Hoeffding, W. (1963). Probability inequalities for sums of bounded random variables. Jour-

nal of the American Statistical Association, 58, 13-30.

Junker, B. W. (1991). Essential independence and likelihood-based ability estimation for polytomous items. Psychometrika, 56, 255-278.

    Levine, M. (1984). An introduction to multilinear formula score theory. University of Illinois

    and Office of Naval Research, Research Report 84-4.

    Lord, F. (1953). The relation of test score to the trait underlying the test. Educational and

    Psychological Measurement, 13, 517-549.

    Mislevy, R. (1984). Estimating latent distributions. Psychometrika, 49, 359-381.

Nandakumar, R. and Junker, B. W. (1992). Estimating the latent ability distribution. To

    be presented at the Annual Meeting of the Psychometric Society, Columbus OH, July

    1992.

Rubinstein, R. Y. (1981). Simulation and the Monte Carlo Method. New York: John Wiley and Sons.

Samejima, F. (1984). Plausibility functions of Iowa Vocabulary Test items estimated by the simple sum procedure of the conditional P.D.F. approach. University of Tennessee and Office of Naval Research, Research Report 84-1.

Samejima, F. and Livingston, P. (1979). Method of moments as the least squares solution for fitting a polynomial. University of Tennessee and Office of Naval Research, Research Report 79-2.

Serfling, R. (1980). Approximation Theorems in Mathematical Statistics. New York: John

    Wiley and Sons.

Silverman, B. (1986). Density Estimation for Statistics and Data Analysis. London: Oxford

    University Press.

Stout, W. F. (1987). A nonparametric approach for assessing latent trait unidimensionality.

    Psychometrika, 52, 589-617.

Stout, W. F. (1990). A new item response theory modeling approach with applications to unidimensionality assessment and ability estimation. Psychometrika, 55, 293-325.

    Appendix A. Details of the simulation

For each simulated data set, M Monte Carlo trials were run (one trial entails sampling N examinees, generating a θ and J item responses for each examinee, and constructing the distribution estimates $\hat F_{N,J}$ and $\hat F_{N,J,h}$ from these). In our simulation, M was taken to be 100. In the discussion below, $F_{est}$ stands for either of the two distribution estimates tried.

For each trial, two measures of fit to the true ability distribution F were reported. First, the value of
$$\hat\delta = \max\{|F_{est}(t_i) - F(t_i)| : i = 0, 1, \ldots, 1200\}$$
was calculated, for $t_i$'s ranging from −6 to 6 spaced at 0.01 intervals, as an approximation to
$$\delta = \sup\{|F_{est}(t) - F(t)| : t \in (-\infty, \infty)\},$$
as well as the value $\hat L = t_i$ at which $\hat\delta$ was attained. Second, an approximation to the squared distance
$$\int_{-\infty}^{\infty} [F_{est}(t) - F(t)]^2 g(t)\,dt$$
was calculated, where the weight function g was taken to be the standard normal density. The approximation used was the Monte Carlo approximation
$$\hat\gamma^2 = \frac{1}{K}\sum_{k=1}^{K} [F_{est}(T_k) - F(T_k)]^2,$$
where $T_1, \ldots, T_K$ are iid with marginal density g, and K = 500 for our simulation.
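A minimal sketch of these two approximations follows; the grid and K = 500 are from the text, while the closed-form Normal distribution functions at the end are stand-ins of ours used only to exercise the code.

```python
import numpy as np
from statistics import NormalDist

def max_and_loc(F_est, F_true, grid=None):
    """Approximate MAX = sup_t |F_est(t) - F(t)| over t in [-6, 6] at 0.01 spacing,
    and the grid point LOC at which it is attained."""
    if grid is None:
        grid = np.arange(-6.0, 6.0 + 1e-9, 0.01)
    gaps = np.abs(F_est(grid) - F_true(grid))
    i = int(np.argmax(gaps))
    return gaps[i], grid[i]

def squared_distance(F_est, F_true, K=500, rng=None):
    """Monte Carlo approximation of the weighted squared distance
    int [F_est(t) - F(t)]^2 g(t) dt, with g the standard Normal density."""
    if rng is None:
        rng = np.random.default_rng()
    T = rng.normal(0.0, 1.0, K)          # T_k iid with density g
    return np.mean((F_est(T) - F_true(T)) ** 2)

# Toy check with closed-form Normal distribution functions standing in for F_est and F.
F_true = np.vectorize(NormalDist().cdf)
F_est = np.vectorize(NormalDist(0.1, 1.0).cdf)
print(max_and_loc(F_est, F_true))
print(np.sqrt(squared_distance(F_est, F_true, rng=np.random.default_rng(5))))
```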

Finally, Monte Carlo sample averages
$$\bar\delta = \frac{1}{M}\sum_{m=1}^{M} \hat\delta_m, \qquad \bar L = \frac{1}{M}\sum_{m=1}^{M} \hat L_m, \qquad \bar\gamma^2 = \frac{1}{M}\sum_{m=1}^{M} \hat\gamma^2_m$$
were computed, as well as sample standard deviations. $\bar\delta$ estimates $E[\hat\delta]$, $\bar L$ estimates $E[\hat L]$, and $\{\bar\gamma^2\}^{1/2}$ estimates $\{E[\hat\gamma^2]\}^{1/2}$; the standard deviation for $\{\bar\gamma^2\}^{1/2}$ was estimated using the delta method (Serfling, 1980, p. 118).

$E[\hat\delta]$ may be regarded as a reasonable approximation to $\mathrm{MAX} = E[\delta]$. Because of the discretization in calculating $\hat\delta$ and $\hat L$, $E[\hat L]$ probably is not as good an indication of the true value LOC = t where the distributions are furthest apart, but it may still be of some descriptive value. Finally, $\{E[\hat\gamma^2]\}^{1/2}$ is exactly
$$\mathrm{RMS} = \left\{E\int_{-\infty}^{\infty} [F_{est}(t) - F(t)]^2 g(t)\,dt\right\}^{1/2}.$$

The pseudo-random number generators used were linear congruential generators (see Rubinstein, 1981)
$$u_{k+1} = (a\,u_k + c) \bmod m,$$
using $a = 7^5$, $c = 0$, $m = 2^{31}$ for generating θ's and $a = 2^7 + 1$, $c = 1$, $m = 2^{35}$ for generating item responses. Normal observations were obtained from these uniform observations by the polar transformation
$$Z_1 = \sqrt{-2\log U_1}\,\cos(2\pi U_2), \qquad Z_2 = \sqrt{-2\log U_1}\,\sin(2\pi U_2),$$
and the bimodal mixture and $\chi^2_1 - 1$ observations were constructed from these Normal observations.
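A sketch of the generators just described follows; the seed value is our own choice (the report does not give one).

```python
import math

def lcg(a, c, m, seed):
    """Linear congruential generator u_{k+1} = (a*u_k + c) mod m, yielding U(0, 1) draws."""
    u = seed
    while True:
        u = (a * u + c) % m
        yield u / m

def polar_pair(u1, u2):
    """Transform two uniforms into two standard Normal deviates via the formulas above."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

# Constants as in the text for generating thetas; the seed is our own choice.
uniforms = lcg(7 ** 5, 0, 2 ** 31, seed=12345)
z1, z2 = polar_pair(next(uniforms), next(uniforms))
print(z1, z2)
```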

Appendix B. Proofs

Proof of Theorem 1: Observe that, for any ε > 0,
$$P[|\hat F_{N,J}(t) - F(t)| \ge \epsilon] \le P\left[|\hat F_{N,J}(t) - F_J(t)| \ge \epsilon - |F_J(t) - F(t)|\right] \le P\left[|\hat F_{N,J}(t) - F_J(t)| \ge \epsilon/2\right] \quad (\text{for large } J)$$
$$\le C\,e^{-2N(\epsilon/2)^2}$$
for some universal constant C, and N large (Serfling, 1980, p. 59). This tends to zero as $N \to \infty$. □

Proof of Theorem 2: Observe that
$$P[\bar R_J^{-1}(\bar X_J) \le \theta] = P[\bar X_J \le \bar R_J(\theta)] = P[\bar P_J^{-1}(\bar X_J) \le \bar P_J^{-1}\bar R_J(\theta)] = P[\bar P_J^{-1}(\bar X_J) + r(\theta) - \bar P_J^{-1}\bar R_J(\theta) \le r(\theta)].$$
By Slutsky's Theorem, since $r(\theta) = \lim_J \bar P_J^{-1}\bar R_J(\theta)$, we know that $\bar P_J^{-1}(\bar X_J) + r(\theta) - \bar P_J^{-1}\bar R_J(\theta)$ and $\bar P_J^{-1}(\bar X_J)$ have the same asymptotic law, i.e. for any t,
$$P[\bar P_J^{-1}(\bar X_J) + r(\theta) - \bar P_J^{-1}\bar R_J(\theta) \le t] \to F(t).$$
Then in particular for $t = r(\theta)$,
$$F_J(\theta) = P[\bar P_J^{-1}(\bar X_J) + r(\theta) - \bar P_J^{-1}\bar R_J(\theta) \le r(\theta)] \to F(r(\theta)).$$
The assertion about uniform convergence follows from a theorem of Polya (Serfling, 1980, p. 18). □

Proof of Theorem 3: In the following calculation, it will be helpful to let Y be a random variable with distribution K, independent of θ and all item responses. Squaring (6),
$$\mathrm{RMS}^2 = E\int_{-\infty}^{\infty} [\hat F_{N,J,h}(t) - F(t)]^2 g(t)\,dt = \int_{-\infty}^{\infty}\left\{[\operatorname{bias}_{J,h}(t)]^2 + \operatorname{variance}_{N,J,h}(t)\right\} g(t)\,dt,$$
with
$$\operatorname{bias}_{J,h}(t) = E\,\hat F_{N,J,h}(t) - F(t) = P[\bar P_J^{-1}(\bar X_J) + hY \le t] - P[\theta \le t]$$
and
$$\operatorname{variance}_{N,J,h}(t) = \operatorname{Var}\,\hat F_{N,J,h}(t) = \frac{1}{N}\operatorname{Var}\left[K\!\left(\frac{t - \bar P_J^{-1}(\bar X_J)}{h}\right)\right],$$
so that
$$\mathrm{RMS}^2 = \int_{-\infty}^{\infty}\left\{P[\bar P_J^{-1}(\bar X_J) + hY \le t] - P[\theta \le t]\right\}^2 g(t)\,dt + \frac{1}{N}\int_{-\infty}^{\infty}\operatorname{Var}\left[K\!\left(\frac{t - \bar P_J^{-1}(\bar X_J)}{h}\right)\right] g(t)\,dt \equiv (\operatorname{bias})^2_{J,h} + (\operatorname{variance})_{N,J,h}.$$

Note that $(\operatorname{bias})^2_{J,h}$ does not depend on N. As long as
$$E|Y| = \int |u|\,dK(u) < \infty,$$
we will have $hY \to 0$ in probability, so that by Slutsky's Theorem the distributions of $\bar P_J^{-1}(\bar X_J) + hY$ and $\bar P_J^{-1}(\bar X_J)$ will converge to the same thing, namely $F(t) = P[\theta \le t]$, at every t (we are assuming F is continuous) as $J \to \infty$ and $h \to 0$. Hence the integrand of $(\operatorname{bias})^2_{J,h}$ converges to zero at each t, and if g(t) is a density it follows that $(\operatorname{bias})^2_{J,h} \to 0$ as $J \to \infty$ and $h \to 0$ (and N is free).

On the other hand, for each fixed J, h and t the random variable
$$K\!\left(\frac{t - \bar P_J^{-1}(\bar X_J)}{h}\right)$$
is bounded between 0 and 1; hence if g(t) is a density we have for each fixed J and h
$$\int_{-\infty}^{\infty} \operatorname{Var}\left[K\!\left(\frac{t - \bar P_J^{-1}(\bar X_J)}{h}\right)\right] g(t)\,dt \le 1.$$
Multiplying by 1/N, it is clear that $(\operatorname{variance})_{N,J,h} \to 0$ as $N \to \infty$, uniformly in J and h. This proves Theorem 3. □
