Behavior Research Methods & Instrumentation
1976, Vol. 8 (2), 129-138

SESSION VII
CONTRIBUTED PAPERS: EVALUATIONS OF
ON-LINE COMPUTER APPLICATIONS
IN PSYCHOLOGY

FRANK RESTLE, Indiana University, Presider

The impact of computers on cognitive psychology

DORIS AARONSON and EDWARD GRUPSMITH
New York University, New York, New York 10003

and

MAY AARONSON
National Institute of Mental Health, Bethesda, Maryland 20014

Computer developments and their applications in cognitive psychology are reviewed. Examples from recent studies illustrate the ways that computers are used for different research purposes: stimulus generation, on-line interactive experimental control, response collection, data analysis, and theory building. A quantitative analysis of federal funding for computer-based and noncomputer research compares costs over the past 9 years for the areas of perception, memory, learning, and thinking. A tabulation of journal articles relevant to computer-based cognitive research shows the distribution of articles over various categories of hardware and software development. Finally, the advantages and disadvantages of using computers in cognitive research are evaluated.

HISTORICAL OVERVIEW

In evaluating the impact of computers on cognitive psychology, it is reasonable to begin by examining the past history and trends. The history of computers in cognitive psychology is short and recent, about 25 years. The most striking aspect is the increasing speed with which computers have been incorporated into all phases of cognitive research.

Figure 1 illustrates a "cumulative record" of landmark events in the development of computers and their application in cognitive psychology. The abacus, perhaps the first computing device, was in use some 5,000 years ago. It used a bi-quinary number system that was later used on the console display of the IBM 650. The first mechanical computer was built by Pascal, and a better device was built by Leibnitz in 1673. Charles Babbage designed the first large computer, the difference engine, in England in 1812, to calculate tables of mathematical functions. In 1833, he planned the analytical engine, the forerunner of modern general-purpose computers. Although this machine was never constructed, its design innovation

The preparation of this paper and some of the data in Section II were supported in part by Grant MH 16496 to the first author. The financial data for Section III were provided by the National Institute of Mental Health. The authors thank Betty Pickett, John Hammack, Niles Bernick, Anne Bergquist, Jane Day, Helen Scheuch, and Ethel Rosen of the National Institute of Mental Health for administrative and technical help. We thank Don Norman, Doug Ohman, and Elliott Epps for helpful comments on earlier drafts. Reprint requests should be sent to Dr. Doris Aaronson, Psychology Department, Room 858, 6 Washington Place, New York University, New York, New York 10003.

was the use of preprogrammed sequential control of arithmetic operations. About 50 years later, in 1889, Dr. Herman Hollerith patented the Hollerith punch card, which IBM used when it was founded in 1911. As psychologists, we should note Hollerith's early use of punched data with an automatic preprogrammed machine to tally social and behavioral census data.

The next set of breakthroughs occurred after a 30- to 40-year gap. In the 1930s, Dr. Howard Aiken described the first modern preprogrammed computer, the Mark I automatic sequence controlled calculator, completed in 1944 at Harvard University. It used punched paper tape input and had a memory consisting of thousands of electromagnetic relays. Two years later, the ENIAC (electronic numerical integrator and computer), designed by J. P. Eckert and J. W. Mauchly, was completed at the University of Pennsylvania. Its important contribution was operating speed, achieved by electronic internal operation. Two additional hardware innovations were made in 1945 by Dr. John Von Neumann. He proposed to use binary rather than decimal numbers and to store the program instructions in memory. Until that point, control sequencing was external: dials, cards, paper tape, or plugboards. Memory was used only to store data. By 1949, the EDVAC (electronic discrete variable automatic computer) and the EDSAC (electronic delay storage automatic computer) were built using Von Neumann's principles, as well as memory made with ultrasonic-speed mercury-delay lines.

During the next decade, in fact, during the very




next year, computers were applied in cognitive psychology. In 1950, Dr. A. M. Turing wrote a paper discussing thought, intelligence, and learning, and the differences between man and machine in these respects. The marriage between artificial intelligence and computer simulation in cognitive psychology was manifested in A. G. Oettinger's 1952 paper, "Programming a Digital Computer to Learn." That paper presents a computer simulation of "a small child sent on a shopping tour" that incorporates many of our current theoretical concepts of information processing, as well as much of our current jargon to describe this processing. First, Oettinger divided EDSAC's memory into two parts: "one to play the role of the experimental subject ... and the remainder to serve as an extension of the outside (stimulus) world." He then proceeded to describe psychology simulation "experiments" on this "child machine." This simulated child could "memorize" an input shopping list and later recall or "recite" it on the teleprinter. Further, it could translate the shopping list into behavioral acts. At first, the "child machine" shopped for items in its path of stores in a random trial-and-error fashion. However, the "child machine" could learn in five different ways. It could store information about accessed shops for use on subsequent trips; it was capable of incidental learning about items it passed but didn't buy; it could rearrange its memory into a more meaningful organization; it could determine which memory-scanning methods were appropriate for particular situations; and finally, stored items could be recoded to make direct-access retrieval possible in addition to search processes. Realistically, the "child machine" achieved the ultimate measure of human capabilities by acquiring subroutines for memory decay and forgetting.
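The flavor of Oettinger's design can be conveyed in a few lines of modern code. The sketch below is our own illustrative reconstruction in Python, not Oettinger's EDSAC program; the shop names, class, and function names are all hypothetical. It shows two of the five learning modes, direct-access retrieval of learned locations and incidental learning about items passed but not bought, layered on top of random trial-and-error search.

```python
import random

# The "outside (stimulus) world": shops and what each one stocks (hypothetical data).
WORLD = {"baker": {"bread"}, "grocer": {"milk", "eggs"}, "chemist": {"soap"}}

class ChildMachine:
    def __init__(self):
        self.known_shops = {}   # learned: item -> shop (direct-access retrieval)

    def shop_for(self, item):
        """Buy one item; return (shop, number of shops searched)."""
        if item in self.known_shops:              # learned on an earlier trip:
            return self.known_shops[item], 0      # no search needed
        visits = 0
        shops = list(WORLD)
        random.shuffle(shops)                     # random trial-and-error search
        for shop in shops:
            visits += 1
            for stocked in WORLD[shop]:           # incidental learning: remember
                self.known_shops[stocked] = shop  # items seen but not bought
            if item in WORLD[shop]:
                return shop, visits
        raise LookupError(item)

child = ChildMachine()
child.shop_for("milk")                 # first trip: random search
shop, visits = child.shop_for("eggs")  # "eggs" was noticed at the grocer in passing
print(shop, visits)                    # -> grocer 0
```

Because milk is stocked only at the grocer, the first trip must visit the grocer and incidentally learn where the eggs are, so the second request is answered from memory without searching.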

The next set of major developments in simulating human cognitive processes came from the laboratories at Carnegie-Mellon University. In a series of papers beginning in 1955, Newell, Shaw, and Simon conceptualized some theoretically important heuristic procedures in their programs that solved logic problems and played chess. This significant research on thinking continues today at Carnegie in ever-expanding and creative ways. Their major recent contribution is methodological: the development of general-purpose simulation programming languages

Figure 1. A record of events which, in our opinion, were critical to the development of computers and to their application in cognitive psychology.

for cognitive processes. Such general-purpose languages, as opposed to individual programs, give theorists a new degree of flexibility and speed in implementing their ideas.

Also, during the 1950s, the on-line real-time computer-based laboratory was being developed. At first, these laboratories used analog computers, but by the mid-1960s, digital computers had taken over. In the early 1950s, Dr. Franklin V. Taylor of the U.S. Naval Research Laboratory developed one of the first (analog) computer-based cognitive research labs (Chernikoff, Birmingham, & Taylor, 1955; Birmingham & Taylor, Note 1). Their computers were used to generate moving visual stimulus targets having complex and dynamically changing attributes of position, velocity, acceleration, brightness, and temporal blanking. Stimulus changes could be preprogrammed, or else could be a function of the subject's pursuit or compensatory tracking performance. Once the subjects' abilities and limitations were understood, their performance could be improved by providing them with immediate and ongoing feedback, and by letting the computer take over some of their cognitive activities. In particular, marked improvements were obtained if the computer took over some of the process of integrating the stimulus information over time, and also if the computer helped the subject estimate and predict the stimulus target path several seconds or several minutes ahead of time. Based on that research, computers now aid ship pilots by performing the following functions: unburdening, quickening, feedback, and feedforward.
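The "feedforward" aid, predicting the target path ahead of time for the subject, can be sketched with the simplest possible predictor: linear extrapolation from the last two samples. This is our own minimal illustration in a modern language under a constant-velocity assumption, not the Naval Research Laboratory's analog implementation; the function name and sample data are hypothetical.

```python
def predict_ahead(positions, dt, lead_time):
    """Feedforward aid: extrapolate the target's future position from its
    last two samples, assuming constant velocity over the lead interval."""
    x_prev, x_now = positions[-2], positions[-1]
    velocity = (x_now - x_prev) / dt
    return x_now + velocity * lead_time

# A target sampled once per second, moving 2 units per second:
track = [0, 2, 4, 6]
print(predict_ahead(track, dt=1, lead_time=2))  # -> 10.0
```

A real tracking aid would smooth over more than two samples and model acceleration, but the principle, displaying an estimate of where the target will be rather than where it is, is the same.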

In the mid-1960s, the first major digital computer labs for cognitive psychology were being developed. The labs were built around various machines, including the IBM 1800, the TX-2, the Digital Equipment PDP-1 and 4, and the Honeywell DDP-116. These early labs included Miller, Norman, and Bregman at Harvard, Restle at Indiana, Uttal at Michigan, Suppes and Atkinson at Stanford, Hayes and Rubenstein at Hanscom Field, Yntema at Lincoln Labs, Licklider at Bolt, Beranek & Newman, Swets at MIT, and Newell, Simon, Green, and Gregg at Carnegie. These early labs were extraordinarily productive for two reasons. First, they were dedicated to turning out high-quality psychological research rather than to developing bigger and better computer



[Figure 2 panel labels: "Computer-generated random-dot patterns. When the 2 images are viewed in a stereoscope, the 3-D image below is seen." and "On-line stimulus control & eye-movement recording system."]
Figure 2. Left: Stimuli for stereospecific depth perception experiments (Julesz, 1971). Right: Flow chart for computer system that records eye movements and modifies a CRT display contingent on the subject's eye movements (Loftus et al., 1975).

systems. For example, Miller, Norman, and Bregman ran their first subject in the Grammarama psycholinguistic experiment the day after their computer was installed. Second, many of these labs were quick to develop general-purpose programs and programming languages for cognitive research at a time when other labs were writing individual programs anew for each separate experiment. Again, Harvard's Center for Cognitive Studies provided a good example. In that lab, by 1966, Forsyth's Lexigraph Interpreter had enabled 20 new computer users to run cognitive experiments using auditory and visual displays and to get their experiments working in a few weeks without acute trauma or chronic computeritis.

As a last historical comment, it is of interest to note the point at which we began formal computer education for graduate students in cognitive psychology. Such education began only a few years after cognitive psychologists began to use computers in research. It is appropriate to mention a few texts which appeared early on and which had a major impact on training researchers in cognitive psychology. In 1960, Miller, Galanter, and Pribram published Plans and the Structure of Behavior. This book incorporated conceptual ideas about computer functioning as theoretical constructs in psychological theories of information processing. In 1961, Holland and Skinner published The Analysis of Behavior, which provided much of the conceptual foundation

for research in computer-aided instruction and human learning. In 1963, Bert Green wrote Digital Computers in Research, the first text for behavioral and social scientists on all aspects of computer usage. Also, in 1963, Feigenbaum and Feldman edited Computers and Thought, which provided exciting examples of computer simulation in perception, memory, learning, decision making, concept formation, and problem solving. Finally, in 1967, William Uttal published Real-Time Computers, which served as the major text for training students to use computers as the central component in the cognitive research laboratory. That brings our historical review of computers in cognitive psychology almost up to the present.

QUALITATIVE EXAMPLES OF COMPUTER USAGE
IN COGNITIVE RESEARCH

We feel that computers have aided research in three different ways: (1) they have increased the ease and flexibility with which we can do research, (2) they have increased the precision and the reliability of our experiments, and (3) they have made possible various experimental procedures that were otherwise impossible or almost impossible. To illustrate these points, let us examine a few examples from each of three areas of cognitive research.

Figure 2 illustrates two ways in which computers are used in perception research. On the left is an



[Figure 3 graphics: schematic spectrograms of liquid stimuli (frequency plotted against time in msec for /r/-/l/ stimulus contrasts), and word-by-word reading times plotted against serial position in the sentence for recall and comprehension conditions.]

Figure 3. Top: Representation of computer-generated synthesized speech stimuli (Cutting, 1975). Bottom: Word-by-word reading times for computer-controlled psycholinguistics study (Aaronson & Scarborough, 1976).
example of random dot stereograms developed by Bela Julesz (1971). The computer generates these matrices according to programmed conditional probability rules such that either matrix alone appears random and two-dimensional. However, when the two are viewed together stereoscopically, a three-dimensional pattern emerges. Such matrices have provided a valuable tool to study the role of various stimulus factors in depth perception, to shed light on several theoretical issues concerning the nature of stereopsis, and to establish the relationship between pattern perception and depth perception. Recently, Julesz has produced stereograms that move dynamically, in order to study the relationships between motion and depth perception. The stereograms provide a research tool that is unique, and the generation of these stimuli involves so much computation, as well as spatial and temporal precision in the displays, that it is really not possible to produce them without a computer.
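The classic construction behind such a pair (one of several Julesz used; our Python sketch below is purely illustrative, and all names and parameters in it are hypothetical) is simple: fill one matrix with random dots, copy it, and shift a central region horizontally in the copy, refilling the uncovered strip with fresh random dots. Either matrix alone is random; fused in a stereoscope, the shifted region's disparity is seen as depth.

```python
import random

def stereogram_pair(size=20, region=(6, 14), disparity=2, seed=1):
    """Build a left/right random-dot pair: the right image equals the left
    except that each row's central region is shifted `disparity` cells,
    which the visual system reads as depth when the pair is fused."""
    rng = random.Random(seed)
    left = [[rng.randint(0, 1) for _ in range(size)] for _ in range(size)]
    right = [row[:] for row in left]
    lo, hi = region
    for row in right:
        row[lo - disparity:hi - disparity] = row[lo:hi]   # shift the region
        # refill the uncovered strip with fresh random dots:
        row[hi - disparity:hi] = [rng.randint(0, 1) for _ in range(disparity)]
    return left, right

left, right = stereogram_pair()
# Inside the shifted region, the images are correlated at the disparity offset:
assert all(right[r][8] == left[r][10] for r in range(20))
```

The surround is identical in both images (zero disparity), so only the shifted central square floats in depth; no monocular contour betrays it.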

On the right of this figure is a flow chart of a computer-controlled visual display and eye-movement recording system developed by Geoffrey Loftus (1975). Prior to the use of computers, eye movements could be recorded on film or other devices and later analyzed manually. But these noncomputer methods of analysis are so tedious, time-consuming, and open to various types of error that it is simply not possible to run experiments that require the collection of very much data. Further, the computer system in this figure makes it possible to alter the stimulus display on line, contingent on various aspects of the subject's eye-movement patterns. Loftus and others have already made important empirical and theoretical contributions to the study of pattern and motion

perception that would not be possible without computers.

Figure 3 illustrates two uses of computers in psycholinguistics. The top portion represents a speech spectrogram painted for use in the speech synthesizer programs of Cutting (1975) at the Haskins Laboratory in Connecticut. More recently, researchers have been producing speech by incorporating electronic speech synthesizers directly into the computer hardware (as at the Bell Laboratories), and also by using programs to edit natural tape-recorded speech; for example, to provide uniform compression or expansion, or to selectively delete vowels, peaks, particular frequencies, or transients. Comparing spectrographs of real and synthesized speech has yielded theoretical insights regarding the nature of speech production.

On the bottom of this figure are some data collected from a computer-controlled subject-paced reading experiment in our laboratory (Aaronson & Scarborough, 1976). The computer recorded the word-by-word reading times for subjects reading sentences for later recall or comprehension tests. The figure shows prolonged pauses at phrase boundaries for the recall subjects but not for the comprehension subjects. The on-line real-time interaction needed between the subject, the displays, and the timer, plus the resulting volume of data, would make it prohibitive to run this type of study without a computer.
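The control logic of a subject-paced reading task is compact: display one word, start a clock, wait for the pacing keypress, record the elapsed time, and advance. The schematic Python version below is our own sketch, not the laboratory's original program; the keypress source is abstracted into a callback, and the function name is hypothetical.

```python
import time

def run_self_paced_reading(sentence, wait_for_keypress):
    """Present a sentence word by word; return the words and the per-word
    reading times in seconds. `wait_for_keypress` blocks until the subject
    presses the pacing key."""
    words = sentence.split()
    times = []
    for word in words:
        print(word)                 # stand-in for the CRT display
        start = time.perf_counter()
        wait_for_keypress()         # subject reads, then presses the key
        times.append(time.perf_counter() - start)
    return words, times

# Simulated subject who "reads" each word for about 10 msec:
words, times = run_self_paced_reading(
    "The old man the boats", lambda: time.sleep(0.01))
assert len(times) == len(words) == 5
```

Prolonged times at phrase boundaries then fall directly out of the `times` list, with no hand timing or film scoring.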

Figure 4 illustrates the use of computers in some recent memory and learning experiments. The left side of this figure illustrates one of Patrick Suppes' (1968) projects. A first-grade pupil is shown two possible answers to an arithmetic problem displayed at the top of the computer oscilloscope. As he watches, verbal instructions are communicated



[Figure 4 right-panel graphic: recall measures plotted against serial position, comparing a post-list feedback condition with an instantaneous letter-feedback condition.]
Figure 4. Left: A pupil solving arithmetic problems in a computer-aided instruction project (Suppes, 1968). Right: Recall times and errors for a computer-controlled serial recall task in which the nature and delay of feedback were varied (Aaronson & Grupsmith, Note 2).

through earphones, and he signals his choice of answer to the computer with the machine's light pen, or he might type a number on the keyboard. This computer system makes possible individualized instruction, as the arithmetic problems are selected by branching subroutines based on the pupil's past learning performance. The large computer memory can store both a large body of curriculum material and detailed learning histories for a large number of students. Individual performance information is given to the pupil in an interactive system as immediate feedback after each problem, and statistics on both individual and class performance are computed for the teacher. On the right of this figure are some data from our own computer-based learning lab. In multi-trial serial recall tasks, subjects received two kinds of information feedback from the computer. Some subjects were shown how many letters they recalled correctly, immediately after they typed the entire list on each trial. Other subjects heard a tone that signaled accuracy information instantaneously, letter by letter, while their fingers were still typing each individual key. Such instantaneous reinforcement not only improved subsequent learning of the reinforced letters, it also changed the subject's cognitive strategy for coding the entire list. As the time and error data on this slide show, instantaneous reinforcement encouraged subjects to chunk the verbal items into subgroups. But subjects with "normal" postlist reinforcement showed far less organized performance. A computerized teacher was absolutely necessary to analyze the subjects' errors on-line and to provide instantaneous reinforcement to improve learning in these ways.
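The instantaneous-feedback condition reduces to a per-keystroke comparison: as each letter is typed, check it against the target list and signal accuracy at once, rather than scoring the whole list after entry. The sketch below is our own schematic reconstruction of that scoring logic, not the laboratory program; the tone output is abstracted into a callback, and all names are hypothetical.

```python
def serial_recall_trial(target, typed, on_letter_feedback=None):
    """Score a serial-recall response. If `on_letter_feedback` is given,
    call it after every keystroke (the "instantaneous" condition);
    otherwise return only the post-list total (the "normal" condition)."""
    correct = 0
    for position, letter in enumerate(typed):
        hit = position < len(target) and letter == target[position]
        correct += hit
        if on_letter_feedback:
            on_letter_feedback(position, hit)   # e.g., sound the accuracy tone
    return correct

tones = []
n = serial_recall_trial("BKQRT", "BKQXT", lambda pos, hit: tones.append(hit))
assert n == 4 and tones == [True, True, True, False, True]
```

The distinguishing requirement is latency: the comparison and tone must occur while the finger is still on the key, which is why only an on-line machine can run this condition.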

Our discussion so far has shown that computers are involved in all phases of psychological research: (1) experimental design, enabling new types of paradigms and complex counterbalancing and randomization of conditions, (2) generating, coordinating, sequencing, and calibrating stimulus displays for various sensory modalities, (3) collecting response data for discrete and continuous responses as well as temporal information about those responses, (4) performing data analyses from both on- and off-line experiments, including descriptive trends as well as statistical tests, and finally (5) theory development in the form of simulation models, parameter estimation, and goodness-of-fit tests for algebraic models, as well as contributing conceptual ideas for information-processing theories of cognition.

QUANTITATIVE INFORMATION ON
COMPUTER USAGE IN COGNITIVE RESEARCH

The examples we have considered lead us to ask some further questions about the full impact of computers on cognitive psychology. How typical are these particular examples? To what extent are computers really being used in cognitive psychology today? How does computer usage differ among the various sub-areas of cognitive psychology: perception, memory, learning, and complex thought processes? In what ways do research costs differ between computer-based and noncomputer projects? To answer these questions, we examined data on research funding from the National Institute of Mental Health (NIMH). In particular, we used the budget information from proposals funded during the past 9 years from two NIMH sections that handle many of the cognitive grants: the Personality and Cognition Section and the Experimental Psychology Section of the Behavioral Sciences Research Branch. We should be cautious about drawing generalities from the data we will see in the next few slides, as they do not include many other agencies that fund cognitive research, such as the National Science Foundation,



Figure 5. Funding trends during the past 9 years for computer-based and noncomputer cognitive research projects assigned to two NIMH review sections. (A) The number of grantees funded. (B) The mean size of individual grants. (C) The total amount of funds awarded to all grants in each category.

other programs within NIMH, and the National Institutes of Health, as well as private foundations and direct university support.

To provide quantitative analyses of computer and noncomputer cognitive research, we used three types of data. (1) The NIMH key-word abstracts describing the research projects were used as a basis for classifying projects into cognitive and noncognitive research, and then into four sub-areas of cognition. For these analyses, our definition of cognition was restricted to research with human subjects. Two of us independently categorized these abstracts, and our agreement was high: we agreed 97% of the time in the cognitive/noncognitive sorting, and 93% of the time in classifying projects as focused primarily on perception, memory, learning, or complex thought processes. We defined this last category, "thinking," to include problem solving, decision making, concept formation, and psycholinguistics. (2) We used the yearly budget sheets, along with the abstracts, in order to classify the projects as computer or noncomputer for any given year. Computer projects were identifiable by budget items such as programmers' salaries, computer hardware, supplies, service contracts, and machine time costs. We labeled as computer-based research those projects that used computers heavily for functions other than routine statistics, e.g., in generating stimuli, in controlling on-line labs, and in doing algebraic and simulation model development. (3) We used the award notices along with the budget sheets, in order to compute the amount of funds awarded for various types of research costs such as personnel or equipment.
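The agreement figures reported above are simple proportions: the fraction of abstracts on which the two raters' labels coincide. A minimal sketch, with illustrative labels rather than the original NIMH data:

```python
def agreement(labels_a, labels_b):
    """Proportion of items on which two raters' category labels coincide."""
    assert len(labels_a) == len(labels_b)
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

# Hypothetical sub-area labels from two independent raters:
rater1 = ["memory", "perception", "thinking", "memory", "learning"]
rater2 = ["memory", "perception", "thinking", "memory", "thinking"]
print(agreement(rater1, rater2))  # -> 0.8
```

Raw percent agreement of this kind is what the 97% and 93% figures report; it does not correct for chance agreement.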

Figure 5 compares the trends over the last 9 years for computer and noncomputer cognitive research. Panel A indicates that there has been a strong and continuing shift in the number of grantees doing computer and noncomputer research. The top line on the graph indicates that the total number of grantees remains roughly constant throughout, so this shift represents a true trade-off. Over the past 9 years, the proportion of grantees heavily involved with computers increased from 15% to 46% of the total number of (funded) grantees. In Panel B, the average

size of an individual grant, in thousands of dollars, has been graphed. We see throughout that the typical computer-based project costs a good bit more than a noncomputer project. The cost differences narrowed somewhat between 1967 and 1973, but appear to be diverging again. An inspection of more detailed data suggests that this recent divergence might be attributed to increases in the number of computer peripherals (e.g., display terminals) and in the increased costs of service contracts, rather than to programmers' salaries or computer supplies. Panel C shows the total number of dollars (in millions) awarded to computer and noncomputer projects. There is a strong and consistent shift from noncomputer to computer funding. For this sample, over the past 9 years, the proportion of awarded dollars for computer-based research increased from 28% to 57% of the funds spent on cognitive research.

Figure 6 compares the trends in computer-based research among the four sub-areas of cognitive psychology. The Y axes for all of these graphs reflect the percentage of computer-to-total (computer plus noncomputer) data within each sub-area of cognition. There is a pair-wise split: memory and learning show similar trends, and perception and thinking appear more similar to each other. In Panel A, memory and learning start with a higher percentage of computer grantees but end with a lower percentage than perception and thinking. Memory and learning increased their computer grantees from 20% to only 30%, while perception and thinking went from 10% to 60%. Panel B shows that computer grants in memory and learning cost a good bit more than noncomputer grants. But, in perception and thinking for the past 5 years, the grant costs have been almost the same for computer and noncomputer projects; the curves are close to the 50% line. This cost difference holds on an absolute basis also: during the past 5 years, computer-based grants in memory and learning have averaged $35,000, while noncomputer grants in those areas, as well as all grants in perception and thinking, have been about $25,000. Panel C shows the percentage of the total awarded funds in each sub-area going to computer, relative to noncomputer, research. Again, perception and thinking show the



[Figure 6 here: three panels, (A) No. of grantees, (B) Mean grant size, (C) Total awarded $, each plotting the proportion of the total cognitive sample that is computer related, C/(C+N), against fiscal year 1967-1975, with curves for perception (P), thinking (T), memory (M), and learning (L).]

Figure 6. A comparison of funding trends in computer-based and noncomputer research among four sub-areas of cognitive psychology: perception (P), thinking (T), memory (M), and learning (L). (A) The proportion of grantees that are computer-based. (B) The ratio of the mean computer grant size to the sum of the mean computer and noncomputer grant sizes. (C) The proportion of the total funds awarded to cognitive projects that went to computer-based cognitive projects.

Figure 7. The percentage of the average grantee's budget spent for personnel (P), equipment (E), supplies (S), and other (O) costs for computer-based and for noncomputer projects in each of four sub-areas of cognitive psychology.

systems development, hardware and software, are relevant to research in all sub-areas of cognitive psychology. For example, these papers might deal with general-purpose interface circuitry or with user-oriented programming languages. On the right are more special-purpose papers: statistical programs, theoretical papers dealing with simulation and model testing, and papers dealing with specific types of experiments in one or another sub-area of cognitive psychology. The experimental papers included both hardware and software development, for example, programs to generate random-dot stereograms or circuitry to record eye movements on line. Both the top and bottom graphs tell the same logical story about computer developments over the past several years, and the data are probably relevant to all of experimental psychology in general. In the earlier years, most of the papers involved systems developments. At first most reports concerned computer hardware, as many of us were building new computer-based labs at the end of the 60s. Once the computers were working and able to control on-line experiments, we discovered the advantages of developing general-purpose languages rather than

greatest increase over the past 9 years, while memory and learning remain somewhat more stable.

Figure 7 shows how grant funds are distributed among expense categories. We have graphed the percentage of the average grantee's budget spent for personnel, equipment, supplies, and other expenses. The category "other" on NIMH budget forms includes subject fees, equipment maintenance costs, and computer time. The four categories in Figure 7 constitute 97% of the grantee's budget. The remaining 3% were omitted from this analysis and include the costs of travel, consultants, publications, and renovations, which were all about equal among computer and noncomputer projects. The data are strikingly consistent. For all sub-areas of cognition, computer-based researchers, indicated by open bars, spend proportionately less for personnel and more for the other three categories than their noncomputer colleagues. This graph indicates a clear man-machine trade-off, which holds on an absolute as well as a relative basis. Computer-based projects spend fewer dollars on people and more dollars on machines, their supplies, and service costs. This picture is actually an underestimate of the trends, as many computer labs in our sample are also supported by university funds and by equipment grants from other agencies. These data on NIMH research funding hold an important place in any analysis of the impact of computers on cognitive psychology.

Now that computers have invaded our cognitive labs, what have we been doing with them over the past several years? To partially answer this question, we used as a data base the computer articles of Behavior Research Methods & Instrumentation since its inception in September, 1968. Two of us tagged the articles broadly relevant to cognitive psychology and then sorted them into five piles as to content. Again, our agreement was high, 93% for all stages of sorting combined. Figure 8 shows the results of this tabulation. The top and bottom graphs show the number and the percentage of papers in five categories. On the left, the two categories of general


ADVANTAGES AND DANGERS OF COMPUTERS IN COGNITIVE RESEARCH

Dangers

So far we have focused on how computers have been used in research, with an optimistic view of their successes. For a moment, let us examine four classes of problems that have been encountered in a number of computer-based labs: money, time, systems functioning, and experimenter judgment.

Money. Although the cost of computer systems has been decreasing over the past few years because of engineering developments and market demands, a computer lab is still a good bit more expensive than most labs with more traditional equipment. In addition to the hardware, companies are now selling, rather than giving away, their systems software. Environmental modifications for improved temperature and humidity control are often needed and are costly, and programmers' salaries must be added to research costs. Although most hardware costs are decreasing, repair and maintenance costs are rising sharply, as are the costs of interfacing stimulus display and response devices. Some labs have made a major error in purchasing a computer from a one-time-available source of funds and then not being able to support it financially.

Time. Four types of time problems are encountered. The time to learn to use the machine is generally far longer than for more traditional equipment. The time to write the initial experimental programs and user languages can be long, and often exceeds a year. Down-time is often long, and generally not under the user's control. Finally, the fact that computer labs are often shared among researchers means that available computer time may be limited.

Systems problems. When more traditional lab equipment is not working properly, we generally spot it immediately and clearly, but computer hardware and software failures often take a long time to detect. There are at least two types of systems problems that are very insidious. First, company-supplied software and/or hardware may not function in the way described by the manuals. Random-number generators may not be very random, and programmable clocks may or may not count the overflow bit when timing displays and response latencies. Statpacks and experimental programs obtained from other researchers may have all manner of undetected "bugs," and programs that run well on one machine may have different timing and reliability properties on another machine. A second class of systems problems arises from the fact that equipment failures often occur gradually, in small amounts, or at infrequent intervals, making their detection difficult. This is particularly true for analog equipment and for many peripheral I/O devices.

As psychologists, we are particularly vulnerable to these classes of problems. Somehow many of us have a religious belief in the infallibility of computer systems, but man frequently develops beliefs about things he does not fully understand! Coupled with this faith goes a negligence in checking and calibrating systems at frequent intervals, as we would do with more standard lab equipment.
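The kind of periodic randomness check called for above can be sketched with a simple chi-square test of uniformity. This is a modern illustration in Python, not anything from the original labs (which would have coded such a check in assembly or FORTRAN), and the bin count and sample sizes are arbitrary choices:

```python
import random
from collections import Counter

def chi_square_uniformity(samples, bins=10):
    """Chi-square statistic comparing observed bin counts of values in
    [0, 1) against the counts expected under a uniform distribution.
    For 10 bins (9 degrees of freedom), the 5% critical value is about 16.9."""
    counts = Counter(min(int(x * bins), bins - 1) for x in samples)
    expected = len(samples) / bins
    return sum((counts.get(b, 0) - expected) ** 2 / expected for b in range(bins))

# A perfectly even sequence yields a statistic of 0 ...
print(chi_square_uniformity([i / 1000 for i in range(1000)]))  # 0.0

# ... while a "stuck" generator that always returns 0.05 fails badly.
print(chi_square_uniformity([0.05] * 100))  # 900.0

# Running the same check periodically on the lab's own generator
# (here Python's, seeded for reproducibility) flags gross failures.
random.seed(42)
stat = chi_square_uniformity([random.random() for _ in range(10_000)])
print(stat < 16.9)
```

A statistic far above the critical value is the kind of gross non-randomness the authors warn about; subtler defects (e.g., serial correlation) need additional tests.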

Experimenter judgment. A computer is quite capable of leading users astray from doing high-quality psychology research. In some cases, the psychologist becomes intrigued with the intellectual

[Figure 8 here: the number (top) and percentage (bottom) of papers, plotted against issues of BRMI (68-70, 71-72, 73-74), with the general systems categories (hardware, software) on the left and the special-purpose categories (statistics, experimental, theory) on the right.]

Figure 8. The number (top) and the percentage (bottom) of BRMI papers in each of five categories relevant to computer-based cognitive research. For each 2-year period, the percentages for all five categories combined sum to 100%, even though two categories are graphed on the left and three on the right.

writing separate programs for each individual experiment. More recently, however, cognitive psychologists have been developing statistical and theoretical programs to analyze and interpret the data directly on their laboratory computers, or over the newly installed lines connecting these labs to the large university computer centers. Recent special-purpose experimental papers frequently deal with complex methodology needing quite an array of peripheral equipment, such as eye-movement recorders or speech synthesizers.

The data reported from the two NIMH sections suggest that various governmental and private institutions have equipped a number of cognitive psychology labs with sophisticated computer hardware. The sample of articles in BRMI complements the funding data. They show that we have learned how to use that hardware in all phases of our research, from experimental design, to on-line control, to data analyses and model building.


capabilities of his machine and spends hundreds of hours designing programs that are far more elegant, sophisticated, and flexible than his research will ever require. In the process, these psychologists become newly converted programmers instead. There is a second way the psychologist's judgment all too often fails. Once set into action, the computer is quite capable of running at least twice as many experiments, twice as fast, as the psychologist could do alone. Thus, the computer misleads the researcher into defining productivity in terms of number of experiments run, rather than number of good ideas about important theoretical issues in cognitive psychology.

Advantages

Computers improve our research capabilities significantly over the more traditional laboratory configurations.

Precision and reliability of experiments. The use of computers has increased precision and decreased variability in several ways: (1) Equipment variability is reduced far below that for slide projectors, memory drums, and cam timers. The resolution of computer (crystal) clocks and the time to activate solid state switches or to execute an instruction are generally a few microseconds, and the reliability is one or more orders of magnitude better than this. (2) Because the experimenter and his differential social interactions can be removed, a fully automated laboratory allows reductions in between-subject variability. (3) Subject-paced trials, immediate feedback, and active on-line participation help raise the motivational level and consequently decrease within-subject variance. Further, within a session, a smooth-running automated system often permits a 50% larger data sample than does standard equipment. (4) Finally, variability due to experimenter errors, e.g., in setting knobs and recording data, is eliminated.

Ease and flexibility of doing research. Flexibility in experimental design occurs because event sequencing (e.g., feedback and stimulus ordering) and stimulus attributes (e.g., quantity, duration, intensity) can easily be varied under program control or can be made contingent on the subject's responses and latencies. This flexibility allows rapid pilot work to determine the optimal experimental conditions and permits easy parametric variation within and between studies.

Research capabilities not possible without the use of computers. New types of stimulus displays can now be generated, as illustrated by Julesz's stereograms and Cutting's synthetic speech. New types of experimental paradigms are possible, as illustrated by Suppes', and by our own, real-time interactive learning labs. We can now collect and analyze larger quantities of response data, and this quantitative difference is of such a magnitude that it has made a qualitative difference in the types of research designs used, as illustrated by Loftus' work with eye-movement data, and Aaronson and Scarborough's work with reading times. Finally, and most important, the computer as a research tool has enabled us to reveal some new psychological phenomena that were not previously obvious, as illustrated by Sperling's (1971) demonstration that people can scan information into memory at rates as high as 125 items/sec and by Julesz's demonstration that people are capable of experiencing stereoscopic depth perception prior to the existence of monocular pattern perception.

Our conclusion is that the impact of computers on cognitive psychology has been very rapid and very strong. Computers have had a significant impact on research in cognitive psychology: They have increased the ease and the flexibility with which we can do research; they have increased the precision and the reliability of our experiments; and they have made possible various experimental procedures that were previously impossible. A most important question, however, is whether or not computers will markedly increase the rate at which we observe new psychological phenomena, the rate at which we acquire new information about mental processes, and the rate at which we gain a theoretical understanding of how those cognitive processes are performed by the human being. Perhaps, by the year 2000, at the 31st Annual Meeting of our On-Line Computer Conference in Psychology, we will have a partial answer to this question.

REFERENCE NOTES

1. Birmingham, H. P., & Taylor, F. V. A human engineering approach to the design of man-operated continuous control systems. Naval Research Laboratory Report 4333, April 7, 1954.

2. Aaronson & Grupsmith, unpublished manuscript.

REFERENCES

AARONSON, D., & SCARBOROUGH, H. Performance theories for sentence coding: Some quantitative evidence. Journal of Experimental Psychology: Human Perception & Performance, 1976, 2, 56-70.

CHERNIKOFF, R., BIRMINGHAM, H. P., & TAYLOR, F. V. A comparison of pursuit and compensatory tracking under conditions of aiding and no-aiding. Journal of Experimental Psychology, 1955, 49, 55-59.

CUTTING, J. E. Aspects of phonological fusion. Journal of Experimental Psychology: Human Perception & Performance, 1975, 104, 105-120.

FEIGENBAUM, E. A., & FELDMAN, J. Computers and thought. New York: McGraw-Hill, 1963.

GREEN, B. F. Digital computers in research. New York: McGraw-Hill, 1963.

HOLLAND, J. G., & SKINNER, B. F. The analysis of behavior. New York: McGraw-Hill, 1961.

JULESZ, B. Foundations of cyclopean perception. Chicago: The University of Chicago Press, 1971.

LOFTUS, G., MATHEWS, P., BELL, S., & POLTROCK, S. General software for an on-line eye-movement recording system. Behavior Research Methods & Instrumentation, 1975, 7, 201-204.


MCCRACKEN, D. D. Digital computer programming. New York: Wiley, 1957.

MILLER, G. A., GALANTER, E., & PRIBRAM, K. H. Plans and the structure of behavior. New York: Holt, 1960.

NEWELL, A. The chess machine. Proceedings of the 1955 Western Joint Computer Conference, Session on Learning Machines, March 1955, 85-111.

NEWELL, A., SHAW, J. C., & SIMON, H. A. Elements of a theory of human problem solving. Psychological Review, 1958, 65, 151-166.

NEWELL, A., & SIMON, H. The logic theory machine. IRE Transactions on Information Theory, 1956, IT-2(3), 61-79.

NEWELL, A., & SIMON, H. GPS, a program that simulates human thought. In H. Billings (Ed.), Lernende Automaten (proceedings of a conference at Karlsruhe, Germany, April 1961). Munich: Oldenbourg, 1961, 109-124.

OETTINGER, A. G. Programming a digital computer to learn. Philosophical Magazine, 1952, 43, 1243-1263.

SPERLING, G. Extremely rapid visual search: The maximum rate of scanning letters for the presence of a numeral. Science, 1971, 174, 307-311.

SUPPES, P. The uses of computers in education. In Mathematical thinking in behavioral sciences. San Francisco: Freeman, 1966.

TURING, A. M. Computing machinery and intelligence. Mind, 1950, 59, 433-460.

UTTAL, W. R. Real-time computers. New York: Harper & Row, 1967.

NOTES

1. Only regular research proposals were included in our sample; programmatic proposals and small grants were excluded from our analyses.

2. Salary information was available in terms of positions or functions of grant personnel, but individual identifying information was deleted from the budget sheets available to us to protect the privacy of the investigators.

3. We are using the term "grantee" to denote the individual principal investigator responsible for the scientific research of an awarded grant. This differs from the technical terminology at NIMH, where "grantee" denotes the sponsoring institution, e.g., the university at which a funded investigator is employed.

4. The data in Panel C are pooled (summed) over all grantees, regardless of area within cognitive psychology. The data in Panel B are arithmetic averages over the four sub-areas in cognition, with each sub-area weighted equally (not weighted by sample size). Hence, the data in Panel C are not the simple product of Panels A and B (which would be the case if the Panel B data were weighted rather than arithmetic means).
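The distinction in Note 4 between a pooled (weighted) total and an equally weighted arithmetic mean can be made concrete with a small worked example. The grantee counts and mean grant sizes below are made-up illustrative numbers, not the paper's data, and only two sub-areas are used for brevity:

```python
# Hypothetical sub-area figures: (number of grantees, mean grant size in $K).
areas = {
    "perception_thinking": (10, 30.0),  # few grantees, larger mean grant
    "memory_learning":     (40, 20.0),  # many grantees, smaller mean grant
}

# Panel C style: pool (sum) the dollars over all grantees.
pooled_total = sum(n * mean for n, mean in areas.values())

# Panel B style: arithmetic mean of the sub-area means, each weighted equally.
arithmetic_mean_size = sum(mean for _, mean in areas.values()) / len(areas)
total_grantees = sum(n for n, _ in areas.values())

# Multiplying the grantee count by the equally weighted mean does NOT
# recover the pooled total: the larger sub-area dominates the pooled sum.
print(pooled_total)                            # 1100.0
print(total_grantees * arithmetic_mean_size)   # 1250.0
```

This is why the curves in Panel C cannot be read off as the product of Panels A and B.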