DOCUMENT RESUME
ED 266 318 CE 043 634
AUTHOR McKinney, Floyd L.; Kohan, Alan
TITLE A Design and Assessment of a Formative Evaluation of the Principles of Technology Curriculum Materials.
INSTITUTION Ohio State Univ., Columbus. National Center for Research in Vocational Education.
SPONS AGENCY Office of Vocational and Adult Education (ED), Washington, DC.
PUB DATE 86
CONTRACT 300-83-0016
NOTE 146p.
PUB TYPE Reports - Evaluative/Feasibility (142)
EDRS PRICE MF01/PC06 Plus Postage.
DESCRIPTORS Case Studies; Change Strategies; *Curriculum Development; Educational Research; Evaluation Criteria; *Evaluation Methods; Field Tests; *Formative Evaluation; *Instructional Material Evaluation; Instructional Materials; Interviews; Program Improvement; Questionnaires; *Research Design; Research Methodology; Secondary Education; *Technical Education; Technological Literacy
ABSTRACT

The National Center for Research in Vocational Education (NCRVE) was consulted to improve the formative evaluation design for the Principles of Technology (PT) curriculum materials developed by the Center for Occupational Research and Development (CORD) and the Agency for Instructional Technology (AIT). Because the NCRVE entered the formative evaluation effort at midpoint in its evaluation time line, the formative evaluation activities being conducted by the curriculum developers were reviewed and minor changes were suggested. It was recommended that (1) case studies be used to provide the curriculum developers with a greater depth of information about the setting in which local schools were attempting to implement PT, (2) future formative evaluation efforts should be designed concurrently with the development of the curriculum and the development of strategies for field testing it, (3) the purpose and nature of formative evaluation should be thoroughly explained to all concerned parties, and (4) extensive communication should be maintained between curriculum developers and users. (Appendixes to this report include a membership list of the content review team, the first-year evaluation instruments, the formative evaluation time line, the approach for the second-year formative evaluation that has been suggested by the NCRVE, the second-year instruments used by CORD and AIT, and an interview assessment guide.) (MN)
***********************************************************************
* Reproductions supplied by EDRS are the best that can be made *
* from the original document. *
***********************************************************************
THE NATIONAL CENTER MISSION STATEMENT
The National Center for Research in Vocational Education's mission is to increase the ability of diverse agencies, institutions, and organizations to solve educational problems relating to individual career planning, preparation, and progression. The National Center fulfills its mission by:
o Generating knowledge through research

o Developing educational programs and products

o Evaluating individual program needs and outcomes

o Providing information for national planning and policy

o Installing educational programs and products

o Operating information systems and services

o Conducting leadership development and training programs
For further information contact:
Program Information Office
National Center for Research in Vocational Education
The Ohio State University
1960 Kenny Road
Columbus, Ohio 43210-1090
Telephone: (614) 486-3655 or (800) 848-4815
Cable: CTVOCEDOSU/Columbus, Ohio
Telex: 8104821894
A DESIGN AND ASSESSMENT OF A
FORMATIVE EVALUATION OF THE
PRINCIPLES OF TECHNOLOGY
CURRICULUM MATERIALS
Floyd L. McKinney
Alan Kohan
The National Center for Research in Vocational Education
The Ohio State University
1960 Kenny RoadColumbus, Ohio 43210-1090
1986
FUNDING INFORMATION

Project Title: National Center for Research in Vocational Education, Evaluation Function

Contract Number: 300830016

Project Number: 0510050010

Act Under Which Funds Administered: Education Amendments of 1976, P.L. 94-482

Source of Contract: Office of Vocational and Adult Education, U.S. Department of Education, Washington, DC 20202

Contractor: The National Center for Research in Vocational Education, The Ohio State University, Columbus, Ohio 43210-1090

Executive Director: Robert E. Taylor

Disclaimer: This publication was prepared pursuant to a contract with the Office of Vocational and Adult Education, U.S. Department of Education. Contractors undertaking such projects under Government sponsorship are encouraged to express freely their judgment in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.

Discrimination Prohibited: Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving Federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving Federal financial assistance." Therefore, the National Center for Research in Vocational Education Project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.
TABLE OF CONTENTS
LIST OF TABLES AND FIGURES
FOREWORD
EXECUTIVE SUMMARY

CHAPTER 1. INTRODUCTION

Objectives
Principles of Technology
CHAPTER 2. FORMATIVE EVALUATION FOR CURRICULUM DEVELOPMENT

A Classification of Ideal Information Needs
Selected Alternative Approaches to Formative Evaluation
Summary

CHAPTER 3. FORMATIVE EVALUATION FOR PRINCIPLES OF TECHNOLOGY CURRICULUM DEVELOPMENT EFFORT

Principles of Technology Curriculum Overview
First-Year Formative Evaluation Effort
Essential Features of Principles of Technology Curriculum Effort
Constraints on Formative Evaluation Design Potential
Evaluation Standards
Assessment of Alternative Formative Evaluation Approaches
Selected Design Elements to Strengthen Current Evaluation Efforts

CHAPTER 4. ASSESSMENT OF IMPLEMENTATION OF ESSENTIAL EVALUATION DESIGN FEATURES

Assessment Design
Data Collection
Assessment Summary

CHAPTER 5. CONCLUSIONS AND RECOMMENDATIONS

Assessment Conclusions
Recommendations for Formative Evaluation of Curriculum Efforts
APPENDIX A. CONTENT REVIEW TEAM

APPENDIX B. FIRST-YEAR EVALUATION INSTRUMENTS

APPENDIX C. FORMATIVE EVALUATION TIME LINE

APPENDIX D. THE NATIONAL CENTER SUGGESTED APPROACH FOR THE SECOND-YEAR FORMATIVE EVALUATION

APPENDIX E. SECOND-YEAR INSTRUMENTS USED BY AIT/CORD

APPENDIX F. ASSESSMENT INTERVIEW GUIDE

REFERENCES
LIST OF TABLES AND FIGURES
Table
1 A CLASSIFICATION OF INFORMATION NEEDS IN FORMATIVE EVALUATION

2 PRINCIPLES OF TECHNOLOGY EVALUATION PLAN

3 ASSESSMENT OF INFORMATION NEEDS IN RELATION TO EVALUATION STANDARDS

4 ASSESSMENT OF ALTERNATIVE FORMATIVE EVALUATION APPROACHES AGAINST EVALUATION STANDARDS
Figure
1. Conceptual model of four-stage formative evaluation framework

2. Flowchart of four-stage formative evaluation

3. Principles of Technology instructional unit sequence

4. Schematic of a typical instructional unit teaching plan

5. Sources of information about the feasibility, utility, accuracy, and proprietary characteristics of the formative evaluation processes
FOREWORD
Rapidly accelerating technological advances in all phases of our lives, but especially in the workplace, have created an increasing need for vocational education students to understand fundamental principles about the work they may pursue. The development of curriculum materials for teachers and students is an essential step in the process of assuring that students acquire an understanding of technological principles.
The Center for Occupational Research and Development (CORD), Waco, Texas, and the Agency for Instructional Technology (AIT), Bloomington, Indiana, have been working with a consortium of State agencies for vocational education to develop curriculum materials dealing with the principles of technology as they apply to the mechanical, fluidal, thermal, and electrical energy systems of modern equipment. CORD and AIT, in conjunction with the participating States, designed formative evaluation processes to be used during the 2-year effort. The National Center for Research in Vocational Education became involved with the Principles of Technology effort during the last part of the first year of implementation of the curriculum materials. Given this entry point, the role of the National Center has been one of reviewing the already operationalized formative evaluation design and making suggestions for improving the formative evaluation.
The National Center expresses its appreciation to the CORD and AIT staff members who were most cooperative in this effort. Evaluation managers at AIT, Bill Johnston and Jim Shea, were especially helpful. Dr. Tim Wentling, Professor, University of Illinois, and Phillip Rollain, Project Coordinator, North Carolina Department of Public Instruction, served as panel members to review the National Center formative evaluation suggestions. The National Center is also appreciative of the reviews provided by Dr. John Washburn, Manager, Research and Development, Illinois Department of Adult Vocational-Technical Education.
The National Center is indebted to the staff who worked on this task. The study was conducted in the Evaluation and Policy Division, Dr. N. L. McCaslin, Associate Director. Dr. Floyd L. McKinney, Senior Research Specialist, served as project director, and Alan Kohan served as Graduate Research Associate.
Funding for the study was provided by the Office of Vocational and Adult Education, U.S. Department of Education.
Robert E. Taylor
Executive Director
The National Center for Research
in Vocational Education
EXECUTIVE SUMMARY
The focus of this task was to improve the formative evaluation design for the Principles of Technology (PT) curriculum materials. The Principles of Technology curriculum materials were designed to prepare secondary students for technical careers in the complex and rapidly changing technological field. The curriculum materials were developed by staff of the Center for Occupational Research and Development and the Agency for Instructional Technology working with a consortium of 35 State and Canadian vocational education agencies. The curriculum is comprised of 14 units, with each unit dealing with one principle of technology as it applies to the mechanical, fluidal, thermal, and electrical energy systems of modern equipment.
This task was a 12-month effort that began on January 16, 1985. Thus, National Center staff entered the Principles of Technology 2-year formative evaluation effort at approximately the midpoint of the PT evaluation time line. The formative evaluation design for units 1-7 was developed and implemented prior to the entry of the National Center in the formative evaluation effort. Given this set of circumstances, National Center staff reviewed the formative evaluation activities already being conducted by AIT/CORD and made suggestions for improving the evaluation effort. Minor changes were suggested in the formative evaluation instruments used by teachers and students. The major suggestion for improving the formative evaluation was the use of case studies. The case studies provided the curriculum developers with greater depth of information about the enormously complex setting in which local schools were attempting to implement PT.
The following recommendations are based on staff involvement with the design for the formative evaluation of the Principles of Technology curriculum effort, previous evaluation efforts, and reviews of relevant literature:
o The designing of a formative evaluation effort should be done concurrently with the development of the curriculum and the development of strategies for field testing the curriculum.
o The purpose and nature of formative evaluation should be thoroughly explained to all concerned parties so that expectations for results do not overburden or undercut the evaluation effort.
o Extensive use of the telephone or other means that permit two-way communication should be made by the curriculum developer/evaluator and the curriculum user (teachers, students). However, this should not be construed as an acceptable substitute for on-site, in-depth observations, interviews, and document or record reviews.
CHAPTER 1
INTRODUCTION
This report describes the involvement of the National Center for Research in Vocational Education with the Center for Occupational Research and Development (CORD), Waco, Texas, and the Agency for Instructional Technology (AIT), Bloomington, Indiana, concerning the formative evaluation of the Principles of Technology curriculum materials. In this chapter a brief overview of the Principles of Technology effort is presented so that the reader will have a better understanding of the National Center's involvement in the formative evaluation.
The Principles of Technology curriculum is designed to prepare students for technical careers in the complex and rapidly changing technological field. Based on the Unified Technical Concepts (UTC) course developed by CORD, the curriculum is comprised of 14 units, each addressing one principle of technology as it applies to the mechanical, fluidal, thermal, and electrical energy systems of modern equipment. Because the design and operation of modern equipment undergo constant change, technicians, if they are to adapt to the changing workplace, need to know the fundamental principles on which such equipment operates. Then, once technicians understand the principles on which their work is based, they can apply these principles to new work situations as the need develops.
The development of the Principles of Technology curriculum was a cooperative effort comprised of 35 State and provincial vocational education agencies in association with AIT and CORD. Inherent to the development process for the Principles of Technology curriculum was the assumption that this process is based on approximate and imperfect knowledge. Therefore, verification of the curriculum during its development process was essential to ensure that the final product was internally consistent and effective. However, internal and external constraints necessitated compromises between ideal evaluation procedures on one hand and feasibility on the other. The problem was to design a formative evaluation for the Principles of Technology curriculum that could be accomplished with a relatively small investment of time; provided specific revision feedback about the effectiveness of the overall learning system, including the printed materials, lab activities, learning kits, displays, and audiovisual materials; did not confuse or frustrate trainees during their coursework; and provided quick results.
Objectives
The National Center had two task objectives:
o To design a formative evaluation for the Principles of Technology curriculum materials
o To assess the implementation of the evaluation design
The target audience for this task was the AIT evaluation staff. The intended use of the formative evaluation suggestions generated by the task staff was for improvement of the formative evaluation of the PT curriculum materials.
Principles of Technology
History
The content of the Principles of Technology curriculum was based on the UTC instructional model developed about 7 years ago by CORD for use at the postsecondary, associate degree level. Perhaps the most interesting aspect of the UTC curriculum was the manner in which the key elements of technology--electrical, mechanical, fluidal, and thermal principles--were presented. Rather than being presented as isolated concepts, the key elements of technology in the UTC curriculum were presented together as analogous phenomena. Moreover, the UTC curriculum was oriented toward preparing students for high-technology occupations by providing a broad-based background in applied physics. Industrial applications of the UTC principles were emphasized throughout the curriculum.
The need for a similar system of instruction at the secondary level is being acknowledged by state departments of education. The increased requirements in math and science for graduation from high school have made it more difficult for high school students to complete a traditional vocational education program. Furthermore, the increasing emphasis on technological literacy and a somewhat decreasing emphasis on specific job-skill training at the secondary level have added legitimate pressures for changing the focus of secondary vocational education programs. By April 1983, the increased interest in having the UTC curriculum adapted for implementation at the secondary level led to the formation of a consortium of about 30 States to initiate development of such a curriculum. From an economic standpoint, the consortium, in association with CORD and AIT, was a feasible strategy for the initial development of the Principles of Technology curriculum. The prospectus for the project was issued in June 1983, and the
initial development work began in November 1983. Currently, the consortium membership is comprised of a total of 35 States and Canadian provinces and represents an investment of approximately $2.5 million for the development of the Principles of Technology curriculum.
Overview of Initial Formative Evaluation Process
In cooperation with CORD, AIT was responsible for the formative evaluation of the Principles of Technology curriculum. CORD was responsible for developing the unit objectives and printed materials. Video programs were developed by AIT. An outside review team, comprised of eight members (appendix A), reviewed all materials before pilot testing. Consortium agencies also received curriculum materials for review prior to the pilot tests.
An important part of the developmental process was a pilot test of each instructional unit in actual classroom settings. The primary purposes of the pilot test were to determine how well the materials were working and to identify specific problems with the materials. Each consortium agency was allotted two pilot-test sites. Test sites and teachers, however, were selected by the respective States and provinces. AIT had no control over the test site selection criteria.
Pilot-test evaluation materials were comprised of a pretest and a posttest, a student attitude questionnaire, and a detailed teacher questionnaire (appendix B). In brief, teachers administered the pretest prior to teaching the unit. While the unit was being taught, teachers recorded their reactions to the unit on the detailed teacher questionnaire. At the conclusion of the unit, teachers administered the posttest and student attitude questionnaire. The pretest, posttest, student questionnaire, and teacher questionnaire were scored and analyzed by AIT staff.
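The report does not document AIT's actual scoring procedures. As a purely illustrative sketch of the kind of pretest-posttest gain summary such scoring supports (the records, field names, and scores below are hypothetical, not taken from the actual instruments), the per-unit computation might look like this:

```python
# Illustrative only: a minimal pretest/posttest gain summary of the sort
# the pilot-test scoring could support. All data and field names are
# hypothetical, not drawn from the actual AIT instruments.

def gain_summary(records):
    """Return {unit: (mean pretest, mean posttest, mean gain)}."""
    by_unit = {}
    for r in records:
        by_unit.setdefault(r["unit"], []).append(r)
    summary = {}
    for unit, rows in by_unit.items():
        n = len(rows)
        mean_pre = sum(r["pretest"] for r in rows) / n
        mean_post = sum(r["posttest"] for r in rows) / n
        summary[unit] = (mean_pre, mean_post, mean_post - mean_pre)
    return summary

# Hypothetical scores: two students in unit 1, one in unit 2
students = [
    {"unit": 1, "pretest": 40, "posttest": 70},
    {"unit": 1, "pretest": 50, "posttest": 80},
    {"unit": 2, "pretest": 30, "posttest": 60},
]
print(gain_summary(students))
```

A per-unit summary of this kind parallels the unit-by-unit pilot-test cycle; student attitude questionnaire items could be averaged in the same grouped fashion.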
In May 1984, a 1-week tryout of the pilot-test procedures was conducted at a vocational school in Tulsa, Oklahoma. A preliminary version of the materials for the subunit "Force in Mechanical Systems" was used in the tryout. One teacher and nine students participated in the tryout. Observers were present during the 5 days of instruction. Overall, AIT staff concluded that the Oklahoma tryout was useful in identifying positive indications about the curriculum and evaluation procedures as well as in identifying some legitimate concerns.
In June 1984, a Principles of Technology teacher orientation meeting was held in Dallas, Texas. Approximately 60 teachers participated. The primary purpose of the workshop was to orient
pilot-test teachers to the Principles of Technology curriculum and its instructional plan and to explain formative evaluation data collection activities and procedures for the upcoming pilot-test year.
Instructional units 1-7 were pilot-tested from September 1984 to June 1985. Of the original 60 pilot-test teachers, about 30 teachers completed all of the evaluation forms during the first pilot-test year. Although there was a variety of reasons why test site schools did not fully participate, the overall reason appeared to be a problem in obtaining the necessary instructional equipment and/or school funds to purchase program start-up equipment. Nevertheless, approximately 400-600 students participated in the first-year pilot test.
An overview of the project time line for the curriculum material production and development is presented in appendix C. Instructional units 1-7 were released in their final form during the period July-November 1985. The pilot-testing of units 8-14 began in September 1985, and they were scheduled for final release in September 1986.
CHAPTER 2
FORMATIVE EVALUATION FOR CURRICULUM DEVELOPMENT
Two essential activities in considering suggestions for improving the formative evaluation of the Principles of Technology curriculum materials were the following: (1) identifying relevant information needs and (2) reviewing alternative approaches that might have desirable features for consideration. The following sections describe task efforts for these two activities.
A Classification of Ideal Information Needs
Sanders and Cunningham (1973) categorized three major sources of information as being essential for the formative evaluation of educational materials. (See table 1.) Although the categories were not mutually exclusive or exhaustive, Sanders and Cunningham noted that the important point was not the classification scheme in and of itself, but that no source of information should be overlooked, given project resources and constraints.
Selected Alternative Approaches to Formative Evaluation
According to Dick and Carey (1978),
formative evaluation is the process used to obtain data for instructors to use to increase the efficiency and effectiveness of their instructional materials. The emphasis in formative evaluation is on the collection of data in order to revise the individual materials, to make the materials as effective as possible. (p. 159)
Dick and Carey noted that when the instructional materials are in their final form, "other people may collect data to determine whether the materials should be used in a particular setting or whether they are effective as claimed" (p. 159).
Sarapin (1981) noted that the foundation for formative evaluation for curriculum development and improvement was based upon evaluation methodology reported by Scriven (1967), Stake (1967), Stufflebeam (1967, 1969), Stufflebeam et al. (1971), and Provus (1969). Several scholars (Abedor 1971; Sanders and Cunningham 1973; Baker and Alkin 1973; Dick and Carey 1978, 1985; and Sarapin 1981) have conceptualized formative evaluation models and procedures. From these conceptualizations, various approaches to formative evaluation have emerged. Brief descriptions of the approaches that were identified as being pertinent to the Principles of Technology project are highlighted in the following subsections.
TABLE 1

A CLASSIFICATION OF INFORMATION NEEDS IN FORMATIVE EVALUATION

I. Internal information

   A. Descriptive information

      1. Physical specification
      2. Rationale, goals, and objectives
      3. Content
      4. Other

   B. Critical appraisal

      1. Author (developer)
      2. Experts (subject matter, media, psychologists, and so forth)
      3. Students using the materials
      4. Teachers using the materials
      5. Relevant others

II. External information

   A. Assessment of the effects of the materials on student behavior

      1. Achievement
      2. Attitude
      3. Skill
      4. Interest
      5. Commitment
      6. Other

   B. Assessment of the effects of the materials on teacher behavior

      1. Attitude
      2. Interest
      3. Commitment
      4. Competency
      5. Teaching strategy
      6. Other

   C. Assessment of the effects of the materials on the behavior of relevant others

      1. Parents
      2. Administrators
      3. Teachers not using the materials
      4. Students not using the materials
      5. The community
      6. Other

III. Contextual information

   A. Student characteristics
   B. Teacher characteristics
   C. School characteristics
   D. Community characteristics
   E. Curricular characteristics
   F. Other relevant elements in the learning environment

SOURCE: Sanders and Cunningham (1973, p. 218).
Clinical Approach (One-to-One)
Lowe, Thurston, and Brown (1983) described a one-to-one evaluation process they used in a curriculum development project for the Kingdom of Saudi Arabia. Because of their project's needs and constraints, they selected a learner verification approach as the most appropriate. They found that the clinical procedure was efficient, cost-effective, and reliable.
The clinical approach to learner verification involved working with one student at a time. Each student, as he or she worked through a complete learning sequence, was closely observed by an experienced evaluator and subject-matter expert. When a student encountered difficulty, the evaluator questioned the student until the source of difficulty was identified. The subject-matter expert assisted in equipment setup and was responsible for student assessment on the learning objectives. Incorrect answers to test items were discussed with the student so that a determination could be made as to why the student chose the incorrect answer.
The authors noted that at the end of the evaluation session, the evaluator debriefed the student by asking a series of questions about the material and its effectiveness. Once the student responses and comments of the evaluator and subject-matter experts were summarized, revisions were made by the development team. A second one-to-one evaluation session was conducted. The authors noted that additional evaluation sessions were not found to be cost-effective or time-effective.
Small-Group Trial
In the formative evaluation sequence proposed by Dick and Carey (1978), small-group trials follow one-to-one evaluations. After the materials have been revised on the basis of the one-to-one evaluation, the typical procedure is to select 10-20 students to participate in a small-group evaluation of the revised materials. Usually, the materials are administered in the manner in which they are expected to be used when they are in their final form. Pretests are administered before instruction begins. After the students complete the materials, they take a posttest. According to Dick and Carey, the instructor does not intervene during the instructional process unless equipment or instructional problems prevent the student(s) from advancing through the materials. The problems encountered by the students and how the problems were solved should be noted in the evaluation data. Dick and Carey noted that student attitude questionnaires and discussions with students are appropriate during the small-group trial.
Recent research found that it might be possible to eliminate either the one-to-one or small-group stage in the formative evaluation sequence proposed by Dick and Carey (1978) and still maintain effective revision of the instructional materials (Randaswamy et al. 1976; Carey 1978; Wager 1983). Although these studies had limitations, their results indicated that formative evaluators might save time and financial resources by eliminating either the small-group step or the one-to-one step from their evaluation plan without sacrificing effectiveness.
Local Trial and Field Test
Bastion et al. (1983) described a formative evaluation approach that combined the features of the one-to-one and small-group trials into one evaluation stage that they referred to as local trial. The second stage of their evaluation approach reflected those procedures of a field test described by Dick and Carey (1978).
During the local trial, a draft of the curriculum materials is submitted to a subject-matter expert who is not a member of the curriculum development team. The subject-matter expert reviews the materials for content and recommends revisions. In addition, a small-group trial is conducted with typical learners. Students are pretested, posttested, given an attitude questionnaire, and encouraged to write comments directly on the materials.
During the field-test stage, the curriculum materials are tested in a variety of settings under conditions similar to regular course instruction. School-site staff and teachers administer the curriculum materials, pretests and posttests, and curriculum evaluation questionnaires. The curriculum development team conducts interviews with school-site administrators, teachers, and students to obtain additional formative evaluation information.
Four-Stage Formative Evaluation
Perhaps the most ideal formative evaluation approach for facilitating instructional material development, assessment, and revision is the approach described by Sanders and Cunningham (1973) and Sarapin (1981). Sarapin developed a symbolic analogue of the framework proposed by Sanders and Cunningham to simplify the essential elements of a comprehensive formative evaluation effort. This symbolic analogue, or conceptual model, is presented in figure 1.
[Figure 1 depicts four stages--(1.0) identify and order goals, (2.0) identify and operationalize objectives, (3.0) develop interim materials, and (4.0) field-test products--each paired with a corresponding evaluation component (evaluation of goals, of objectives, of interim materials, and of products) and linked by judgments, revisions, and feedback.]

SOURCE: Sarapin, M., 1981, p. 6.

Figure 1. Conceptual model of four-stage formative evaluation framework
At the procedural level, Sarapin developed a formative evaluation flowchart, shown in figure 2, that made the conceptual model operational. The flowchart, according to Sarapin, is a heuristic device that should be adapted as needed for each curriculum formative evaluation effort. A detailed description of each of the four stages of this model was provided by Sarapin (1981).
[Figure 2 presents a flowchart of the four-stage formative evaluation sequence, tracing each stage--identify and order goals, identify and operationalize objectives, develop interim materials, and field-test products--through its decision points (development needed, priorities changed, objectives changed, major revisions needed), prototype development, tryouts, and revisions, with priorities, criteria, interim materials, and final products emerging from the corresponding evaluation activities.]

SOURCE: Sarapin, M., 1981, p. 7.

Figure 2. Flowchart of four-stage formative evaluation sequence

Instructional Quality Inventory

Montague et al. (1983) noted that the Instructional Quality Inventory (IQI) is a formative evaluation process that was designed initially for military instructional development. The IQI is an internal review process that "can be applied to any systematically developed program of instruction that has goals and test items and instruction tied to the goals" (Montague et al.
1983, p. 11). The IQI process, in essence, checks to see if test items are consistent with the instructional objectives. Once the objectives and test items are consistent with each other, the next step is to ensure that the instructional materials are consistent with the objectives and test items.
Montague et al. (1983) presented an overview of the IQI procedures. Objectives and test items were classified by a task-content matrix. The IQI consisted of five procedures that judged the adequacy of objectives, test items, and instructional materials. The IQI has been found to be a cost-effective process compared with other formative evaluation methods. The authors noted, however, "that the IQI does not address either adapting instruction to students or the sequencing or structuring of content" (p. 13).
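The consistency check at the core of the IQI can be sketched in code. This is a hypothetical simplification of the five IQI procedures, not the actual inventory; the function name and the task-content classifications below are invented for illustration:

```python
# Illustrative only: a simplified IQI-style consistency check. Each
# objective, its test item, and its instructional materials are
# classified by a (task, content) cell of the task-content matrix, and
# mismatches are flagged for revision. Classifications are invented.

def find_inconsistencies(objectives, test_items, materials):
    """Return (objective id, component, cell) for each component whose
    task-content cell differs from its objective's cell."""
    problems = []
    for obj_id, obj_cell in objectives.items():
        if test_items.get(obj_id) != obj_cell:
            problems.append((obj_id, "test item", test_items.get(obj_id)))
        if materials.get(obj_id) != obj_cell:
            problems.append((obj_id, "materials", materials.get(obj_id)))
    return problems

objectives = {"1.1": ("apply", "principle"), "1.2": ("recall", "fact")}
test_items = {"1.1": ("apply", "principle"), "1.2": ("apply", "fact")}
materials = {"1.1": ("apply", "principle"), "1.2": ("recall", "fact")}
print(find_inconsistencies(objectives, test_items, materials))
# flags objective 1.2, whose test item exercises a different task level
```

The two-pass order mirrors the process described above: objectives and test items are reconciled first, then the instructional materials are checked against both.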
Responsive Evaluation
Stake (1975) has described the process of conducting a responsive evaluation.
To do a responsive evaluation, the evaluator conceives of a plan of observations and negotiations. He arranges for various persons to observe the program, and with their help prepares brief narratives, portrayals, product displays, graphs, etc. He finds out what is of value to his audiences, and gathers expressions of worth from various individuals whose points of view differ. Of course, he checks the quality of his records: he gets program personnel to react to the accuracy of his portrayals; and audience members to react to the relevance of his findings. He does most of this informally--iterating and keeping a record of action and reaction. He chooses media accessible to his audiences to increase the likelihood and fidelity of communication. He might prepare a final written report, he might not--depending on what he and his clients have agreed on. (p. 14)
Responsive evaluation makes extensive use of interviews (sometimes called conversations), observations, document reviews, and record reviews in and around a given program. The concerns and issues are gathered from the persons, records, and documents involved. In responsive evaluation, the evaluator determines how best to collect data. The human instrument is most frequently used, but other means are also employed to collect needed data. Case studies are frequently used in responsive evaluation situations because of the concern for a specific "case" and because they readily lend themselves to data collection via interviews, observations, document review, and record review.
Summary
A classification of ideal information needs for formative evaluation would show that no information should be overlooked, given consideration of project resources and constraints. The emphasis of formative evaluation was described as the process of collecting data on instructional materials for the purpose of revising the materials to make them as effective as possible. The foundation for formative evaluation for curriculum development and improvement was the evaluation methodology reported by numerous scholars. Although numerous formative evaluation approaches have been developed, the following six approaches were identified as being pertinent to the Principles of Technology formative evaluation effort: (1) clinical approach, (2) small-group trial, (3) local trial and field test, (4) four-stage formative, (5) instructional quality inventory, and (6) responsive evaluation. A brief description of each approach was provided. An assessment of these six approaches, project constraints, essential features of the curriculum, and evaluation standards is included in the following chapter.
CHAPTER 3
FORMATIVE EVALUATION FOR PRINCIPLES OF
TECHNOLOGY CURRICULUM DEVELOPMENT EFFORT
Principles of Technology Curriculum Overview
Printed Teaching Materials
The Principles of Technology student text is systematically divided into 14 units. Each unit covers one technical concept and contains an overview of the unit, subunits, and a summary. Each subunit deals with the unit's major technical concept as it applies in one of the four energy systems, and each subunit has lecture materials, a suggested demonstration to use with the lecture portion, a math skills lab, and two hands-on physics application labs. A summary of the important concepts concludes the unit. A glossary of key words and activities for improving math skills are included in the unit's appendix. Finally, an end-of-unit exercise is provided to review the main ideas and definitions presented in the units.
The target audience for units 1-7 and units 8-14 are 11th and 12th graders, respectively. Typically, the 14 units are presented over 2 years and in the sequence delineated in figure 3. Furthermore, 10th graders and students who are in the academic tracks may
[Figure 3 presents the suggested unit sequences. The first-year units, in suggested order, are Force, Work, Rate, Resistance, Energy, Power, and Force Transformers. The second-year units are Momentum, Waves and Vibrations, Energy Convertors, Transducers, Radiation, Optical Systems, and Time Constraints; an optional second-year sequence is also shown. In the second year, Momentum must be taught first; the order of instruction is then flexible, with some exceptions: Radiation must precede Optical Systems, Waves and Vibrations must precede Radiation, and Energy Convertors must precede Transducers.]

SOURCE: Center for Occupational Research and Development, 1985, p. A-3

Figure 3. Principles of technology instructional unit sequence
find the curriculum useful and beneficial. However, students enrolled in Principles of Technology should be able to read at least at the 8th-grade level and have satisfactorily completed 1 year of high school general mathematics. Previous or concurrent enrollment in algebra is desirable.
The teaching plan for a typical unit is depicted in figure 4. Each rectangle represents 50 minutes of instruction. Most units require 26 class sessions. The first class for each unit is an introduction and overview of the unit's content. The first two sessions of each subunit include the video presentation and lecture and discussion. The third session is a math skills lab. The next two sessions are hands-on physics application labs. The last class session of the subunit is a review of the material. The last class of the unit is a unit summary and test.
[Figure 4 depicts the teaching plan as a grid. Class 1 is the unit overview. Four six-class subunits follow (classes 2-7, 8-13, 14-19, and 20-25), covering the mechanical, fluid, electrical, and thermal systems, respectively; each subunit consists of two video/lecture classes, a math skills lab, two application labs, and a review. The final class is the unit summary.]

Figure 4. Schematic of a typical instructional unit teaching plan
For each instructional unit there is a teacher's guide, which provides suggestions for teaching each class session. The guide is not a set of rigid rules or a substitute for teacher ingenuity, but it is a useful tool for implementing the Principles of Technology course.
Video
There are about 78 video programs, totaling approximately 500 minutes, infused throughout units 1-14. The video segments introduce ideas presented in the text. Through the video programs, students are taken to the workplace settings where technicians are employed. The video segments were designed as a tool for putting variety into the classroom teaching pattern.
Equipment and Facilities
A standard-size classroom or laboratory space sufficient to accommodate one to five laboratory stations per class is needed to implement the Principles of Technology units. Depending upon class size and availability of equipment, lab space must be flexible to accommodate a variety of lab station activities. A videocassette player (1/2" Beta, 1/2" VHS, or 3/4" U-matic) and a color monitor are needed for the video segments. The cost of equipping each lab station for the first year is approximately $4,000-$6,000. Demonstration and laboratory equipment may be locally available, may need to be fabricated, or may need to be ordered from commercial vendors. Laboratory and equipment specifications are delineated in the teacher's guide.
First-Year Formative Evaluation Effort
The National Center study, Design and Assessment of the Formative Evaluation of Principles of Technology Curriculum Materials, had a 12-month contract that commenced on January 16, 1985. Thus, National Center project staff entered the Principles of Technology 2-year formative evaluation effort at approximately the midpoint of the AIT evaluation time line. In fact, the evaluation design for units 1-7 was developed and implemented prior to the entry of the National Center into the formative evaluation effort.
The first-year evaluation plan, as developed by AIT, is depicted in table 2. The entire data collection instrument package used by AIT is presented in appendix B.
Descriptive statistics were used to analyze the data from the student and teacher questionnaires. Responses from open-ended questions were summarized and evaluated by AIT project staff as to their merit for improving the curriculum materials. Student pretest and posttest scores were subjected to analysis of covariance that controlled for the effects of student sex and grade level, teacher background, and teaching pattern.
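The covariance analysis described above can be sketched in outline. The following is a minimal illustration, not the project's actual analysis: all data are simulated, and the covariates are reduced to simple 0/1 dummies.

```python
# Hedged sketch: an analysis of covariance on posttest scores, controlling
# for pretest score plus categorical covariates, in the spirit of the
# analysis described above. Data and coding choices are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 120
pretest = rng.normal(50, 10, n)
sex = rng.integers(0, 2, n)          # 0/1 dummy for student sex
grade = rng.integers(0, 2, n)        # 0 = 11th grade, 1 = 12th grade
teacher_bg = rng.integers(0, 2, n)   # 0/1 dummy for teacher background
posttest = pretest + 15 + rng.normal(0, 5, n)  # simulated ~15-point gain

# Design matrix: intercept, pretest covariate, and the dummies
X = np.column_stack([np.ones(n), pretest, sex, grade, teacher_bg])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

print(round(beta[1], 2))  # slope on pretest, near 1 for this simulation
print(round(beta[0], 2))  # adjusted intercept, near the simulated gain
```

Fitting the covariates this way yields posttest means adjusted for pretest differences, which is what lets the developers attribute remaining differences to the unit rather than to incoming ability.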
Essential Features of
Principles of Technology Curriculum Effort
One of the initial steps in the design of the formative evaluation by National Center staff was the identification of those features considered essential and unique to the Principles of Technology curriculum. The essential features, together with the likely constraints and the evaluation standards established by the Joint Committee on Standards for Educational Evaluation (utility, feasibility, propriety, and accuracy), served to guide the task staff in analyzing alternative formative evaluation designs. Selecting suggestions for strengthening the formative evaluation by National Center staff was essentially a critical judgmental process involving trade-offs among the essential features and evaluation standards. The essential features of the Principles of Technology curriculum effort, identified by National Center staff, are discussed in the following sections.
TABLE 2

PRINCIPLES OF TECHNOLOGY EVALUATION PLAN

Materials: Video/student handbook/labs
Data source: Students
Assessment focus: Overall instructional impact
Instrument: Pre/posttest; questionnaire
When: 1 subunit for each unit and 1 unit

Materials: All
Data source: Teachers
Assessment focus: Manageability of components, clarity of materials, appropriateness for target audience, perceived educational worth
Instrument: Questionnaire and structured log
When: At conclusion of each unit

Materials: All
Data source: Consortium/consortium review team
Assessment focus: Clarity, appropriateness, materials link
Instrument: None for review team; questionnaires for full consortium
When: As each unit is developed
Instructional Materials
The comprehensive student text, video, and teacher's guide are essential for the successful implementation of the Principles of Technology curriculum. Therefore, these materials form a total instructional package that needs to be included in the formative evaluation. For example, do the video and printed materials correlate in terms of textual and graphic illustration? Is the terminology consistent throughout the instructional materials?
Equipment and Facilities
The application laboratories are essential for successful implementation of the curriculum. Lab space must be flexible to accommodate a variety of lab station activities. A videocassette player and a color monitor are required. Specifically, was consideration given to equipment cost, utility, adaptability, and availability? Are laboratory and equipment setup specifications provided? If special equipment needs to be fabricated, are design specifications adequately described?
Teacher Background
Teachers from the existing faculty who have an interest in vocational students should teach Principles of Technology. These teachers should be familiar with, or willing and able to become familiar with, the physics and mathematics in the course. Team teaching may be required if qualified teachers are not available.
Student Background
The Principles of Technology curriculum is designed for 11th- and 12th-grade students who are interested in technical careers. Tenth-grade and nonvocational students also may want to enroll, and they should be accommodated. Students should have at least an eighth-grade reading level and should have had at least 1 year of general mathematics. Previous or concurrent enrollment in algebra would be helpful.
Although the following features are not unique to the curriculum, they are essential for successful implementation.
School Counselor Cooperation
School counselors should be knowledgeable of the purpose, content, and prerequisites of the Principles of Technology curriculum so that proper counseling can be given to potential students and to currently enrolled students.
School Administrator Support
School-site administrative support and leadership are essential to the successful implementation of Principles of Technology. School-site administrators play a pivotal role in eliciting the support of the school board, the community, and the school's faculty and staff.
Community Support
Parents, business and industry, and concerned citizens should be knowledgeable of the purpose and general content of the Principles of Technology curriculum so that meaningful community cooperation can be elicited in the implementation effort.
School Characteristics
Individual school culture and climate would affect implementation of the Principles of Technology curriculum, as would the size of the school, students' and teachers' backgrounds and their relationships with each other, their attitudes toward new curriculum and toward schooling in general, school-site leadership, and the socioeconomic status of the student population.
Information Activities
A well-planned public relations effort is essential for eliciting school and community support for the Principles of Technology curriculum. Information activities are needed at the National, State, and local levels to inform potential students, counselors, teachers, school administrators, and community and business leaders about the new curriculum.
Constraints on Formative Evaluation Effort

Numerous constraints on the Principles of Technology formative evaluation effort were identified by National Center and AIT staff. Although these constraints did not make the evaluation impossible, they did alert the evaluators to potential problems and limitations that needed to be addressed in the formative evaluation design. The following sections highlight the more serious constraints on the Principles of Technology formative evaluation.
Insufficient Resources
Two obvious and uncontrollable factors that limited the Principles of Technology formative evaluation effort were time and financial resource constraints. Given pilot-test time lines of September 1984 through June 1986 for units 1-7 and September 1985 through June 1986 for units 8-14, time became a severe restriction on the formative evaluation effort. Moreover, National Center staff were limited by time in what they could propose as a useful and feasible formative evaluation supplemental design, and their project budget constraints precluded them from conducting evaluation activities that they proposed. Likewise, the AIT project staff members were limited by their budget in the number and type of additional evaluation activities they could undertake. In summation, given time and budget constraints, the development of an effective, efficient, and feasible formative evaluation design that would enhance current Principles of Technology evaluation activities presented a challenge to all involved.
Data Collection Limitations
The rights and welfare of human subjects participating in the formative evaluation needed to be respected and protected. Moral, ethical, and legal codes needed to be followed. To ensure that information was not obtained illegally or unethically, data collection efforts had to conform to the rights of human subjects. This reality imposed limitations on "what" and "how" information was collected.
External Constraints
A number of persistent external constraints affected the project.
Equipment problems. Test site variability in the success of obtaining correct laboratory equipment and proper equipment setup was an uncontrollable factor that affected test site evaluation results.
Lack of demographic/contextual data. Student and teacher background data may not be representative of the target population. Information about school climate and culture and about students' and teachers' perceptions of the curriculum and classroom climate was lacking. The lack of demographic/contextual data imposed restrictions on interpretation of evaluation results.
Variability in teaching pattern. Variability in length of classroom instruction, number of classes per unit, number and type of homework assignments, teacher enthusiasm, and out-of-class assistance constituted uncontrollable factors that affected evaluation results.
First-Year Evaluation Design
In essence, the first-year evaluation design was a field test that utilized a one-group pretest-posttest design as a means of measuring criterion performance. Threats to the internal validity of this design were numerous. No control groups were utilized; therefore, the following threats to validity imposed serious constraints on the interpretation of posttest results.
History. For example, during the time of a particular unit of instruction, did students receive any outside learning assistance from relatives, tutors, TV, friends, and so forth? If so, it would be difficult to interpret posttest scores accurately. Did the instructional unit account for the difference between pre- and posttest scores, or did the outside help account for the difference?
Testing. Since the pre- and posttest instruments were identical, there may be a practice effect or pretest sensitization. However, this threat was addressed by minimizing memory effects.
Instrumentation. Perhaps the most serious threat to the validity of the results pertained to the validity and reliability of the pre- and posttests. Interpretation of the posttest results must consider the lack of validity and reliability information for the measurement instruments.
Regression. This is a subtle phenomenon that needs to be considered when interpreting posttest results. Regression suggests that students who scored low on the pretest will tend to regress toward the mean on the posttest. Conversely, students who scored high on the pretest will tend to regress toward the mean on the posttest.
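The regression effect is easy to demonstrate by simulation. The sketch below assumes each observed score is a stable ability plus independent measurement error; all numbers are illustrative rather than drawn from the pilot test.

```python
# Sketch: regression toward the mean, assuming each observed score is a
# stable "true" ability plus independent measurement error.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
ability = rng.normal(70, 10, n)          # stable component
pre = ability + rng.normal(0, 8, n)      # pretest = ability + error
post = ability + rng.normal(0, 8, n)     # posttest = ability + new error

low = pre < np.percentile(pre, 20)       # bottom 20% on the pretest
high = pre > np.percentile(pre, 80)      # top 20% on the pretest

# Low scorers move up toward the mean; high scorers move down --
# even though no instruction occurred between the two tests.
print(post[low].mean() - pre[low].mean())    # positive "gain"
print(post[high].mean() - pre[high].mean())  # negative "gain"
```

The apparent gain for low pretest scorers arises purely from measurement error, which is why unit gains for extreme scorers must be interpreted cautiously.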
Selection. Neither students nor classrooms were randomly selected. Test sites appeared to have been volunteered by respective State staff members. In addition to threats to internal validity, there were external validity threats to the one-group pretest-posttest design.
Hawthorne effect. Since the students knew that they were participants in an evaluation, they may have tried harder than they normally would have to learn the material. Thus, their posttest scores may have increased as a result of this special attention.
Novelty effect. The newness of the curriculum may excite student interest. This effect is similar to the Hawthorne effect.
No control group. The lack of a usable comparison group restricted interpretation of posttest results. Perhaps the same students in a technical physics class taught by a physics teacher would obtain the same or better results on the posttest.
Lack of random assignment. Neither students nor teachers were randomly assigned to treatments. The generalization of results to students and teachers who did not participate in the evaluation is not valid.
Although the aforementioned threats to internal and external validity pertain to behavioral research in general and not to formative evaluation pilot testing per se, these threats should be considered when interpreting unit posttest scores. In addition, it is impossible or impractical to deal with many of these problems in the educational setting.
Evaluation Standards
The evaluation standards established by the Joint Committee on Standards for Educational Evaluation (1981) served to guide National Center task staff in analyzing alternative formative evaluation designs. The Joint Committee identified four important attributes of an evaluation: utility, feasibility, propriety, and accuracy, and stated, "The Committee is satisfied that standards which shape an evaluation so that it has these four characteristics are necessary and sufficient for sound evaluation in education" (p. 13). The four standards, as they applied to formative evaluation, are described in the following subsections.
Utility
Information scope and selection. The information collected by the formative evaluation should be sufficient to support a judgment of worth and merit and should answer all relevant formative evaluation questions.
Clarity of information. The information collected should respond clearly and without generalities to the formative evaluation objectives, provide a firm foundation for formative evaluation conclusions and recommendations, and be characterized by conciseness and logical development so that it communicates to the formative evaluation audiences.
Timeliness. The formative evaluation should provide information to the curriculum developers at the time when the information can best be used. The most critical information needs of the curriculum developer must be met on time to avoid delays in curriculum revision decisions.
Feasibility
Practical procedures. The formative evaluation procedures suggested by National Center staff should be realistic and practical, given the time and financial constraints inherent in the formative evaluation project. School and classroom disruptions should be kept to a minimum.
Political viability. The formative evaluation procedures should be politically viable given the various interest groups and stakeholders in the Principles of Technology curriculum project. In essence, the formative evaluation procedures are politically viable if their purpose is achieved despite pressures created by the formal and informal organizational power structures at the State and local levels.
Cost-effectiveness. The evaluation has to produce information of sufficient value to justify its expense. The question of whether there are alternative evaluation approaches that would produce more useful information at the same or less cost needs to be addressed. The information needed should be obtained as economically as possible.
Propriety
Rights of human subjects. The formative evaluation should protect the rights and welfare of human subjects. Legal, ethical, and common sense considerations should be followed.
Conflicts of interest. The formative evaluation should take into consideration the fact that conflicts of interest may arise during the formative evaluation effort. Strategies should be developed for dealing with conflicts so that the evaluation process and results will not be compromised.
Balanced reporting. The formative evaluation should be complete and fair in its presentation of strengths and weaknesses, and both negative and positive aspects of the curriculum evaluation should be reported. Strengths and weaknesses should not be manipulated to please partisan interest groups. All evaluation findings, even those that might prove embarrassing to partisan interest groups, should be included in pilot-test findings.
Accuracy
Clearly identified object. The object of the formative evaluation should be clearly identified and described realistically. Unique features of the object should be identified. In summation, the descriptions and unique features of the object should be a valid characterization of the object.
Appropriate information sources. The formative evaluation must provide adequate information to the curriculum developers. Multiple information sources should be utilized. Information sources should be tapped using a variety of methods, such as interviews, surveys, observations, and document reviews. In essence, the information sources should answer the evaluative questions.
Adequacy of contextual information. Contextual information should be used in the interpretation of formative evaluation results. For example, the curriculum developers should know whether a particular unit's success or failure was influenced by student socioeconomic status, teacher background, student academic background, school climate, teacher and administrator support or resistance toward the curriculum, and/or community support or apathy toward the curriculum.
Purposes and procedures explicated. Purposes and procedures of the formative evaluation must be understood by those involved in the formative evaluation effort. Specifically, the objectives of the evaluation need to be clear. Procedures on how information was collected, organized, analyzed, and reported should be described in sufficient detail so that other evaluators could conduct a similar evaluation in other settings.
Assessment of Alternative Formative Evaluation Approaches

The essential features of the Principles of Technology curriculum, together with the constraints and the evaluation standards established by the Joint Committee, served as a guide for assessing the formative evaluation approaches identified by National Center staff. In essence, the assessment was a critical judgment process that involved trade-offs among the essential features, information needs, constraints, and evaluation standards. Suggestions for strengthening the first-year Principles of Technology formative evaluation effort emerged from this process.
The results of the assessment of information needs in relation to evaluation standards are delineated in table 3. The information needs specified in table 3 are identical to those listed in table 1. Ideally, the Principles of Technology formative evaluation design should satisfy the information needs listed. However, project constraints limited the number and type of data collection activities that could be conducted.
TABLE 3

ASSESSMENT OF INFORMATION NEEDS IN RELATION TO
EVALUATION STANDARDS

(Ratings are given in the order: Utility, Feasibility, Propriety, Accuracy.)

I.   Internal Information
     A. Descriptive (physical specifications; rationale, goals,
        objectives; content) ................................. Y  Y  Y  Y
     B. Critical Appraisal (developer; subject matter experts;
        students using the materials; teachers using the
        materials) ........................................... Y  Y  Y  Y

II.  External Information
     A. Interim Effects of the Materials on Student Behavior
        (achievement, attitude, skill, interest, commitment) . Y  P  Y  Y
     B. Interim Effects of the Materials on Teacher Behavior
        (attitude, interest, commitment, competency, the
        community) ........................................... Y  P  Y  Y

III. Contextual Information (student, teacher, school,
     community, and curricular characteristics) .............. Y  P  Y  Y

KEY: Y = Yes, satisfactorily meets the criteria
     P = Partially meets the criteria
     N = No, does not meet the criteria
Specifically, it was judged infeasible to investigate the interim effects of the curriculum materials on students and teachers not using the curriculum. Project constraints also prevented an in-depth collection of contextual information about student, teacher, school, and community characteristics. Project staff judged information collection activities concerning the interim effects of the curriculum materials on student and teacher behavior to be feasible. Specifically, the measurement of student and teacher attitudes toward the curriculum as well as the measurement of student achievement appeared to be the most feasible data collection activities on interim effects. Internal information needs were, for the most part, being met by existing Principles of Technology formative evaluation activities.
The second step in the critical judgment process was to assess the alternative formative evaluation approaches against the evaluation standards criteria. The information needs that were judged as meeting the evaluation standards in the first assessment were included under the utility standards of the second assessment. The four standards were then crossed with the alternative evaluation approaches. National Center staff assessed these alternative approaches by assigning a numerical value to each criterion. The results of this assessment are presented in table 4.
The results of the alternative evaluation approach assessment revealed that no single approach emerged as the optimal method for the formative evaluation of the Principles of Technology curriculum. It was decided that an eclectic approach would most likely incorporate appropriate features of the approaches that best met the evaluation criteria.
As shown in table 4, two approaches were judged to meet the evaluation criteria best: (1) local trial and field test and (2) responsive evaluation. The local trial and field test approach was judged more feasible than responsive evaluation. Conversely, responsive evaluation was judged more useful than the local trial and field test approach. The logical conclusion was to combine those features of both approaches that best fit the needs of the Principles of Technology formative evaluation effort. Consequently, the last step in the critical judgment process was to compare the first-year evaluation effort approach with features of the local trial and field test and responsive evaluation approaches so that elements of the eclectic approach could be identified that would strengthen current evaluation efforts.
Selected Design Elements
to Strengthen Current Evaluation Efforts
When the first-year evaluation approach was compared with the local trial and field test and responsive evaluation approaches, information voids were identified that could be filled in the second-year evaluation effort. Specifically, responsive evaluation could collect, through the case-study method, qualitative
TABLE 4

ASSESSMENT OF ALTERNATIVE FORMATIVE EVALUATION
APPROACHES AGAINST EVALUATION STANDARDS

                                    Utility     Feasibility  Propriety   Accuracy
Alternative Formative             (Informa-   (Practical    (Rights of (Defensible  Total
Evaluation Approaches             tion Needs) Procedures    Human      Information  Points
                                              Given Con-   Subjects)  Sources)
                                              straints and
                                              Essential
                                              Features)

Clinical Approach (One-to-One)         1           1            5          3         10
Small-Group Trial                      3           3            5          3         14
Local Trial and Field Test             3           5            5          5         18
Four-Stage Formative                   5           1            5          5         16
Instructional Quality Inventory        1           1            5          5         12
Responsive Evaluation                  5           3            5          5         18

KEY: 5 = Satisfactorily met the criteria
     3 = Partially met the criteria
     1 = Minimally met the criteria
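The tally behind table 4 is a simple unweighted sum of the four standard ratings. A minimal sketch of that computation, with the ratings transcribed from the table:

```python
# Sketch of the table 4 tally: each approach is rated 5/3/1 against the
# four Joint Committee standards, and the ratings are summed.
ratings = {
    # approach: (utility, feasibility, propriety, accuracy)
    "Clinical Approach (One-to-One)": (1, 1, 5, 3),
    "Small-Group Trial": (3, 3, 5, 3),
    "Local Trial and Field Test": (3, 5, 5, 5),
    "Four-Stage Formative": (5, 1, 5, 5),
    "Instructional Quality Inventory": (1, 1, 5, 5),
    "Responsive Evaluation": (5, 3, 5, 5),
}

totals = {name: sum(scores) for name, scores in ratings.items()}
best = max(totals.values())
top = sorted(name for name, total in totals.items() if total == best)
print(top)  # the two approaches tied at 18 points
```

The tie at the top is what motivated the eclectic approach discussed in the text: the two highest-scoring approaches earn their points on different standards.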
contextual information that would be useful in helping to interpret the interim effects of the curriculum on students, teachers, and school-site staff. Interviews with teachers, students, and school administrators, in particular, would enhance the second-year evaluation effort. Moreover, classroom observations and document reviews would provide additional insights into the interim effects of the curriculum.
In essence, the first-year evaluation was similar to a local trial and field test approach. Instructional units were submitted to subject-matter experts for content review. Although small-group trials were not conducted, the instructional units were field tested in a variety of settings (comprehensive high schools, vocational schools, teachers with varying academic and work experiences, students with varying academic and work backgrounds, and so forth). Students were pretested before each unit and were administered a posttest and attitude questionnaire at the completion of each unit. For each unit, teachers completed a detailed forced-choice and open-ended questionnaire that allowed them to record their daily reactions and suggestions for improving the unit.
National Center staff believed that the student and teacher questionnaires could be strengthened by focusing their data collection activities on the essential features of the curriculum. A Likert-type format was proposed. The data collection approach proposed by National Center staff for strengthening the Principles of Technology formative evaluation is presented in appendix D. The data collection package was comprised of four major components: (1) a student reactionnaire; (2) a teacher reactionnaire; (3) a teacher unit daily log; and (4) case study guidelines for student, teacher, and administrator interviews, classroom observations, and document and record reviews. The total package was designed to fill the information voids of the first-year evaluation effort, given the project constraints and evaluation criteria.
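A Likert-type reactionnaire lends itself to the kind of simple descriptive summary used in the first-year analysis. The sketch below is purely illustrative; the item wordings and responses are hypothetical, not drawn from the actual instruments in appendix D.

```python
# Sketch: summarizing Likert-type reactionnaire items with descriptive
# statistics. Item wording and responses are hypothetical illustrations.
import statistics

# 1 = strongly disagree ... 5 = strongly agree
responses = {
    "The video segments helped me understand the unit": [4, 5, 3, 4, 4, 2, 5],
    "The math skills lab matched the lab activities":   [3, 2, 4, 3, 3, 2, 3],
}

for item, scores in responses.items():
    mean = statistics.fmean(scores)
    spread = statistics.stdev(scores)
    print(f"{item}: mean={mean:.2f}, sd={spread:.2f}, n={len(scores)}")
```

Item means and spreads of this kind let developers rank which essential features drew the weakest reactions and target those units for revision.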
CHAPTER 4
ASSESSMENT OF IMPLEMENTATION OF ESSENTIAL
EVALUATION DESIGN FEATURES
Assessment Design
National Center staff assessed the extent to which the curriculum evaluators adhered to the National Center staff's evaluation suggestions that were to be implemented in the second-year field test of the Principles of Technology curriculum. The assessment was guided by the four evaluation standards described by the Joint Committee on Standards for Educational Evaluation (1981). In essence, the assessment framework formed a matrix with the evaluation standards on one axis and the essential evaluation design features on the other axis. The assessment framework is depicted in figure 5. A description of each assessment topic area is presented in the following sections.
Utility
Information scope and selection. The audit should assess the extent to which the formative evaluation addressed the information needs of the curriculum developers. Was the information collected sufficient to support a judgment of worth and merit? Was the information useful? Were all relevant formative evaluation questions answered by the information collected from the data collection activities?
Clarity of information. The audit should assess the extent to which the formative evaluation information collected was understandable to the curriculum developers. Did the information respond clearly and without generalities to the formative evaluation objectives? Did the information provide a firm foundation for formative evaluation conclusions and recommendations? Was the information characterized by conciseness and logical development? In sum, did the information communicate to the formative evaluation audiences?
Timeliness. The audit should assess the extent to which the formative evaluation information was provided to the curriculum developers at the time when the information could best be used. Was the formative evaluation information received by the curriculum developers on scheduled time line dates? Were any curriculum revision decisions delayed as a result of late formative evaluation information? In sum, were the curriculum developers' most critical information needs met on time?
[Figure 5 is a matrix crossing the evaluation standards with the formative evaluation processes.

Evaluation standards (one axis):
- Utility: information scope and selection; timeliness
- Feasibility: practical procedures; political viability; cost-effectiveness
- Propriety: rights of human subjects; conflicts of interest; balanced reporting
- Accuracy: clearly defined object; appropriate information sources; adequacy of contextual information; purposes and procedures explicated

Formative evaluation processes (other axis): student questionnaire; teacher questionnaire; unit daily log; interviews; classroom observations; document and record reviews]

Figure 5. Sources of information about the feasibility, utility, accuracy, and propriety characteristics of the formative evaluation processes
Feasibility
Practical procedures. Audit interviews and classroom observations should assess the extent to which formative evaluation procedures suggested by National Center staff were realistic given the time and financial constraints inherent in the formative evaluation project. Specifically, were the National Center evaluation procedures a practical approach to the formative evaluation of the Principles of Technology curriculum? Did these procedures minimize school and classroom disruptions? Moreover, was the evaluation conducted on a daily basis without the presence of curriculum evaluators? Were teachers informed of the purpose and procedures of the evaluation effort? Were teachers instructed in how to implement the instructional units? Did the teachers feel that they were well prepared to teach the instructional units? Were the students informed of the evaluation purpose and procedures? Did the students feel that they were adequately prepared for participating in the formative evaluation?
Political viability. Audit interviews and classroom observations should assess the extent to which the formative evaluation procedures were politically viable given the various interest groups and stakeholders in the Principles of Technology curriculum project. In essence, the formative evaluation procedures were politically viable if the purpose of the evaluation was achieved despite pressures created by the formal and informal organizational power structures at the State and local levels. Specifically, did any political conflicts erupt over the evaluation effort? If so, what impact did these conflicts have on achieving the purpose of the evaluation?
Cost-effectiveness. The audit should assess the extent to which the evaluation was cost-effective. The evaluation should produce information of sufficient value to justify the expense of the evaluation. Were there alternative evaluation approaches that would produce more useful information at the same or less cost? Was the evaluation begun without a commitment of sufficient resources to ensure completion? For the information needed, was the evaluation conducted as economically as possible?
Propriety
Rights of human subjects. The audit should assess the extent to which the rights and welfare of human subjects were respected and protected. Were legal, ethical, and common sense considerations followed?
Conflicts of interest. The audit should assess the extent to which conflicts of interest in the formative evaluation effort were avoided. If conflicts of interest arose, how were they dealt with so that the evaluation process and results were not compromised?
Balanced reporting. The audit should assess the extent to which the formative evaluation was complete and fair in its presentation of strengths and weaknesses. Were both negative and positive aspects of the curriculum evaluation reported? Were any strengths and weaknesses manipulated to please partisan interest groups? Were any evaluation findings that might prove embarrassing to partisan interest groups omitted from pilot test findings?
Accuracy
Clearly identified object. The audit should assess the extent to which the object of the formative evaluation was clearly identified. Given project constraints, was the object of the evaluation described realistically? Were unique features of the object identified? Were the descriptions and unique features of the object valid characterizations of the object?
Appropriate information sources. The audit should assess the extent to which the formative evaluation provided adequate information to the curriculum developers. Were multiple information sources utilized? Were information sources tapped with a variety of methods, such as interviews, surveys, observations, and document reviews? In essence, to what extent did the information sources answer the evaluative questions?
Adequacy of contextual information. The audit should assess the extent to which contextual information was used in the interpretation of formative evaluation results. For example, the Principles of Technology curriculum developers should know whether a particular unit's success or failure was influenced by the students' socioeconomic status, teachers' background, students' academic background, school climate, teacher and administrator support or resistance toward the curriculum, and/or community support or apathy toward the curriculum. In sum, what contextual factors were examined, to what extent were these factors used in describing the contextual conditions of the formative evaluation effort, and how adequate was the information in helping to interpret the outcomes of the formative evaluation?
Purposes and procedures explicated. The audit should assess the extent to which the purposes and procedures of the formative evaluation were understood by those involved in the formative evaluation effort. Were the objectives of the evaluation clear? Were procedures on how information was collected, organized, analyzed, and reported clearly delineated? If the evaluation were to be replicated, are the purposes and procedures described in sufficient detail so that other evaluators could conduct a similar evaluation in other settings?
Furthermore, an overall assessment of the extent to which the essential features of the evaluation design were adhered to needed to be addressed. In essence, the overall assessment served as a summative evaluation of the information obtained from the discrete components of the design depicted in figure 4. Specifically, were the multiple data sources and data collection activities utilized? To what extent were the data collection procedures followed? Did all the field-test sites participate in the data collection activities? In sum, to what extent did the curriculum evaluators adhere to the essential evaluation features?
Data Collection
The data collection for the assessment of the evaluation suggestions proposed by the National Center was done through the review of appropriate information available at AIT headquarters, interviewing AIT evaluation staff members, observing Principles of Technology classes at two local PT pilot sites, interviewing a state-level liaison person, and interviewing students, administrators, and teachers in two local PT pilot sites. A copy of the topics/questions used to guide the interviews is provided in appendix F. In general, the interviews were unstructured, with the topical areas and questions serving as a guide to ensure that certain topics were covered in the interviews, observations, and document or record reviews.
The timing of this task (initiated after AIT/CORD had designed and implemented the formative evaluation) and the fact that Principles of Technology pilot teachers were generally behind schedule in implementing units 8-14 meant that it was almost impossible to secure meaningful feedback concerning National Center suggestions for changes in the formative evaluation effort. The case studies being conducted by the AIT staff were in the pilot stage at the time data for this report had to be collected.
Assessment Summary
After sharing the proposed second-year formative evaluation approach with AIT project staff, AIT responded with a revised version of the data collection instruments (appendix E). The specific changes made by the AIT project staff are noted as follows.
o Student Reactionnaire
--Unit Objectives - Items 1, 2, and 3 were reworded.
--Unit Content - Items 5, 7, and 11 were deleted.
--Unit Posttest - Entire domain deleted.
--Unit Video - Items 16, 18, and 19 were deleted. Two items were added.
--Unit Text - No changes.
--Laboratory Activities - Items 26 and 27 deleted.
--Unit Difficulty - Item 30 deleted. Item 33 reworded.
--Student Satisfaction - Items 36 and 37 deleted. Item 38 reworded.
o Teacher Reactionnaire
--Unit Objectives - Items 2, 4, and 5 deleted.
--Unit Instructional Activities - Items 12 and 13 deleted.
--Unit Content - Items 14, 16, 17, and 18 deleted. One item added.
--Unit Posttest - Entire domain deleted.
--Perceptions of Instructional Planning - Item 35 deleted.
Furthermore, the student and teacher reactionnaires were renamed as student and teacher questionnaires. After each domain item set, a general comments statement was added. No substantive changes were made to the unit daily log or the case study guidelines.
Case Studies
Information about the implementation of the case studies was based on a pilot test of the case study procedures by the AIT evaluation staff at a Principles of Technology site in a large local comprehensive secondary school.
Utility. The AIT evaluation staff considered the information collected through interviews, classroom observations, and document/record reviews to be very useful. The information collected added depth of understanding to that collected from the questionnaires. When data from the questionnaires and the case studies
were combined, the data were generally more understandable and more logical. Considerable data were generated in the pilot case study. If the information is to be utilized optimally by curriculum developers, it will need to be reduced to a concise format that communicates significant points easily. It may be necessary to concentrate the case study effort in the early part of the curriculum field test so that feedback can be provided to curriculum developers in a timely manner.
Feasibility. Time and financial constraints are inherent difficulties in conducting case studies. National Center staff had suggested six case studies be conducted. Within the limits of time and finances, the AIT staff developed plans to complete four case studies. The time and cost of conducting the case studies was judged to be worthwhile, considering the high value placed on the information produced by the case studies. The case studies were particularly valuable in that they permitted evaluators to probe various interest groups and stakeholders who did not receive questionnaires.
Propriety. All sites and individuals involved in case studies were assured anonymity. No legal or ethical concerns were expressed by case study participants. The individuals conducting the case studies stressed the nature of formative evaluation, thereby lessening the potential for conflicts arising regarding the evaluation activities. The case studies provided an excellent opportunity to present strengths and weaknesses of the PT curriculum. Furthermore, the case studies allowed the evaluators the flexibility of being able to triangulate information on site.
Accuracy. At the time this report was written, none of the case study reports had been completed. Based on National Center staff knowledge of the PT effort and case study methodology, it appeared that the material to be developed for the case study reports reflected the reality of the situation in the local sites.
Developer implementation. The developer (AIT) plans to conduct four case studies. National Center staff has recommended six case studies because of the enormous variation in teacher background, student recruitment, availability of laboratory equipment, and local or State support. Time and financial constraints prevented the developer from conducting six case studies.
National Center staff considered it of vital importance for all of the individuals planning to conduct the case studies to be involved in a pilot case study so that the staff could reach consensus on topical areas to be studied and the emphasis to be placed on various aspects of the study, and, most importantly, could become more aware of the interpretations they place on various pieces of data. Two AIT staff who were to be responsible
for conducting the case studies participated in a 5-day pilot of the case study procedures. The pilot effort enabled the two AIT staff members to reach agreement on most procedures and techniques to be used in succeeding case studies. Additional details were finalized through the use of telephone and mail communications.
Unit Daily Log, Teacher Questionnaire, and Student Questionnaire
Very few of the PT pilot sites completed all seven units scheduled for the first year. This meant that the first part of the second-year program, which should have been devoted to units 8-14, was devoted to units 6 and 7 in most situations. In order to maintain continuity, the same evaluation instruments used for units 1-7 during the first year of the PT pilot were used for units 6 and 7 that were taught during the second year of the PT pilot. The changes proposed by the National Center staff and adopted by AIT were to be used in the formative evaluation of units 8-14.
Because of the delays in starting unit 8 in most schools, there was minimal information available concerning implementation of the revised instruments at the time this report was written. The following information was obtained from AIT evaluation staff and local school site personnel who had completed unit 8.
Utility. The curriculum developers were generally pleased with the usefulness of information they received from the instruments. There may be some redundant information, but it was deemed important to have confirming information from students and teachers. The information was provided in a timely manner. Questionnaires completed by students and teachers were returned to the curriculum developers upon the completion of a PT unit. The timing difficulties appeared to be more a case of teachers not progressing through the units as rapidly as anticipated, rather than any difficulties associated with the formative evaluation of the curriculum.
Feasibility. National Center staff proposed several changes in the student questionnaire, teacher questionnaire, and the unit daily log. Earlier in this section, a summary was provided indicating AIT staff disposition of these suggestions. The changes in instrumentation were not perceived by teachers or students to be excessive in terms of classroom disruption. Teachers did indicate that completing the instruments was "a pain," but they understood and appreciated the need for such information. Teachers, administrators, and students were unaware of any problems caused by the evaluation. National Center staff did propose adding to the instruments a limited number of items dealing with teacher and instructional effectiveness as perceived by the students; however,
these items were eliminated by the AIT staff. A major shortcoming of the questionnaires was the lack of in-depth information about problems identified by the respondents. The case studies contribute substantially to correcting this situation by enabling AIT staff to collect in-depth information about PT curriculum problems. The cost of conducting the case studies has limited the number of sites and the time spent on-site, but the information gained from a limited number of case studies greatly enriches the formative evaluation data base.
Propriety. Respondents were assigned numbers, and AIT staff members exercised considerable caution in maintaining the anonymity of sites and individual respondents. Care was exercised so that excessive time would not be involved in completing the evaluation questionnaires. Open-ended questions allowed respondents the opportunity to identify strengths and weaknesses of the PT curriculum.
Accuracy. Administrators, teachers, and students were aware of and understood the purpose of the formative evaluation activities. As would be expected, some respondents were more enthusiastic than others regarding the evaluation activities, but there did not seem to be any threat to the accuracy of the evaluative responses. The formative evaluation objectives and activities were presented with sufficient clarity to permit their replication by other evaluators, should such a need arise.
Developer implementation. The AIT/CORD staff conducted a workshop for teachers designated to teach Principles of Technology. The evaluation activities and procedures were described during the workshop. In addition, teachers were mailed packets of questionnaires throughout the year. Each of these packets included explanatory material concerning the use of the instruments. Teachers were well informed of the purposes of the formative evaluation and of the procedures for administering the questionnaires.
Conclusions concerning the formative evaluation of the Principles of Technology curriculum materials are presented in the following chapter. Recommendations are also presented regarding the planning and implementation of formative evaluation activities.
CHAPTER 5
CONCLUSIONS AND RECOMMENDATIONS
The conclusions and recommendations presented in this report are based on task staff involvement with the formative evaluation of the Principles of Technology curriculum materials.
Assessment Conclusions
The following conclusions are based on (1) task staff involvement throughout the year with the formative evaluation of the Principles of Technology curriculum materials, (2) information collected from the ongoing AIT formative evaluation of the PT curriculum materials, and (3) information collected from AIT staff and local pilot site personnel as they were implementing the revised formative evaluation procedures and instruments.
o The AIT evaluation staff was receptive to suggestions for improving the formative evaluation effort, but it was extremely difficult or impracticable to make certain changes after the evaluation system had been operating for a full year.
o Individuals at State and local sites understood the need for the formative evaluation and were receptive to changes in the evaluation that would result in better information for curriculum revision. However, there is a limit to the amount of time that evaluators can reasonably expect local and State personnel to devote to evaluation purposes, particularly when teachers are a major source of evaluative information and also have the major responsibility for implementing a new curriculum.
o Among those individuals involved with the evaluation effort, there were no special concerns raised regarding legal, ethical, and common sense considerations.
o The most useful information collected by using the evaluation instruments came from open-ended questions addressed to teachers and students. The open-ended questions allowed teachers and students to explain why something was or was not working.
o The approach being used to evaluate the Principles of Technology curriculum materials provided a balance of positive and negative feedback to curriculum developers. The addition of case studies provided an opportunity for probing and in-depth description.
o In general, most individuals understood the major purpose of the formative evaluation and were willing to devote sufficient time so that the evaluation resulted in meaningful information for curriculum revision.
o The evaluation approach seemed to provide a sense of balance concerning the positive and negative aspects of the PT curriculum.
o The Principles of Technology curriculum was operating in complex and diverse settings. The formative evaluation approach did not provide sufficient information about the complexity of individual sites. The case studies partially remedied this situation, but their implementation did not occur until the second year of the field test of PT curriculum materials.
o It was probably a good public relations move to collect formative evaluation information from all PT sites. This enabled each State to develop a feeling of some ownership of the evaluation information. However, from the standpoint of cost and quality of data collected, it would have been more desirable to have used a small sample (e.g., one-third of sites) and do more in-depth probing at selected sites.
Recommendations for Formative Evaluation of Curriculum Efforts
The following recommendations are based on task staff involvement with the design for the formative evaluation of the Principles of Technology curriculum effort, previous evaluation efforts, and reviews of relevant literature.
o The designing of a formative evaluation effort should be done concurrently with the development of the curriculum and the development of strategies for field testing the curriculum.
o The purpose and nature of formative evaluation should be thoroughly explained to all concerned parties so that expectations for results do not overburden or undercut the evaluation effort.
o The information needed for curriculum revision should be clearly identified at the time the curriculum development is being planned so that sufficient consideration can be given to planning and implementing the evaluation effort.
o A combination of quantitative and qualitative data will most likely provide curriculum developers with the information they need for revision purposes.
o Evaluation procedures should be designed so that key individuals (e.g., students, teachers, and selected others) have an opportunity to provide information about the value of the curriculum.
o The curriculum developer/evaluator and the curriculum users (teachers, students) should make use of the telephone or other means that permit two-way communication. However, this should not be construed as an acceptable substitute for on-site, in-depth observations, interviews, and document/record reviews.
APPENDIX A
CONTENT REVIEW TEAM
CONTENT REVIEW TEAM

John Buschko
Technical Instructor
Intel Corporation
Chandler, AZ
Richard Cassel
Office of Press and Communications
Pennsylvania Department of Education
Harrisburg, PA

Joe Exline
Associate Director of Science
Virginia Department of Education
Richmond, VA

Darrell Parks, Director
Division of Vocational Education
Ohio Department of Education
Columbus, OH

Richard Patton
Coordinator of Instructional Materials Center
Oklahoma Department of Education
Stillwater, OK

Phillip Rollain
Project Coordinator
North Carolina Department of Public Instruction
Raleigh, NC

Don Torney
Superintendent of Youth Programming
TVOntario
Toronto, ON

Jim Wilson
Assistant Superintendent
Francis Tuttle Area VoTech Center
Oklahoma City, OK
PRINCIPLES OF TECHNOLOGY
Instructional Package for Unit 3: RATE
Reviewer Questionnaire
SECTION I - STUDENT HANDBOOK
1. What did you like most about the student handbook? Why?
2. What did you like least about the student handbook? Why?
3. Is the student handbook written at the appropriate level for 11th- and 12th-grade vocational students?
definitely        probably not
probably          definitely not

Comments:
4. Are the examples and illustrations used in the student handbook relevant to 11th- and 12th-grade vocational students?
definitely        probably not
probably          definitely not
5. Are there any errors or inaccuracies in the student handbook?
yes no
If yes, please specify, including page number:
6. Is the content of the math labs useful for this unit?
definitely        probably not
probably          definitely not
7. Is the content of the hands-on labs useful for this unit?
definitely        probably not
probably          definitely not
8. Do you have any other comments, concerns, or suggestions for the student handbook?
SECTION II - TEACHER'S GUIDE
1. Is the format of the teacher's guide (pages inserted next to the pertinent page of the student handbook) appropriate for the PRINCIPLES OF TECHNOLOGY course?
yes, definitely        no, probably not
yes, probably          no, definitely not
If no, what format modifications would you suggest?
2. Is the information presented in the teacher's guideaccurate?
yes no
If no, list inaccuracies including page numbers:
3. Will the teacher's guide enable a teacher to successfully implement the unit?
definitely        probably not
probably          definitely not
don't know

If not, why not?
4. Do you have any other comments, concerns, or suggestions for the teacher's guide?
SECTION III - VIDEO PROGRAMS
1. Which video program did you like the most? Why?
2. Which video program did you like the least? Why?
3. Are the examples used in the video programs relevant to 11th- and 12th-grade vocational students?
yes no
If no, list your concerns. Do you have suggestions for alternative examples? If so, please list below:
4. Are the examples used in the video programs useful for this unit?
yes no
If no, list your concerns and suggest alternative examples:
5. Is the content of the video programs accurate?

yes no
If no, list your concerns:
SECTION IV - OVERALL
1. Do the student handbook and video programs augment one another?
yes no
Comments:
2. Will the materials (student handbook and video programs) increase students' understanding of RATE?
definitely        probably not
probably          definitely not

Comments:
3. Overall, how would you compare the RATE unit to units 1 and 2?
better worse about the same
If worse, why?
4. In terms of their overall impact (instructional effectiveness, student interest, teacher manageability), rank each of the components of the RATE unit using the following scale:
A = Excellent
B = Good
C = So-so
D = Poor
E = Terrible
Place the letter corresponding to your ranking next to each component.
Student Handbook        hands-on labs
Video                   teacher's guide
math labs
Please explain any C, D, or E rankings:
5. Do you have any other comments, concerns, or suggestions forthe unit on RATE?
Name
Consortium Agency
PRINCIPLES OF TECHNOLOGY
UNIT 3: RATE

STUDENT ATTITUDE QUESTIONNAIRE

Sex: Female Male

Grade: 9 10 11 12
1. Overall, did you like the unit on RATE?
yes, a lot        no, not very much
yes, a little     no, not at all
2. What component did you like most in the RATE unit?
the written material    the hands-on labs
the video programs      no preference
the math labs
4. Overall, was the material that was covered in the RATE unit difficult for you to understand?
yes, most of the material was difficult for me to understand
yes, some of the material was difficult for me to understand
no, most of the material was not difficult for me to understand
5. Which component of the RATE unit was the most difficult for you to understand?
the written material    the hands-on labs
the video programs      no component was particularly difficult
the math labs
7. Do you think the material in the RATE unit is important for you to understand?
yes, very important       no, not very important
yes, sort of important    no, not at all important
8. Do you have any comments about the RATE unit?
THANK YOU!
PRINCIPLES OF TECHNOLOGY
STUDENT BACKGROUND QUESTIONNAIRE
Grade: 9 10 11 12
Sex: Female Male
1. What is your area of concentration in vocational education?
electronics                   metal working
hydraulics/pneumatics         general industrial arts
auto mechanics                other (please specify):
woodworking                   am not in vocational education
agriculture/horticulture      do not have an area of concentration
2. Which of the following math courses have you taken or are currently taking (check all that apply)?
general math    trigonometry
algebra         calculus
geometry        other (please specify):
3. Which of the following science courses have you taken or are currently taking (check all that apply)?
physical science    chemistry
general science     physics
biology             other (please specify):
4. What do you plan to do after high school?
get a job
go to a vocational school
go to a junior college
go to college
I don't know
THANK YOU!
PRINCIPLES OF TECHNOLOGY
UNIT III: RATE
STUDENT TEST
1. In the unifying definition for rate, rate is the relationship between
a. forcelike quantity and distance
b. work and elapsed time
c. displacementlike quantity and elapsed time
d. voltage and charge
In the left column are rates that may or may not exist in one of the energy systems described in the right column. On your answer sheet fill in the letter of the energy system that corresponds to the numbered rate.
2. Mass flow (m/t) A. Rate in mechanical energy systems
3. Current (q/t) B. Rate in fluid energy systems
4. Acceleration (v/t) C. Rate in electrical energy systems
5. Heat flow per elapsed time (H/t) D. Rate in thermal energy systems

E. Is not a rate in an energy system
6. Postal rate (cents/ounce)
7. Angular speed (θ/t)
8. When using the International System of Units (SI), mechanical rate is measured in

a. newton meters (N·m)
b. newtons per second (N/s)
c. feet per second (ft/s)
d. meters per second (m/s)
9. Linear speed is
a. distance an object travels along a line in a unit of time
b. always expressed in SI units
c. must include magnitude and direction
d. measured in radians per second
10. Which of the following is a linear rate?
a. helicopter blades in flight
b. hands moving on a clock
c. a jet airplane accelerating down a runway
d. all of the above
11. Which of the following involves an angular rate?
a. helicopter blades in flight
b. ferris wheel
c. hands moving on a clock
d. all of the above
12. An increase in linear speed during an elapsed time is described as
a. an average speed
b. an acceleration
c. a deceleration
d. a new linear speed
13. Angular speed is
a. always expressed in SI units
b. measured in radians per unit of time
c. distance an object travels along a line in a unit of time
d. a vector quantity
14. When changed to radians per second, 4 RPM is equal to (Note: Remember that 1 minute = 60 seconds and 2π radians = 360°.)
a. 0.210 rad/sec
b. 0.418 rad/sec
c. 0.636 rad/sec
d. 1.272 rad/sec
15. A unit of rate associated with a fluid system is
a. gallons per minute
b. kilograms per second
c. cubic feet per minute
d. all of the above
16. Volume flow rate is
a. volume moved per unit of time
b. determined by the length of the pipe
c. always expressed in SI units
d. a linear rate
17. A car air conditioner pumps 5 kg of freon gas through the system in 3 minutes. The mass flow rate of the freon gas is
a. .60 kg/min
b. 1.65 kg/min
c. 15 kg/min
d. not enough information given to calculate
18. One coulomb per second is a unit of
a. electrical charge
b. electrical voltage
c. electrical current
d. electrical resistance
An AC current changes direction at the rate of 120 times per second. Use this information to answer questions 19 and 20.
19. The frequency of the current is
a. 1/120 cycles per second
b. 1/60 cycles per second
c. 120 cycles per second
d. 60 cycles per second
20. The period of the current is
a. 1/120 seconds
b. 1/60 seconds
c. 120 seconds
d. 60 seconds
21. When an ammeter is placed in a circuit to measure current through a resistor, it is placed in __________ with the resistor
a. series
b. parallel
c. at right angles
d. none of the above
22. The term 60 Hz is an electrical
a. period
b. frequency
c. event
d. amplitude
23. Rate in an electrical system is
a. how fast charge flows through a conductor in a unit of time
b. electrical current in coulombs
c. only measured in DC current
d. all of the above
24. Heat flow rate is
a. calories per second
b. BTUs per minute
c. heat energy per unit of time
d. all of the above
25. In the English system, heat flow rate is stated in
a. calories/second
b. joules/second
c. BTUs/hour
d. all of the above
26. The amount of heat energy required to raise the temperature of a given body one degree is
a. latent heat
b. heat capacity
c. specific heat
d. sensible heat
27. The number of heat energy units required to raise a unit mass through a unit temperature difference defines
a. latent heat
b. heat capacity
c. specific heat
d. sensible heat
28. The heat capacity of 10 pounds of water is
a. 10 cal/C
b. 1 BTU/F
c. 10 BTU/hr
d. 10 BTU/F
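Item 28 follows directly from the definition in items 26 and 27: heat capacity = mass × specific heat. The sketch below assumes the standard specific heat of water in English units, 1 BTU/(lb·°F), which the item takes for granted:

```python
def heat_capacity(mass_lb, specific_heat_btu_per_lb_degF=1.0):
    # heat capacity = mass * specific heat; water is 1 BTU/(lb*degF)
    return mass_lb * specific_heat_btu_per_lb_degF

# 10 pounds of water: 10 BTU/degF (choice d)
print(heat_capacity(10))
```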
29. The calories per second of heat energy flowing through a window pane does not depend on
a. thickness of glass
b. weight of the glass
c. surface area of the glass
d. type of glass
30. Changing water at 90°C to steam at 100°C involves
a. latent heat
b. sensible heat
c. both latent and sensible heat
d. specific heat
PRINCIPLES OF TECHNOLOGY
TEACHER PERCEPTIONS OF STUDENT CHARACTERISTICS
1. How many students are in your PRINCIPLES OF TECHNOLOGY class?

2. Have you had any of these students in classes other than PRINCIPLES OF TECHNOLOGY? ___ yes ___ no

If yes, which classes?

3. Are the students in your PRINCIPLES OF TECHNOLOGY class all from one school district? ___ yes ___ no

If no, how many school districts do they come from? ___ 2 ___ 3 ___ 4 ___ 5 ___ more than 5

4. Please briefly describe how students were selected for your PRINCIPLES OF TECHNOLOGY class:
5. What is the ability level of your students? Please indicate the approximate number of students in each category:

___ above average ___ average ___ below average

Comments:

6. What is the socioeconomic level of your students? Please indicate the approximate number of students in each category:

___ upper class ___ upper middle class ___ lower middle class ___ lower class

7. What is the racial/ethnic composition of the class? Please specify the number of each:

___ White ___ Oriental
___ Black ___ American Indian
___ Hispanic ___ Other (please specify):
8. Please detail any other information about your class that you feel is pertinent.
PRINCIPLES OF TECHNOLOGY
UNIT III: RATE
TEACHER QUESTIONNAIRE
1. What did you like most about the RATE unit?
2. What did you like least about the RATE unit?
3. Overall, how would you compare the RATE unit to units 1 and 2?
better about the same worse
If worse, why?
4. In terms of their overall impact (instructional effectiveness, student interest, manageability), rank each of the components of the RATE unit using the following scale:

A = Excellent
B = Good
C = So-so
D = Poor
E = Terrible

Place the letter corresponding to your ranking next to each component.

___ student handbook ___ hands-on labs
___ videos ___ teacher's guide
___ math labs

Please explain any C, D, or E rankings and/or list any other comments you have about the components:
5. Which of your students seem to be the most successful in the PRINCIPLES OF TECHNOLOGY course?
___ above average ___ average ___ below average
6. Based on your experiences, do you think the 6-day plan, per subunit, of 50-minute class sessions is realistic for the subunits?

___ yes, definitely ___ no, probably not
___ yes, probably ___ no, definitely not
If no, please explain:
7. On average, how much time did you spend preparing to teach each class in the unit on RATE?

___ 0-30 minutes ___ 91-120 minutes
___ 31-60 minutes ___ 121-180 minutes
___ 61-90 minutes ___ 181 or more minutes
Comments:
8. Overall, did you feel comfortable teaching the materials in the unit on RATE?

___ yes, very comfortable ___ no, not very comfortable
___ yes, sort of comfortable ___ no, not at all comfortable

If no, please specify:
9. Do you think most of your students did the assigned readings at home?

___ definitely ___ probably ___ probably not ___ definitely not
Comments:
10. What, if anything, caused you the most problems in teaching the unit on RATE?
11. Do you feel the Teacher's Guide material provided you with enough information to help you successfully implement the unit?
12. Did you teach Unit III: RATE on consecutive days for 26 days?
yes no
If no, what pattern did you use (for example, 3 days a week)?
13. How much time per session did you teach?
___ 50 minutes or less ___ 61-90 minutes
___ 51-60 minutes ___ 91+ minutes
14. Did you combine any classes into one session (for example, teach classes C1 and C2 in one session)?
yes no
If yes, which classes did you combine?
15. How many physics courses did you take in college (undergraduate and graduate)?

___ none ___ 5-7
___ 2-4 ___ 8 or more

16. How many math courses did you take in college (undergraduate and graduate)?

___ none ___ 5-7
___ 2-4 ___ 8 or more
17. Do you have any other comments, concerns, or suggestions for the unit on RATE?
The following chart lists each activity for the unit on RATE down the left column. Since there are no materials specifically for the sub-unit review classes, these classes have not been listed in the chart. For each activity, you should respond to the following questions by circling "yes" or "no":

1. Was the material (readings, labs, or videos) appropriate for your students? Was the material at the right grade level? Was the amount of material appropriate for your students? For any no responses, please use the attached pages to describe your concerns.
2. Were you able to cover the material to your satisfaction in the 50-minute time period? (Since this question doesn't apply to the video, no response options have been provided for that column. Please do respond, however, to the other questions about the video.) For any no responses, please use the attached pages to describe why you could not complete the material and/or what you chose to delete.
3. Were there any errors or inaccuracies in the material?For any yes responses, use the attached pages to specifythe errors and recommended corrections.
4. Were there any problems managing the activity? For the labs, were all your students able to rotate through the labs? Did you experience any problems coordinating the labs? Did you experience any problems setting up or tearing down the labs? Did you experience any problems coordinating the activity? For any yes responses, use the attached pages to specify the problems you had and, if possible, suggest changes that you feel would enable you to more easily manage the material.
5. Do you have any suggested modifications for the material? For any yes responses, use the additional pages to specify your suggestions. Include in this section any "teaching tips"--special procedures you used or means you discovered to more easily convey the information to students. Include in this section any comments you may have for the Teacher's Guide.
We recommend that you take a few minutes each day to complete the chart and, most importantly, to write down your comments. If you need more space for comments, use the back of the comments pages and/or attach additional sheets.
UNIT III: RATE

                    Appropriate     Completed in    Any errors or    Any management   Any suggested
                    for students?   50 minutes?     inaccuracies?    problems?        modifications?

Overview
  Video             yes no          --              yes no           yes no           yes no
  C0                yes no          yes no          yes no           yes no           yes no

Mechanical Systems
  Video             yes no          --              yes no           yes no           yes no
  C1                yes no          yes no          yes no           yes no           yes no
  C2                yes no          yes no          yes no           yes no           yes no
  Math Lab          yes no          yes no          yes no           yes no           yes no
  3M1               yes no          yes no          yes no           yes no           yes no
  3M2               yes no          yes no          yes no           yes no           yes no

Fluid Systems
  Video             yes no          --              yes no           yes no           yes no
  C1                yes no          yes no          yes no           yes no           yes no
  C2                yes no          yes no          yes no           yes no           yes no
  Math Lab          yes no          yes no          yes no           yes no           yes no
  3F1               yes no          yes no          yes no           yes no           yes no
  3F2               yes no          yes no          yes no           yes no           yes no

Electrical Systems
  Video             yes no          --              yes no           yes no           yes no
  C1                yes no          yes no          yes no           yes no           yes no
  C2                yes no          yes no          yes no           yes no           yes no
UNIT III (continued)

  Math Lab          yes no          yes no          yes no           yes no           yes no
  3E1               yes no          yes no          yes no           yes no           yes no
  3E2               yes no          yes no          yes no           yes no           yes no

Thermal Systems
  Video             yes no          --              yes no           yes no           yes no
  C1                yes no          yes no          yes no           yes no           yes no
  C2                yes no          yes no          yes no           yes no           yes no
  Math Lab          yes no          yes no          yes no           yes no           yes no
  3T1               yes no          yes no          yes no           yes no           yes no
  3T2               yes no          yes no          yes no           yes no           yes no

Summary
  Video             yes no          --              yes no           yes no           yes no
Comments for Overview Class
Comments for Videos
Overview Video:
Mechanical Systems Video:
Fluid Systems Video:
Electrical Systems Video:
Thermal Systems Video:
Summary Video:
Comments for C1 Classes

Mechanical Systems C1:

Fluid Systems C1:

Electrical Systems C1:

Thermal Systems C1:
Comments for C2 Classes
Mechanical Systems C2:
Fluid Systems C2:
Electrical Systems C2:
Thermal Systems C2:
Comments for Math Lab Classes
Mechanical Systems Math Lab:
Fluid Systems Math Lab:
Electrical Systems Math Lab:
Thermal Systems Math Lab:
Comments for Lab 1 Classes
Mechanical Systems Lab 1:
Fluid Systems Lab 1:
Electrical Systems Lab 1:
Thermal Systems Lab 1:
Comments for Lab 2 Classes
Mechanical Systems Lab 2:
Fluid Systems Lab 2:
Electrical Systems Lab 2:
Thermal Systems Lab 2:
APPENDIX C
FORMATIVE EVALUATION TIME LINE
[Time line chart: calendar bars spanning 1984, 1985, and 1986 (months January-December each year). Activities shown: Funding; Materials Development; Materials Revisions; Video Production; Teacher Workshop 1; Consortium Meeting 1; Teacher Workshop 2; Pilot Test 1-7; Pilot Test 8-14; Consortium Meeting 2; Information Release 1-7; Information Release 8-14.]
APPENDIX D
THE NATIONAL CENTER'S SUGGESTED APPROACH
FOR THE SECOND-YEAR FORMATIVE EVALUATION
Unit Number: ____

PRINCIPLES OF TECHNOLOGY UNIT EVALUATION
STUDENT REACTIONNAIRE

PURPOSE:

Your answers to the questions on this form will help improve this Principles of Technology unit for other students. Your responses to the following items are appreciated.

DIRECTIONS:

Below is a list of statements. Please read each statement and answer whether you strongly agree, agree, disagree, or strongly disagree by checking the appropriate box. Place a check mark ( ) in only one box for each statement.
Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT OBJECTIVES
For this unit:
(1) The objectives were clear on what I was to learn . . ( ) ( ) ( ) ( )

(2) I knew what I needed to do to achieve the objectives . . ( ) ( ) ( ) ( )

(3) I knew when I had achieved a unit objective . . ( ) ( ) ( ) ( )
(4) What I learned and what I was supposed to learn (as stated by the objectives) were different . . ( ) ( ) ( ) ( )
UNIT CONTENT
The content of this unit:
(5) Was presented in a logical order . . ( ) ( ) ( ) ( )

(6) Had too much information . . ( ) ( ) ( ) ( )

(7) Used language difficult for me to understand . . ( ) ( ) ( ) ( )

(8) Gave examples helpful in understanding major concepts . . ( ) ( ) ( ) ( )

(9) Was difficult for me to understand . . ( ) ( ) ( ) ( )

(10) Will probably be useful in future jobs . . ( ) ( ) ( ) ( )

(11) Did not have enough examples . . ( ) ( ) ( ) ( )
UNIT POSTTEST

The posttest questions:
(12) Were covered in the unit instruction . . ( ) ( ) ( ) ( )

(13) Were clearly stated . . ( ) ( ) ( ) ( )

(14) Sampled important points in the unit . . ( ) ( ) ( ) ( )

(15) Were generally fair . . ( ) ( ) ( ) ( )
UNIT VIDEO

The video programs for this unit:

(16) Supported the text material . . ( ) ( ) ( ) ( )

(17) Used easy-to-understand graphics . . ( ) ( ) ( ) ( )

(18) Used language difficult for me to understand . . ( ) ( ) ( ) ( )

(19) Used theme music that I liked . . ( ) ( ) ( ) ( )
UNIT TEXT

The unit text materials:

(20) Helped me to achieve the unit objectives . . ( ) ( ) ( ) ( )

(21) Will be a useful reference after taking the course . . ( ) ( ) ( ) ( )

(22) Used language difficult for me to understand . . ( ) ( ) ( ) ( )

(23) Had enough examples to help me understand the important concepts . . ( ) ( ) ( ) ( )

LABORATORY ACTIVITIES

The unit lab activities:

(24) Helped me to achieve the unit objectives . . ( ) ( ) ( ) ( )

(25) Were difficult . . ( ) ( ) ( ) ( )

(26) Should not include math labs . . ( ) ( ) ( ) ( )
(27) Take too long to set up the equipment . . ( ) ( ) ( ) ( )

(28) Time periods were not long enough to complete the work . . ( ) ( ) ( ) ( )

(29) In general, were well liked . . ( ) ( ) ( ) ( )

UNIT DIFFICULTY

For this unit:

(30) I did not have enough time to do my work . . ( ) ( ) ( ) ( )

(31) Some of the things the teacher wanted me to learn were just too hard . . ( ) ( ) ( ) ( )

(32) I had trouble reading the unit text materials . . ( ) ( ) ( ) ( )

(33) The teacher gave me too much work to do . . ( ) ( ) ( ) ( )
STUDENT SATISFACTION

For this unit:

(34) I usually had a sense of satisfaction after leaving class each day . . ( ) ( ) ( ) ( )

(35) I did not like coming to class . . ( ) ( ) ( ) ( )

(36) My classmates felt good about what happened in class . . ( ) ( ) ( ) ( )

(37) I felt good about what happened in class . . ( ) ( ) ( ) ( )

(38) I would not recommend it to my friends . . ( ) ( ) ( ) ( )
Unit Number: ____

PRINCIPLES OF TECHNOLOGY UNIT EVALUATION:
TEACHER REACTIONNAIRE

PURPOSE:

The primary purpose of this reactionnaire is to provide teacher information for facilitating improvement of the Principles of Technology curriculum. Specifically, this form is designed to assess the appropriateness of Principles of Technology unit objectives and the appropriateness of the instructional activities, resources, curriculum content, posttests, and instructional planning needed to accomplish these objectives.

DIRECTIONS:

Below is a list of statements. Please read each statement and answer whether you strongly agree, agree, disagree, or strongly disagree by checking the appropriate box. Place a check mark ( ) in only one box for each statement.
Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT OBJECTIVES
The unit objectives:

(1) assist me in assessing student progress . . ( ) ( ) ( ) ( )

(2) are difficult for me to use . . ( ) ( ) ( ) ( )

(3) don't reflect what the unit is designed to teach . . ( ) ( ) ( ) ( )

(4) assist students in knowing what is expected of them . . ( ) ( ) ( ) ( )

(5) are worded too simplistically to be of value . . ( ) ( ) ( ) ( )
(6) help me determine what to teach . . ( ) ( ) ( ) ( )

(7) are presented in a logical order . . ( ) ( ) ( ) ( )
UNIT INSTRUCTIONAL ACTIVITIES
The unit instructional activities:
(8) are related to unit objectives . . ( ) ( ) ( ) ( )

(9) require more than one instructor to teach the unit content . . ( ) ( ) ( ) ( )

(10) are presented in the correct sequence . . ( ) ( ) ( ) ( )

(11) require a reading level that is too difficult for most students . . ( ) ( ) ( ) ( )

(12) do not have enough math labs . . ( ) ( ) ( ) ( )

(13) take too much time for most students to complete . . ( ) ( ) ( ) ( )
UNIT CONTENT
The unit content:
(14) was presented in a logical order . . ( ) ( ) ( ) ( )

(15) was too detailed . . ( ) ( ) ( ) ( )

(16) was too easy . . ( ) ( ) ( ) ( )

(17) was just right . . ( ) ( ) ( ) ( )
(18) used language that was difficult for most students to understand . . ( ) ( ) ( ) ( )

(19) contained examples that were helpful to students in their understanding of unit concepts . . ( ) ( ) ( ) ( )

(20) contained enough examples . . ( ) ( ) ( ) ( )

(21) provided enough summaries of important points . . ( ) ( ) ( ) ( )

(22) provided information that will be useful for students in their future employment . . ( ) ( ) ( ) ( )
UNIT POSTTEST

The unit posttest questions:

(23) asked different things than what had been taught . . ( ) ( ) ( ) ( )

(24) were clearly worded . . ( ) ( ) ( ) ( )

(25) covered all the important points in the unit . . ( ) ( ) ( ) ( )

(26) were generally fair . . ( ) ( ) ( ) ( )
PERCEPTIONS OF INSTRUCTIONAL PLANNING

(27) How many paid hours of planning (e.g., planning periods, after school) did you receive for planning and preparing materials for this unit?

Number of hours: ____
Is this amount of time adequate?

( ) Yes
( ) No, I need additional hours.
(28) Approximately how much time did you expect students to spend on homework each day for this unit?
( ) None
( ) About half an hour
( ) About one hour
( ) About two hours
( ) More than two hours
(29) What percentage of students typically completed your homework assignments for this unit?

( ) Fewer than 25%   ( ) 50% to 74%
( ) 26% to 49%       ( ) 75% and above
(30) Do you feel the Teacher's Guide material provided you with enough information to help you teach the unit?

( ) Definitely ( ) Probably ( ) Probably not ( ) Definitely not

(31) Did you teach this UNIT on consecutive days for 26 days?

( ) Yes ( ) No
If no, what pattern did you use (for example, 3 days a week)?
(32) How much time per class section did you devote to this UNIT?
( ) 50 minutes or less   ( ) 61-90 minutes
( ) 51-60 minutes        ( ) 91+ minutes
(33) Did you combine any of the unit's classes into one session (for example, teach classes C1 and C2 in one session)?
( ) Yes ( ) No
If yes, which classes did you combine?
(34) Based on your experiences, do you think the 6-day plan, per sub-unit, of 50-minute class sessions is realistic for this unit?
( ) Yes, definitely ( ) No, probably not
( ) Yes, probably ( ) No, definitely not
(35) On average, how much time did you spend preparing to teach each class in this unit?

( ) 1/4 hour   ( ) 1 hour
( ) 1/2 hour   ( ) 1 1/2 hours
( ) 3/4 hour   ( ) 2 hours or more
Comments:
(36) Overall, did you feel comfortable teaching the materials in this UNIT?

( ) Yes, very comfortable      ( ) No, not very comfortable
( ) Yes, sort of comfortable   ( ) No, not at all comfortable

If no, please explain:
(37) What, if anything, caused you the most problems in teaching this UNIT?
(38) What did you like the most in teaching this UNIT?
Unit: ____

PRINCIPLES OF TECHNOLOGY
UNIT DAILY LOG

PURPOSE:

The purpose of the Unit Daily Log is to provide a continuous source of teacher information for improving the Principles of Technology curriculum.
DIRECTIONS:
The left column on the following charts lists each activity for this particular unit. Since there are no materials specifically for the sub-unit review classes, these classes have not been listed in the chart.
For columns 1-5 on the attached charts:
(1) Circle "Y" (Yes) or "N" (No): Were the readings, labs, or videos appropriate (e.g., grade level, sufficient quantity of material) for your students? If not, specify modifications in column 5.
(2) Circle "Y" (yes) or "N" (No): Were you able to coverthe content in the 50-minute time period?
(3) Briefly describe any errors or inaccuracies in thematerial.
(4) Briefly describe any problems you had in managing the material. Among other things, this may include problems in coordinating lab rotations, lab set-up, and maintaining student interest.
(5) Briefly list suggestions for modifying the material. Describe "teaching tips" for future teachers.
We recommend that you take a few minutes each day to complete the charts. If you need more space for comments, use the back of the charts and/or attach additional pages.
Unit: ____

(1) Appropriate for students? (circle Y or N)
(2) Completed in 50-60 minutes? (circle Y or N)
(3) Did you discover any errors or inaccuracies? If so, please specify in the appropriate spaces below.
(4) Did you have any management problems? If so, please specify in the appropriate spaces below.
(5) Do you have any suggested modifications or general comments? If so, please specify in the appropriate spaces below.

                  (1)    (2)
OVERVIEW
  Video           Y N    --
  C0              Y N    Y N

MECHANICAL SYSTEMS
  Video           Y N    --
  C1              Y N    Y N
  C2              Y N    Y N
  Math Lab        Y N    Y N
  M1              Y N    Y N
  M2              Y N    Y N

FLUID SYSTEMS
  Video           Y N    --
  C1              Y N    Y N
  C2              Y N    Y N
  Math Lab        Y N    Y N
  F1              Y N    Y N
  F2              Y N    Y N

ELECTRICAL SYSTEMS
  Video           Y N    --
  C1              Y N    Y N
  C2              Y N    Y N
  Math Lab        Y N    Y N
  E1              Y N    Y N
  E2              Y N    Y N

THERMAL SYSTEMS
  Video           Y N    --
  C1              Y N    Y N
  C2              Y N    Y N
  Math Lab        Y N    Y N
  T1              Y N    Y N
  T2              Y N    Y N

SUMMARY
  Video           Y N    --
CASE STUDIES
PRINCIPLES OF TECHNOLOGY
Setting Boundaries
The primary purpose of the case studies is to provide information for facilitating improvement in the Principles of Technology curriculum. In view of this purpose, the investigation will be limited to secondary school sites participating in the Principles of Technology project. The major focus at each secondary school site will be the Principles of Technology class.
Selecting Sites
Purposeful sampling is to be done in selecting the sites. Six sites will be studied. In order to achieve a wide variety of conditions, sites will be selected according to the following criteria:

o exceptionally good programs
o typical programs
o geographical variation
  - areas of the country
  - rural, urban, suburban
o programs serving special needs groups
o convenience

Caution will be exercised by not selecting politically sensitive sites.
Once sites have been selected, appropriate approvals will be secured. The protocol network will be followed in securing approval to conduct the case studies. While the protocol may vary by state, in general, the procedure will involve the following:
o state agency approval
o local agency approval
o local teacher approval
Confidentiality will be assured with regard to the identification of state and local sites and individuals.
Developing Data Collection Procedures
Data will be collected by:
o Interviews - The interviews are designed to provide in-depth qualitative descriptions of Principles of Technology teacher, student, and administrator perceptions on the
following topics: objectives, content, learner activities, teacher knowledge and instructional techniques, posttests, instructional aids, equipment and materials, and an overall assessment of the Principles of Technology curriculum. In addition, where appropriate, interviews will be conducted with representatives of business/industry and members of citizen advisory committees.
o Observations - The observations are designed to provide in-depth qualitative descriptions of Principles of Technology classroom instruction.
o Document/Record Reviews - The reviews of printed or written accounts should provide a stronger contextual basis for understanding what has happened in the Principles of Technology classes.
o Analysis of Extant Data - The analysis of information from the student reactionnaire, teacher reactionnaire, the unit daily log, pretests, and posttests provides additional data input for the case studies.
Potential Topics and Subtopics for Guiding Interviews, Observations, Document/Record Reviews, and Data Recording Categories
These topics and subtopics can be used for focusing interviews, observations, document/record reviews, and the analysis of extant data. They can also serve as a framework for coding the data collected.
I. Objectives
A. Clarity of meaning
B. Clarified what was to be learned
C. Appropriate coverage of objectives in the affective, psychomotor, cognitive, and perceptual domains
D. Level of difficulty of objectives within each domain
II. Content

A. Logical order
B. Correct level of difficulty (depth)
C. Adequacy of amount of information
D. Appropriate language level
E. Meaningful examples
F. Sufficient examples
G. Clarity of directions
H. Relevance
III. Learner Activities
A. Relevancy to objectives and content
B. Appropriately sequenced
C. Appropriate number
D. Efficient use of time
E. Problems created
IV. Teacher

A. Adequacy of knowledge
B. Effectiveness of instructional techniques
C. Quick response to questions
D. Availability to help
E. Quality of student-teacher interaction
V. Posttest
A. Covered important points
B. Clearly worded
C. Fair
VI. Instructional Aids
A. Appropriate for objectives and content
B. Reinforced concepts
C. Interesting
VII. Equipment and Materials
A. Appropriate for objectives and content
B. Difficulty of use
C. Safety problems
D. Difficult to locate
E. Reasonable cost
F. Complexity of construction and/or fabrication
G. Instruction time required
VIII. General
A. What did you like most about the course?
B. What did you dislike most about the course?
C. Were parts of the course boring?
D. Were parts of the course confusing?
E. Amount of homework
F. Any additional comments about the course?
DESCRIPTION OF TOPICS AND SUBTOPICS (INTERVIEWS)
I. Objectives
Since the steps of the Principles of Technology instructional development process are driven by unit objectives, the first task is to determine the adequacy of these objectives. Interviews with teachers can provide reliable information about the appropriateness of the unit objectives in addition to providing evidence as to their worth.
Specifically, teachers can provide useful information about the objectives' clarity of meaning. For instance, did the teachers and students understand the objectives? Do the objectives accurately reflect what the student is supposed to do or know? Can the objectives be operationalized? Moreover, teachers can determine if the unit objectives appropriately covered the affective, psychomotor, and cognitive domains. Teachers can also provide useful insights about the level of difficulty of the objectives within each of these domains. In sum, interviews with Principles of Technology teachers should, in part, focus on obtaining information that can be used to judge the adequacy of unit objectives.
II. Content
During the teacher interviews, information about the content of each Principles of Technology unit should be obtained. Teachers can provide useful content analysis input about the appropriateness of each unit. For example, it would be useful to know if the teachers believe that the unit content is presented in a logical order. Moreover, do the teachers believe that the unit is at the correct level of difficulty? Is the language level of the units adequate? Insights into the adequacy of unit examples, graphics, and clarity of directions would also assist in the verification of unit content.
III. Learner Activities
During the teacher interview, information about the relevancy of the learning activities to unit objectives and content should be obtained. Are the math and hands-on laboratory activities appropriately sequenced? Is the number of learning activities appropriate? The teachers should be permitted to express their opinions about the effectiveness of the learning activities. Were the activities an efficient use of time? What classroom management problems did the teachers encounter with the learning activities? Teacher suggestions for improving the learning activities for each unit should also be obtained.
IV. Teacher
The teachers should be asked to comment on their knowledge of the unit subject matter. Is their knowledge adequate to teach the material? Teachers should be asked to comment on the effectiveness of the instructional techniques that are suggested in the teacher's guide. Moreover, given the math and hands-on laboratory activities, can the teachers respond quickly to student questions? Likewise, during the laboratory activities, are teachers able to give sufficient individual student help?
V. Posttest
Teachers should be asked if the posttest covered the important points of the unit. Did the posttest ask different things from what had been taught? Were the posttest items clearly worded, and were the items fair? Suggestions for improving the posttest should be obtained.
VI. Instructional Aids

Information about the appropriateness of the instructional aids should be obtained during the teacher interview. Were the aids appropriate for the objectives and content? Did the video, student workbook, math exercises, and lab demonstrations reinforce unit concepts? Did the instructional aids maintain student interest? Suggestions for improving instructional aids should be obtained.
VII. Equipment and Materials
Information about the appropriateness of laboratory equipment and instructional materials should be obtained during the teacher interview. Were the equipment and materials appropriate for accomplishing unit objectives? What procedural or management problems did the teachers encounter with the equipment and materials? Ask the teachers to note any safety problems they discovered with the equipment. Were needed equipment and laboratory materials available when it came time to use them? Information on the complexity of constructing laboratory set-ups and/or fabricating laboratory materials would be helpful in determining the difficulty of equipment and material use. Is the cost reasonable for unit equipment and materials? Was the time allotted for instruction sufficient to accomplish laboratory learning activities?
VIII. General
During the interview, teacher perceptions of what they liked and disliked most about the unit/course should be obtained. Teachers should note what parts of the unit/course were boring and confusing. Was the amount of homework required sufficient? Did students do their homework? Teachers should be permitted to make any additional comments about the unit or course as they see fit.
DESCRIPTION OF TOPICS AND SUBTOPICS (OBSERVATIONS)
I. Objectives
During the classroom observation, observer impressions of unit objectives should be noted. Did the students understand the importance of the objectives? Were the objectives clear as to what the students were supposed to do or learn? The observer should note if there was an appropriate coverage of objectives in the affective, psychomotor, and cognitive domains. Furthermore, within each of these domains, were the objectives appropriate as to their level of difficulty? Observer impressions of the usefulness of unit objectives in enhancing classroom learning should be noted.
II. Content
Observer impressions of the unit content should be noted. Specifically, did the unit appear to be at the correct level of difficulty? Did the language level appear to be appropriate for the majority of the students? Was the amount of information contained in the unit adequate? The observer should try to get a feel for the appropriateness of unit examples. Did the examples appear to be sufficient and meaningful? Moreover, did the students appear to understand directions that were stated by the teacher? Insights on the clarity of student workbook directions would be helpful as well.
III. Learner Activities
Observer impressions on the relevancy of learning activities to objectives and unit content should be noted. Were the learning activities appropriately sequenced? Was an appropriate number of activities planned? The observer should also note whether the learning activities were an efficient use of time. What problems did the instructor and students encounter during the learning activities that were observed?
105
113
IV. Teacher
Did the instructor appear to have an adequate amount of knowledgeof the subject area? Observer impressions of instructor's in-structional techniques should be noted. Was the instructor ableto respond quickly to student questions? Was the instructoravailable to help all students who needed help? Overall impres-sions of the instructor's command of the class may also be noted.
V. Posttest
Specifically, the observer should attempt to determine if the unit posttest covered important points taught in class. Observer impressions of whether the posttest is fair and clearly worded should also be noted.
VI. Instructional Aids

Impressions of whether the instructional aids (video, graphics, math problems) are appropriate for unit objectives and content should be noted. Are the students interested in the instructional aids? Do the instructional aids appear to reinforce unit concepts? The observer should note any problems the students or teachers encountered when using the instructional aids.
VII. Equipment and Materials
Observer impressions of the laboratory equipment and materials should be noted. Specifically, the observer should note if the equipment and materials are appropriate for unit objectives and content. Were the lab equipment and materials easy to set up and use by the students and teachers? The observer should note any safety problems that the equipment and materials may cause. Moreover, was the instruction time allotted for the laboratory activities sufficient? Overall impressions of the lab activities would be welcome.

VIII. General

Overall impressions of the classroom observation should be noted. What did the students like best about the unit? What did the students dislike most about the unit? Did the teacher or students appear to be interested? Were they bored? What parts of the unit did the students and teachers appear to find most interesting? Did the teacher or students become confused during the instruction period? If so, what caused the confusion? Any additional comments that the observer has about the classroom observation should be noted.
DESCRIPTION OF TOPICS AND SUBTOPICS (DOCUMENT/RECORD REVIEWS)
I. Objectives
Included in the document/record review should be the reviewer's impressions of the unit objective development process. Specifically, were the objectives developed by those knowledgeable about the subject matter needed by secondary vocational education students? Did the objective development process ensure that an appropriate coverage of unit objectives was included in the affective, psychomotor, and cognitive domains? Moreover, within each of these domains, were the objectives developed at an appropriate level of difficulty? Development procedures designed to clarify the meaning of each objective and the criteria used to determine what the student needs to know should be noted.
II. Content
The reviewer should examine content development procedures designed to ensure that unit content is presented in a logical order and at the correct level of difficulty and language level. What rationale did the developers follow to determine the adequacy of the amount of unit information? Reviewer impressions of each unit's clarity of directions, meaningfulness of unit examples, and the number of examples should be noted. In sum, what procedures did the curriculum developer follow to ensure appropriate unit content?
III. Learner Activities
The reviewer should examine the learning activity development process. What steps were taken by the developer to ensure that learner activities are relevant to unit objectives and content? Documentation that provides evidence that the learner activities are appropriately sequenced, are of an appropriate number, and are an efficient use of time should be noted. Records that indicate problems encountered with learner activities should be noted as well.

IV. Teacher

Records pertaining to teacher background should be reviewed. Do the teachers have adequate knowledge to teach Principles of Technology? Moreover, what evidence exists that teachers make effective use of instructional techniques?
V. Posttest
The reviewer should examine procedures used in developing posttest items. What steps did the developer take to ensure that posttest items covered important unit points, that items are clearly worded, and that items are fair? In sum, are the unit posttests valid representations of unit content?
VI. Instructional Aids
The reviewer should examine the instructional aid development process. Were steps taken by the developer to ensure that instructional aids are appropriate for unit objectives and content? Evidence that the instructional aids reinforce unit concepts and are interesting to the students and teachers should be noted.

VII. Equipment and Materials

The reviewer should ascertain whether the equipment and materials are appropriate for unit objectives and content. Specifically, records pertaining to difficulty of use, safety problems, availability, cost, construction complexity, and time required for effective instruction should be examined.
VIII. General
After reviewing documents and records pertaining to the Principles of Technology development process, the reviewer is encouraged to make general comments about the curriculum. For example, what did the reviewer like and dislike most about the curriculum? What parts of the units were boring or confusing? What is an appropriate amount of homework? The reviewer should feel free to make additional comments, specific or general, about the curriculum or the curriculum development process.
(SAMPLE)
INTERVIEW NOTE SHEET

SITE:
DATE:
TIME:
INTERVIEWER:

INTERVIEWEE:
NAME:
TITLE:
ORG.:
STREET:
CITY:
ZIP:
TELEPHONE:
INTERVIEW NOTES
CODE NARRATIVE
SITE:
TEACHER:
CLASS:
OBSERVER:
DATE:

CLASSROOM OBSERVATIONS

OBSERVATIONS (NARRATIVE)                  NOTES (feelings or impressions
                                          about what was done or said)

(This section should contain              (This section should contain
only the account of what is               the observer's interpretation
actually observed or said. No             of what was seen and/or heard.)
judgments or evaluations should
be listed on this side of the
observational record.)
SITE:
TEACHER:
CLASS:
REVIEWER:
DATE:

Document/Record Title:

CODE          INFORMATION
Organizing the Data
The case study data must be stored in a format that makes them easily retrievable. The coding system should be one which (1) has the data easily available for analysis, (2) is the least time consuming, (3) is easy to implement, and (4) is cost effective.

The coding system should be designed so that each piece of data will be classified. A scheme needs to be established for identifying the school, the evaluator collecting the information at the site, the type of person (teacher, etc.) being interviewed or observed, the kind of document/record reviewed, and the data collection procedure (interview, observation, etc.). Each interview, observation, or document/record reviewed should be assigned a sequential number.

Each bit of information should be recorded on a separate index card or be retrievable from a computer. These small pieces of data can then be shuffled around in order to look at the information from different perspectives and to detect emerging themes.
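As an illustration, the coding scheme described above can be sketched as a small program. The field names, abbreviations, and code format below are hypothetical assumptions; the report does not prescribe a specific notation:

```python
from dataclasses import dataclass

# Hypothetical abbreviations for informant type and data-collection
# procedure; these labels are illustrative, not taken from the report.
INFORMANT_TYPES = {"T": "teacher", "S": "student", "A": "administrator"}
PROCEDURES = {"I": "interview", "O": "observation", "D": "document/record review"}

@dataclass
class CaseRecord:
    """One piece of case study data (one 'index card')."""
    school: str      # school identifier, e.g., "SCH01"
    evaluator: str   # evaluator collecting the data, e.g., "EV2"
    informant: str   # key into INFORMANT_TYPES
    procedure: str   # key into PROCEDURES
    sequence: int    # sequential number assigned at the site
    narrative: str = ""

    def code(self) -> str:
        """Build a retrieval code such as 'SCH01-EV2-T-I-007'."""
        if self.informant not in INFORMANT_TYPES:
            raise ValueError(f"unknown informant type: {self.informant}")
        if self.procedure not in PROCEDURES:
            raise ValueError(f"unknown procedure: {self.procedure}")
        return (f"{self.school}-{self.evaluator}-"
                f"{self.informant}-{self.procedure}-{self.sequence:03d}")

# Coded records can then be filtered ("shuffled") to view the data from
# different perspectives, e.g., all teacher interviews at one site.
records = [
    CaseRecord("SCH01", "EV2", "T", "I", 7, "Teacher liked the video segments."),
    CaseRecord("SCH01", "EV2", "S", "O", 8, "Students struggled with lab setup."),
]
teacher_interviews = [r for r in records if r.informant == "T" and r.procedure == "I"]
```

A flat code of this kind satisfies the four criteria above: it sorts and filters cheaply, and each component can be read off without consulting the full record.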
Staff Training
It is crucial that a training session be conducted for all staff to be involved in the case studies. The first step in designing staff training is to identify the needs of the staff. The following questions might be used to identify and prioritize staff training needs.
STAFF TRAINING NEEDS
Rank Order      Should staff training include:

____  1) Background information on the purpose of the case study?             [ ] [ ]
____  2) Clear definitions of terms and concepts to be used?                  [ ] [ ]
____  3) Discussion of what to do before entering the site?                   [ ] [ ]
____  4) Discussion of what to do when on site?                               [ ] [ ]
____  5) Procedures to be used for the case study?                            [ ] [ ]
____  6) An explanation of the coding system?                                 [ ] [ ]
____  7) Discussion/review of interview, observation, and
         document/record review methods?                                      [ ] [ ]
____  8) Practice with interview, observation, and document/record
         review methods through structured activities (e.g., role playing)?   [ ] [ ]
____  9) Practice with interview, observation, and document/record
         reviews in the field?                                                [ ] [ ]
____ 10) Practice using the coding system?                                    [ ] [ ]
____ 11) Opportunity to modify
         a) case study procedures?                                            [ ] [ ]
         b) coding system?                                                    [ ] [ ]
         c) persons to be interviewed?                                        [ ] [ ]
____ 12) ____________________________________________                         [ ] [ ]
____ 13) ____________________________________________                         [ ] [ ]
____ 14) ____________________________________________                         [ ] [ ]
A staff training session will probably cover a minimum of 3 days. Following is a sample staff training agenda.
SAMPLE STAFF TRAINING AGENDA
DAY 1
AM: Background and purpose of case studies
    Review of site selection process
    Review of schedule and work assignments
    Discussion of logistics
    Discussion of topics and subtopics
    Discussion of coding system, including coding exercise

PM: Interviewing procedures and techniques
    Practice interviewing (role playing)
    Observation procedures and techniques
    Practice observing (role playing)
    Reviewing documents and records
    Exercise on document/record review
    Using extant data
DAY 2
AM: Practice interviewing, observing, and document/record reviews at sample site

PM: Write up field notes and code information

DAY 3

AM: Evaluate field activity
    Conduct data analysis to detect emerging trends, issues, problems

PM: Review procedures and revise, if needed
    Review of assignments
Logistics of Fieldwork Operations and Data Collection
Following is a potpourri of do's and don'ts as one gets ready to enter the field.

Scheduling. Interviews and observations should be scheduled well in advance of the site visit. If this is accomplished by telephone, there should be a follow-up confirming letter. Daily schedules should be developed, remembering that at least as much time is needed to write up notes and code information as is needed to collect it.

Selecting interviewees, observations, and documents/records. Selection should be made by the evaluators. Site personnel may make suggestions, but the final decisions should be made by the evaluators.

Recording responses. Decisions should be made about whether written notes or tape recording is required.

Participation. The primary concern is to collect data. The amount of participation should be such that the evaluator fits comfortably into the setting without disturbing or interfering with normal operations.
When in Rome . . . . Learn the language or jargon used at the site. This is essential if the evaluator is to be able to pick up on insights and clues as to how people at the site view and define situations, events, and things.

Interviewing suggestions. The following suggestions are made for conducting the interviews:

o Use an open-ended interviewing technique. To the extent possible, let the interviewee introduce what he/she considers relevant. The interviewer should set the context and then conduct the interview as a dialogue, not a question-by-question process of acquiring information from the interviewee.

o Establish a context for interviewing.
  - Secure a private room for the interview.
  - Assure confidentiality.
  - Try to avoid group interviews.
  - Be ready when the interviewee enters the room. Have note pad, pens, recorder, and handouts ready.
  - Set a tone that conveys the importance of the interview.
  - Keep the interview within the prearranged time frame.

o Use successful interviewing techniques.
  - Be a good listener.
  - Pursue in detail; seek examples and illustrations.
  - Take notes rapidly.
  - Be attentive.
  - Seek clarification.
  - Keep questions brief.
  - Begin the interview with simpler questions.
  - Do not breach confidentiality.

o Potential probes:
  - Pause.
  - Repeat the question or part of the question.
  - "What do you mean?" or "How do you mean?"
  - "Would you tell me more about your thinking on that?"
  - "What do you think?" or "What do you expect?"
  - "Which would be closer to the way you feel?" or "Which would be closer?"
  - "Are there any other reasons why you feel that way?"
o Neutral prefaces:
  - "Of course no one knows for sure . . ."
  - "Of course there are no right or wrong answers . . ."
  - "Of course there are no right or wrong answers; we're interested in people's opinions . . ."
  - "We're just interested in what you think . . ."
  - "We all hope, but . . ."
  - "Let me repeat the question . . ."
  - "Well, in general . . ."
  - "Generally speaking . . ."
  - "Overall . . ."
  - "In the Principles of Technology project as a whole . . ."
  - "Yes, but . . ."

o Carefully record data.
  - Take extensive notes and/or record the interview.
  - Expand and complete notes as soon as the interview is completed.
  - Code notes.
  - Type and edit notes.

Observation suggestions. The following suggestions are made for conducting the classroom observations:

o The observations should be of a nonparticipatory nature. The teacher and students should be aware that they are being observed, but the observer should not become a member of the group in the sense that he/she has a stake in the group outcomes.

o Observation involves inspection and contemplation of what one sees and hears. This necessitates an understanding of the context in which the Principles of Technology class operates and a recognition of the emotions and feelings shaping what one sees and hears.

o Extensive notes should be taken during the observation period.

o Following the observations, the notes should be expanded and coded according to the topic and subtopic areas.

o Be especially aware that the rights and privacy of observees are not sacrificed.
Document/record review suggestions. The following suggestions are made for reviewing documents and records.

o Examples of records and documents that might be reviewed include audits, supervisory reports, classroom expenses, tests, test results, teacher and student evaluations of Principles of Technology units, letters, memoranda, newspaper articles, tapes of radio broadcasts about the class, editorials, teacher-generated materials, and so forth.

o Documents and records frequently provide a valuable sense of the context in which a program or class operates. They should be reviewed prior to conducting observations and interviews.

o "What is said," "How it is said," and "When it is said" are important considerations in all data collection, including document and record review.

o The reviewer needs to be cognizant of data that are personal, private, or classified. Caution must be exercised so as not to create special ethical problems.
Analyzing Data
The evaluator in the field begins the data analysis process as soon as the first data are collected. The evaluator will be looking for recurrent themes and areas where a greater concentration of effort is required. As data are collected and simultaneously analyzed, the evaluator will find that inferences will be drawn, new questions raised, and themes developed that will adjust the scope, focus, and schedule of interviews, observations, and document/record reviews accordingly. Throughout the data collection phase, the data are continuously triangulated through various sources, kinds of data, and so forth.

It is important that the data be analyzed while they are fresh and clear in the evaluator's mind. This process involves the evaluator reading through all of the data in order to become familiar with the information obtained. This reading of all of the information collected is essential to confirm patterns and themes that emerged as the data were being collected and to identify new patterns and themes.
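The search for recurrent themes described above can be sketched as a simple tally over coded field notes. The topic labels and the threshold value here are illustrative assumptions, not part of the original procedure:

```python
from collections import Counter

# Illustrative sketch: tally the topic codes attached to field notes to
# surface recurring themes while data collection is still under way. The
# topic codes (e.g., "VI-aids") are hypothetical labels for the topics
# and subtopics discussed earlier.
def recurring_themes(coded_notes, threshold=2):
    """Return topic codes that occur at least `threshold` times,
    most frequent first."""
    counts = Counter(code for codes in coded_notes for code in codes)
    return [code for code, n in counts.most_common() if n >= threshold]

# Three field notes, each tagged with the topics it touches on.
notes = [
    ["II-content", "VI-aids"],
    ["VI-aids"],
    ["III-activities", "VI-aids"],
]
print(recurring_themes(notes))  # instructional aids recur across all three notes
```

A theme that clears the threshold signals an area where, as noted above, a greater concentration of interviewing or observation effort may be warranted.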
Reporting the Findings
The individual(s) responsible for collecting the data at a given site should be responsible for writing the report for that site. The individual(s) who have been on-site will have a much better feel for the data and will be able to provide a better sense of reality. The report writer may find the following helpful:
REPORTING THE FINDINGS
Did I:                                                                  Yes  No

1) Explain the background (e.g., purpose, limitations)?                 [ ]  [ ]
2) Describe the site completely?                                        [ ]  [ ]
3) Use accurate, detailed descriptions of actions and behaviors?        [ ]  [ ]
4) Report word-for-word statements where appropriate?                   [ ]  [ ]
5) Report whether interaction with informants was effective and
   whether informants were reliable?                                    [ ]  [ ]
6) Use data from all possible sources (e.g., interviews,
   observations, record/document reviews, extant data, other)?          [ ]  [ ]
7) Note whether the evaluator's presence and/or actions created
   disruptive or unnatural conditions?                                  [ ]  [ ]
8) Note whether the evaluator's preformed opinions unduly
   influenced data interpretation?                                      [ ]  [ ]
9) Check that interpretations flow directly from the data?              [ ]  [ ]
Enhancing case study report utilization. Potential for utilization should be a concern throughout the design, implementation, and reporting of the case study. Some suggestions for enhancing the probability of using case study findings are:

o The people for whom the information is intended should be involved in the initial stages of the planning. It is essential that they be involved in clarifying what their information needs are.

o Develop summary reports oriented to the needs of specific audiences.

o Conduct meetings with potential users so that evaluators and potential users can interact about various aspects of the report.
APPENDIX E
SECOND-YEAR INSTRUMENTS USED BY AIT/CORD
Unit #

Principles of Technology Unit Evaluation
Student Questionnaire

Grade ______    Sex ______    Student ID # ______
PURPOSE
Your honest answers to the questions on this form will help improve this Principles of Technology unit for other students. Your responses to the following items are appreciated.
DIRECTIONS
Below is a list of statements. Please read each statement and answer whether you strongly agree, agree, disagree, or strongly disagree by checking the appropriate box. Place a check mark [✓] in only one box for each statement.

Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT OBJECTIVES
For this unit:

1. the objectives helped me understand what I was supposed to learn    [ ] [ ] [ ] [ ]

2. I'm glad the objectives are printed in the written materials    [ ] [ ] [ ] [ ]

3. I used the objectives to guide me through the material    [ ] [ ] [ ] [ ]
General comments on Unit Objectives:
UNIT CONTENT
The content of this unit
4. had too much information    [ ] [ ] [ ] [ ]

5. gave examples helpful in understanding concepts    [ ] [ ] [ ] [ ]

6. was too difficult for me to understand    [ ] [ ] [ ] [ ]

7. will probably be useful in future jobs    [ ] [ ] [ ] [ ]
General comments on Unit Content:
Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT VIDEO
The video programs for this unit
8. helped me to better understand the text material    [ ] [ ] [ ] [ ]

9. were interesting    [ ] [ ] [ ] [ ]

10. used easy-to-understand graphics    [ ] [ ] [ ] [ ]

11. helped me to achieve the unit objectives    [ ] [ ] [ ] [ ]
General Comments on Unit Video:
UNIT TEXT
The unit text materials:

12. helped me to achieve the unit objectives    [ ] [ ] [ ] [ ]

13. will be a useful reference after taking the course    [ ] [ ] [ ] [ ]

14. used language difficult for me to understand    [ ] [ ] [ ] [ ]

15. had enough examples to help me understand the important concepts    [ ] [ ] [ ] [ ]

16. helped me to understand the unit concepts    [ ] [ ] [ ] [ ]
General comments on Unit Text
LABORATORY ACTIVITIES
The unit lab activities:

17. helped me achieve the unit objectives    [ ] [ ] [ ] [ ]

18. were difficult    [ ] [ ] [ ] [ ]

19. time periods were not long enough to complete the work    [ ] [ ] [ ] [ ]
Strongly Agree    Agree    Disagree    Strongly Disagree

20. in general, I liked them    [ ] [ ] [ ] [ ]
General comments on Laboratory Activities:
UNIT DIFFICULTY
For this unit:

21. some of the things we were expected to learn were just too hard    [ ] [ ] [ ] [ ]

22. I had trouble reading the unit text materials    [ ] [ ] [ ] [ ]
General comments on Unit Difficulty:
STUDENT SATISFACTION
For this unit:

23. I usually had a sense of satisfaction after leaving class each day    [ ] [ ] [ ] [ ]

24. I did not like coming to class    [ ] [ ] [ ] [ ]

25. I would recommend it to my friends    [ ] [ ] [ ] [ ]
General comments on Student Satisfaction:
Unit #
Teacher ______

Principles of Technology Unit Evaluation
Teacher Questionnaire

PURPOSE
The primary purpose of this questionnaire is to provide teacher information for facilitating improvement of the Principles of Technology curriculum. Specifically, this form is designed to assess the appropriateness of Principles of Technology unit objectives and the appropriateness of instructional activities, resources, curriculum content, and instructional planning needed to accomplish these objectives.
DIRECTIONS
Below is a list of statements. Please read each statement and indicate whether you strongly agree, agree, disagree, or strongly disagree by checking the appropriate box. Place a check mark [✓] in only one box for each statement.

Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT OBJECTIVES
The unit objectives:
1. assist me in assessing student progress    [ ] [ ] [ ] [ ]

2. don't reflect what the unit is designed to teach    [ ] [ ] [ ] [ ]

3. help me determine what to teach    [ ] [ ] [ ] [ ]

4. are presented in logical order    [ ] [ ] [ ] [ ]
General comments on Objectives:
UNIT INSTRUCTIONAL ACTIVITIES
The unit instructional activities:
5. are related to unit objectives    [ ] [ ] [ ] [ ]

6. are presented in the correct sequence    [ ] [ ] [ ] [ ]

7. require a reading level that is too difficult for most students    [ ] [ ] [ ] [ ]
General comments on Instructional Activities:
19. Based on your experiences, do you think the 6-day plan, per sub-unit, of 50-minute class sessions is realistic for this unit?

[ ] Yes, definitely    [ ] No, probably not
[ ] Yes, probably      [ ] No, definitely not

20. Overall, did you feel comfortable teaching the materials in this unit?

[ ] Yes, very comfortable       [ ] No, not very comfortable
[ ] Yes, sort of comfortable    [ ] No, not at all comfortable
If no, please explain:
21. What, if anything, caused you the most problems in teaching this unit?
22. What did you like the most in teaching this unit?
General comments on Perceptions of Instructional Planning:
23. Did you teach this unit on consecutive days for 26 days?
[ ] Yes    [ ] No
If no, what pattern did you use (for example, 3 days a week)?
24. How much time per class session did you devote to this unit?
[ ] 50 minutes or less    [ ] 61-90 minutes
[ ] 51-60 minutes         [ ] 91+ minutes
25. Did you combine any of the unit's classes into one session (for example, teach classes C1 and C2 in one session)?
[ ] Yes [ ] No
If yes, which classes did you combine?
Strongly Agree    Agree    Disagree    Strongly Disagree
UNIT CONTENT
The unit content:
8. was too detailed    [ ] [ ] [ ] [ ]

9. was presented at the proper level of difficulty    [ ] [ ] [ ] [ ]

10. contained examples that were helpful to students in their understanding of unit concepts    [ ] [ ] [ ] [ ]

11. contained enough examples    [ ] [ ] [ ] [ ]

12. provided enough summaries of important points    [ ] [ ] [ ] [ ]

13. provided information that will be useful for students in their future employment    [ ] [ ] [ ] [ ]
General comments on Unit Content
PERCEPTIONS OF INSTRUCTIONAL PLANNING
14. On average, how many hours did you spend preparing to teach each lesson in this unit? ______

15. Approximately how much time, on average, did you expect students to spend on homework each day for this unit?

[ ] None                  [ ] About one hour
[ ] About 15 minutes      [ ] Two hours or more
[ ] About half an hour

16. What percentage of students typically completed your homework assignments for this unit?

[ ] Fewer than 25%    [ ] 50-74%
[ ] 26-49%            [ ] 75% and above

17. What percentage of students typically completed your homework assignments for other courses that you teach?

[ ] Fewer than 25%    [ ] 50-74%
[ ] 26-49%            [ ] 75% and above

18. Do you feel the Teacher's Guide material provided you with enough information to help you teach the unit?

[ ] Definitely    [ ] Probably    [ ] Probably not    [ ] Definitely not

If not, what should be added to the guide to make it more useful?
26. Were there any special circumstances which, in your opinion, may have influenced the pre- or posttest scores? (e.g., lack of time, faulty lab equipment, school holidays, fire drill in the middle of the exam)
General Comments:
Unit #
Teacher ID #

Principles of Technology
Unit Daily Log

PURPOSE

The purpose of the Unit Daily Log is to provide a regular source of teacher information for improving the Principles of Technology curriculum.
DIRECTIONS
The left column on the following charts lists each activity for this particular unit. Since there are no materials specifically for the sub-unit review classes, these classes have not been listed in the chart.
For columns 1-5 on the attached charts:
1. Circle "Y" (yes) or "N" (no): Were the readings, labs, or videos appropriate (e.g., grade level, sufficient quantity of material) for your students? If not, specify modifications in column 5.

2. Circle "Y" (yes) or "N" (no): Were you able to cover the content in the 50-minute time period?

3. Briefly describe any errors or inaccuracies in the material.

4. Briefly describe any problems you had in managing the material. Among others, this may include problems in coordinating lab rotations, lab set-up, and maintaining student interest.

5. Briefly list suggestions for modifying the material. Describe "teaching tips" for future teachers.

We recommend that you take a few minutes each day to complete the charts. If you need more space for comments, use the back of the charts and/or attach additional pages.
Unit 9

Column headings:

[1] Appropriate for students? (Circle Y or N)
[2] Completed in 50-60 minutes? (Circle Y or N)
[3] Did you discover any errors or inaccuracies? If so, please specify in the appropriate spaces below.
[4] Did you have any management problems? If so, please specify in the appropriate spaces below.
[5] Do you have any suggested modifications or general comments? If so, please specify in the appropriate spaces below.

                      [1]      [2]
OVERVIEW
  Video               Y N      Y N

WAVE CHARACTERISTICS
  Video               Y N      Y N
  C1                  Y N      Y N
  C2                  Y N      Y N
  Math Lab            Y N      Y N
  Lab 9-1             Y N      Y N
  Lab 9-2             Y N      Y N

WAVE APPLICATIONS
  Video               Y N      Y N
  C1                  Y N      Y N
  C2                  Y N      Y N
  Math Lab            Y N      Y N
  Lab 9-3             Y N      Y N
  Lab 9-4             Y N      Y N

(Columns [3], [4], and [5] provide blank space beside each activity for written comments.)
APPENDIX F
ASSESSMENT INTERVIEW GUIDE
SUGGESTED AUDIT INTERVIEW QUESTIONS

Key:
  SQ  = Student Questionnaire
  TQ  = Teacher Questionnaire
  UDL = Unit Daily Log
  I   = Interviews
  CO  = Classroom Observations
  DR  = Document Reviews

  S   = Student Interview
  T   = Teacher Interview
  SA  = School Administrator Interview
  AIT = Agency for Instructional Technology Representative Interview

Code:
  U  = Utility
  F  = Feasibility
  P  = Propriety
  A  = Accuracy
  DI = Developer Implementation

  Example: U.2 = Clarity of Information
           F.1 = Practical Procedures
           P.3 = Balanced Reporting
UTILITY (U)
1. Information Scope and Selection (T, AIT)

A. To what extent did the SQ, TQ, UDL, I, CO, DR address the information needs of the curriculum developers?

B. Was the information collected by the SQ, TQ, UDL, I, CO, DR sufficient to support a judgment of merit or worth? If so, to what extent? If not, what were the shortcomings?

C. To what extent was the information collected by the SQ, TQ, UDL, I, CO, DR useful? To what extent was the information collected redundant or meaningless?

D. To what extent were the formative evaluation questions answered by the information collected from the six data collection activities?
2. Clarity of Information (AIT)
A. To what extent was the information collected by SQ, TQ, UDL, I, CO, DR understandable to the curriculum developers?

B. To what extent did the information collected by SQ, TQ, UDL, I, CO, DR respond clearly and without generalities to the formative evaluation objectives?
C. To what extent did the information collected by SQ, TQ, UDL, I, CO, DR provide a firm foundation for formative evaluation conclusions and recommendations?

D. To what extent was the information collected by the six data collection activities characterized by conciseness and logical development?

E. In sum, how well did the information collected by the six data collection activities communicate to the formative evaluation audiences?

3. Timeliness (AIT)

A. To what extent was the information collected by SQ, TQ, UDL, I, CO, DR received by the curriculum developers on scheduled timeline dates?

B. How many curriculum revisions were delayed as a result of late formative evaluation information collected by SQ, TQ, UDL, I, CO, DR? What were the causes of these delays?

C. In sum, to what extent were the curriculum developers' most critical information needs met on time?
FEASIBILITY (F)
1. Practical Procedures (T, A, AIT)

A. To what extent were the SQ, TQ, UDL, I, CO, DR suggestions made by National Center staff for strengthening the formative evaluation realistic, given the time and financial constraints inherent in the formative evaluation project?

B. Were the SQ, TQ, UDL, I, CO, DR evaluation procedures a practical approach to the formative evaluation of the Principles of Technology curriculum? To what extent were school and classroom disruptions minimized?

2. Political Viability (AIT)

A. To what extent were the evaluation procedures politically viable given the various interest groups and stakeholders of the Principles of Technology curriculum project?

B. Did any political conflicts erupt over the evaluation effort? If so, how were these conflicts handled? What impact did these conflicts have on achieving the purpose of the evaluation?
3. Cost Effectiveness (AIT)

A. To what extent did the evaluation procedures produce information of sufficient value to justify the expense of the evaluation?

B. Were there alternative evaluation approaches that would produce more useful information at the same or less cost? If so, please describe.

C. Was the second-year evaluation effort begun without a commitment of sufficient resources to ensure completion? If so, what data collection activities had to be compromised?

D. For the information needed, to what extent was the evaluation conducted as economically as possible?
PROPRIETY (P)
1. Rights of Human Subjects (S, T, A, AIT)

A. To what extent were the rights and welfare of those who participated in the evaluation respected and protected?

B. To what extent were legal, ethical, and common sense considerations followed?

2. Conflicts of Interest (T, A, AIT)

A. To what extent were conflicts of interest in the evaluation effort avoided? If conflicts of interest arose, how were they dealt with so that the evaluation process and results were not compromised?

3. Balanced Reporting (AIT)

A. To what extent was the evaluation complete and fair in its presentation of strengths and weaknesses? Were both negative and positive aspects of the curriculum evaluation reported? Give examples.

B. To what extent were evaluation findings that might prove embarrassing to partisan interest groups omitted from formative evaluation findings? In other words, what strengths and weaknesses were manipulated to please partisan interest groups?
ACCURACY (A)

1. Clearly Identified Object (AIT)

A. To what extent was the object of the formative evaluation clearly identified?

B. Given project constraints, to what extent was the object of the evaluation described realistically?

2. Purposes and Procedures Explicated (S, T, A, AIT)

A. To what extent were the purposes and procedures of the formative evaluation understood by you?

B. Were you clear on the objectives of the evaluation?

C. To what extent were procedures on how information was collected, organized, analyzed, and reported clearly delineated?

D. If the evaluation were to be replicated, were purposes and procedures described in sufficient detail so that other evaluators could conduct a similar evaluation in other settings? If not, what do you suggest for improving the purposes and procedures?

DEVELOPER IMPLEMENTATION (DI)

1. Preparation (S, T, A, AIT)

A. To what extent were you (S, T, A, AIT) prepared for the formative evaluation?

B. To what extent were you (teachers) prepared to implement the evaluation instruments?
REFERENCES
Abedor, A. "Development and Validation of a Model Explicating the Formative Evaluation Process for Multi-Media Self-Instructional Learning Systems." Ph.D. diss., Michigan State University, 1972.

Baker, E., and Alkin, M. "Formative Evaluation of Instructional Developments." Audiovisual Communication Review 21 (1973).

Bastian, R.; Edward, K.; Medsker, K.; and Schimmel, B. "Formative Evaluation in Industry." Performance and Instruction Journal (June 1983): 20-22.

Carey, J. "A Fair Return on Your Formative Evaluation Dollar." Tallahassee: Florida State University, 1978. Duplicated manuscript.

Center for Occupational Research and Development. "An Implementation Handbook for Principles of Technology (Provisional Version)." Waco, TX: CORD, 1 March 1985. Unpublished manuscript.

Dick, W., and Carey, L. The Systematic Design of Instruction. Glenview: Scott, Foresman and Company, 1978.

________. The Systematic Design of Instruction. Glenview: Scott, Foresman and Company, 1985.

Joint Committee on Standards for Educational Evaluation. Standards for Evaluation of Educational Programs, Projects, and Materials. New York: McGraw-Hill Book Company, 1981.

Kandaswamy, S.; Stolovitch, H.; and Thiagarajan, S. "Learner Verification and Revision: An Experimental Comparison of Two Methods." Audiovisual Communication Review 24 (1976).

Lowe, A.; Thurston, W.; and Brown, S. "Clinical Approach to Formative Evaluation." Performance and Instruction Journal 22 (June 1983): 8-10.

Montague, W.; Ellis, J.; and Wulfeck, W. "Instructional Quality Inventory." Performance and Instruction Journal 22 (June 1983): 11-14.

Provus, M. "Evaluation of Ongoing Programs in the Public Schools." In Educational Evaluation: New Roles, New Means (Part II), edited by R. W. Tyler. Chicago: National Society for the Study of Education, 1969.
Sanders, J., and Cunningham, D. "A Structure for Formative Evaluation in Product Development." Review of Educational Research 43 (1973): 217-236.

Sarapin, M. "Formative Evaluation: A Model for Instructional Material Development and Revision." Journal of Industrial Teacher Education 18 (1981): 3-13.

Scriven, M. "The Methodology of Evaluation." In Perspectives of Curriculum Evaluation: AERA Monograph Series on Curriculum Evaluation, edited by R. E. Stake. Chicago: Rand McNally, 1967.

Stake, R. Evaluating the Arts in Education: A Responsive Approach. Columbus, OH: Charles E. Merrill, 1975.

Stufflebeam, D. "The Use and Abuse of Evaluation in Title III." Theory Into Practice 6 (1967): 126-133.

________. "Evaluation for Decision Making." In Improving Educational Assessment and an Inventory of Measures of Affective Behavior, edited by W. H. Beatty. Washington, DC: National Education Association, 1969.

Stufflebeam, D.; Foley, W.; Gephart, W.; Guba, E.; Hammond, R.; Merriman, H.; and Provus, M. Educational Evaluation and Decision Making. New York: Peacock, 1971.

Wager, J. "One-to-One and Small Group Formative Evaluation." Performance and Instruction Journal 22 (June 1983): 5-7.
U.S. Government Printing Office: 1986 647-530/30012