Teaching for Fluency with Information Technology: The Role of Feedback in Instructional Design and Student Assessment
Explorations in Instructional Technology
Mark Urban-Lurain
Don Weinshank
October 27, 2000
www.cse.msu.edu/~cse101
Overview
• Context: Teaching non-CS majors
• Instructional design
• Uses of technology
• Results
• Implications
Fluency with Information Technology
What does it mean to be a literate college graduate in an information age?
• Information technology is ubiquitous
• "Computer literacy" is associated with training
• Being Fluent with Information Technology (FIT)
(Committee on Information Technology Literacy, 1999)
CSE 101: MSU's introductory course for non-CS majors
Instructional Principles
1. Concepts and principles promote transfer within a domain
  • Necessary for solving new problems
2. "Assessment drives instruction" (Yelon)
  • "Write the final exam first"
3. Move the focus from what is taught to what is learned
  • Student-centered
4. Formative evaluation improves student performance
  • Study – test – restudy – retest
5. Performance assessments evaluate mastery of concepts
  • High inter-rater reliability is critical
6. Mastery-model learning ensures objectives are met
  • What students can do, not what they can say
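Principle 4's study – test – restudy – retest cycle can be sketched as a simple mastery loop. This is a hypothetical illustration only: the attempt limit and the toy "learner" are assumptions, not part of the course design.

```python
def mastery_cycle(take_test, restudy, max_attempts=4):
    """Study-test-restudy-retest: repeat formative testing until the
    learner passes or the attempt budget is exhausted."""
    for attempt in range(1, max_attempts + 1):
        if take_test():          # formative assessment: True == PASS
            return ("PASS", attempt)
        restudy()                # targeted restudy before the retest
    return ("FAIL", max_attempts)

# toy learner whose knowledge grows with each restudy; passes on attempt 3
state = {"knowledge": 0}
result = mastery_cycle(
    take_test=lambda: state["knowledge"] >= 2,
    restudy=lambda: state.__setitem__("knowledge", state["knowledge"] + 1),
)
# result == ("PASS", 3)
```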
Uses of Technology in Instruction
• Delivery of content
  • Television
  • CBI / CBT
  • Web-based
• Communication
  • E-mail
  • Discussion groups
  • Real-time chat
• Feedback and monitoring
  • Formative evaluation
  • Iterative course development and improvement
Course Design & Implementation
[Diagram: Design Phase (Design Inputs and Instructional Goals feed Instructional Design) leading into the Implementation Phase (Incoming Students → Instruction → Assessment → Outcomes)]
Discriminant Analysis
• Multivariate statistical classification procedure
  • Dependent variable: final course grade
  • Independent variables:
    • Incoming student data
    • Classroom / instructional data
    • Assessment performance data
• Each student is classified into the group with the highest probability
  • Evaluate classification accuracy
• Interpret the discriminant functions
  • Independent variable correlations with the functions
  • Similar to interpreting loadings in factor analysis
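The classification step (assign each student to the group with the highest posterior probability) can be sketched as a minimal one-feature Gaussian discriminant in Python. The feature values and grade groups below are invented for illustration; the actual analysis used many independent variables and standard statistical software.

```python
import math

def fit_groups(data):
    """Estimate per-group means and a pooled within-group variance
    from labeled 1-D data ({group_label: [feature values]})."""
    params, pooled_ss, n_total = {}, 0.0, 0
    for label, xs in data.items():
        mean = sum(xs) / len(xs)
        params[label] = (mean, len(xs))
        pooled_ss += sum((x - mean) ** 2 for x in xs)
        n_total += len(xs)
    var = pooled_ss / (n_total - len(data))  # pooled variance
    return params, var

def classify(x, params, var, n_total):
    """Assign x to the group with the highest Gaussian posterior."""
    def score(label):
        mean, n = params[label]
        # log posterior up to a constant: log prior + log likelihood
        return math.log(n / n_total) - (x - mean) ** 2 / (2 * var)
    return max(params, key=score)

# invented example: final grade groups vs. an attendance-like feature
data = {"4.0": [14, 15, 13], "2.0": [6, 7, 5]}
params, var = fit_groups(data)
print(classify(12, params, var, n_total=6))   # closest to the 4.0 group
```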
Skill to Schema Map
[Diagram: map linking individual skills to underlying schemas: footnote, modify style, Web format, private folder, new SS, charts, functions, path, find application, TOC, URL, public folder, update SS, Boolean search, link, Web search, find/rename file, computer specs.]
SIRS: Course, TA, ATA
• End-of-semester student survey about the course
• Three factors:
  • "Fairness"
  • Student preparation and participation
  • Course resources
• SIRS for Lead TA: one factor
• SIRS for Assistant TA: one factor
Fairness Factor
35.3% of variance on this factor accounted for by:
• Final grade in course
• TA SIRS
• Number of BT attempts
• ATA SIRS
• Cumulative GPA
• ACT Mathematics
• Computer specifications
• Incoming computer communication experience
Participation Factor
19.8% of variance on this factor accounted for by:
• TA SIRS
• Attendance
• ATA SIRS
• ACT Social Science
• Number of BT attempts
• Create chart
• Incoming knowledge of computer terms
• ACT Mathematics
• Find / rename file
• Path
• TOC
NO course grade
Course Resources Factor
11.3% of variance on this factor accounted for by:
• TA SIRS
• Attendance
• ATA SIRS
• Extension task: backgrounds
• Web pages in Web folder
• Number of BT attempts
NO course grade
Lead TA SIRS
27.8% of variance on Lead TA SIRS accounted for by:
• Fairness factor
• Preparation and participation factor
• TA experience
• Course resources factor
• ATA SIRS
• Attendance
• Private folder
• Extension task: Excel function
• Number of BT attempts
NO course grade
Assistant TA SIRS
13.4% of variance on ATA SIRS accounted for by:
• Fairness factor
• Preparation and participation factor
• Student e-mail factor
• TA SIRS
• Course resources factor
• Attendance
• Path
• TA experience
NO course grade
Technology in Instructional Design and Student Assessment
• Data-rich instructional system
  • Detailed information about each student
  • CQI for all aspects of the instructional system
• Performance-based assessments
  • Labor intensive
  • Inter-rater reliability
  • Analyzing student conceptual frameworks
• Intervention strategies
  • Early identification
  • Targeted for schematic structures
Implications
• The instructional design process can be used in any discipline
  • Accreditation Board for Engineering and Technology
  • CQI
• Distance education
  • Demonstrates instruction at the needed scale
  • On-line assessments
  • How to provide active, constructivist learning on line?
Questions?
CSE 101 Web site
www.cse.msu.edu/~cse101
Instructional design detail slides
Design Inputs
• Literature
  • CS0
  • Learning
  • Assessment
• Client department needs
• Design team experience
Instructional Goals
• FITness
• Problem solving
• Transfer
• Retention
• No programming
Deductive Instruction
[Diagram: Concept → Skill 1, Skill 2, Skill 3 → Schema 1, Schema 2, Schema 3]
Inductive Instruction
[Diagram: Skill 1, Skill 2, Skill 3 → Concept]
Instructional Design
• 1950 students / semester
• Multiple "tracks"
  • Common first half
  • Diverge for focal problems
• All lab-based classes
  • 65 sections
  • No lectures
• Problem-based, collaborative learning
• Performance-based assessments
Incoming Students
• Undergraduates in non-technical majors
• GPA
• ACT scores
• Class standing
• Major
• Gender
• Ethnicity
• Computing experience
Instruction
• Classroom staff
  • Lead Teaching Assistant
  • Assistant Teaching Assistant
• Lesson plans
• Problem-based learning
  • Series of exercises
• Homework
• Instructional resources
  • Web, textbook
Assessment
• Performance-based
• Modified mastery model
• Bridge Tasks
  • Determine grade through 3.0
  • Formative
  • Summative
• Final project
  • May increase 3.0 to 3.5 or 4.0
Bridge Task Competencies in CSE 101
• 1.0: E-mail; Web; distributed network file systems; Help
• 1.5: Bibliographic databases; creating Web pages
• 2.0: Advanced word processing
• 2.5: Spreadsheets (functions, charts); hardware; software
• 3.0 Track A: Advanced Web site creation; Java applets; object embedding
• 3.0 Track C: Advanced spreadsheets; importing; data analysis; add-on tools
• 3.0 Track D: Advanced spreadsheets; fiscal analysis; add-on tools
Bridge Task Detail Drilldown 1
• Each Bridge Task (BT) has dimensions (M) that define the skills and concepts being evaluated.
• Within each dimension are some number of instances (n) of text describing tasks for that dimension.
• A bridge task consists of one randomly selected instance from each dimension for that bridge task.
[Diagram: BT database with Dim 1, Dim 2, …, Dim M, each holding instances i, i+1, i+2, …, i+n]
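The random assembly of a bridge task can be sketched in a few lines of Python. The dimension names and instance texts below are hypothetical stand-ins, not drawn from the actual BT database.

```python
import random

# hypothetical BT database: each dimension maps to its pool of task instances
bt_database = {
    "Dim 1": ["format a footnote", "modify a style", "save in Web format"],
    "Dim 2": ["create a chart", "update a spreadsheet function"],
    "Dim M": ["perform a Boolean search", "build a table of contents"],
}

def assemble_bridge_task(db, rng=random):
    """Build one bridge task: one randomly selected instance per dimension."""
    return {dim: rng.choice(instances) for dim, instances in db.items()}

# seeding the RNG makes each student's generated task reproducible
task = assemble_bridge_task(bt_database, random.Random(42))
```

Because each student receives an independently randomized combination, two students sitting side by side are unlikely to see the same task text.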
Creating Assessments
[Diagram: for each instance selected from the BT database (Dim 1 … Dim M, instances i … i+n), a matching set of evaluation criteria (criteria i … i+n) is defined; the student evaluation on each criterion is PASS or FAIL]
Bridge Task Detail Drilldown 2
Delivering Assessments
[Diagram: the student enters a Pilot ID (PID/PW); the Web server submits a query against the Student Records database and requests a new BT from the BT database; one instance is randomly selected from each of the M dimensions for the desired BT; the text is assembled (e.g., Dim 1 (i+1), Dim 2 (i+n), Dim M (i)); the Web server returns the BT Web page to the individual student]
Evaluating Assessments
[Diagram: grader queuing creates a query against the Student Records database and requests criteria from the BT database; criteria are provided for the instances used to construct the student's BT; the grader receives the individual student's BT checklist, evaluates each criterion as PASS or FAIL, and the PASS/FAIL result for each criterion is recorded in the Student Records database]
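Under the mastery model, a bridge task passes only when every criterion on the grader's checklist passes. A minimal sketch of that decision rule (the dimension and criterion names, and the grader's results, are hypothetical):

```python
def grade_bridge_task(checklist):
    """Overall PASS/FAIL: mastery requires every criterion to PASS."""
    return "PASS" if all(checklist.values()) else "FAIL"

# hypothetical grader checklist for one student's bridge task
checklist = {
    ("Dim 1", "criterion i"): True,
    ("Dim 1", "criterion i+1"): True,
    ("Dim 2", "criterion i"): False,   # one miss fails the whole task
}
print(grade_bridge_task(checklist))    # FAIL
```

Recording each criterion's PASS/FAIL separately (rather than only the overall result) is what makes the later factor and discriminant analyses possible.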
Outcomes
• Student final grades
• SIRS
• TA feedback
• Oversight Committee: Associate Deans
• Client department feedback