LITERACY CLINICAL TEACHER PREPARATION THAT IS TRANSFORMATIVE
Literacy Research Association Dallas, TX
December 4, 2013
FOUR DISTINCT RESEARCH STUDIES
• Transfer & Transformation of Teachers in Clinic: Longitudinal review of the 30 cases across 6 years; collective case study approach
• Video for Assessment: Use of video clips to analyze students’ reading performance by experts, graduate students, and novices; formative design study of assessment protocols & rubrics
• Video for Teacher Reflection: Analysis of video tasks across sites; design-based in Year 1, cross-case analysis this year
• iPad Use in Clinics: Mixed methods (qualitative & quantitative) across five sites
SUMMARIZING ACROSS SITES
• Teachers take what they learn in Reading Clinic into their classrooms/schools.
• Analyzing videos of children’s reading, with guidance from rubrics, is helpful in assessing teachers’ knowledge of literacy processes.
• Using video with specific directions, debriefing, and collaborative inquiry deepens teachers’ reflections.
• Technology use by teachers is becoming more pedagogically powerful in Reading Clinics.
LITERACY CLINIC CASES
Evan Ortlieb, Monash University (Australia)
Julie Gray, University of Virginia (Virginia)
Tammy Milby, University of Richmond (Virginia)
Barbara Laster, Towson University (Maryland)
Stephen Sargeant, Northeastern State University (Oklahoma)
HISTORY OF THE CASES PROJECT
2006-2007: Interview study of 28 graduates; identified 5 areas of clinic transfer (instruction, assessment, coaching, leadership, technology)
2009-2010: In-depth interviews of nine clinic graduates
2010-2011: Added nine additional graduates, including two new sites; new projects based on transfer/transformation findings
2011-2012: Added five additional graduates; continued work on “Transfer” and “Transformation”; research team examined habits of mind, clinical experiences that matter & recommendations, and disjunctures
2012-2013: Three additional cases added; review of all 30 cases in their entirety
PURPOSE/RATIONALE
Literacy Clinic: Research to practice
• Explore ways in which clinic/lab graduates transfer clinic/lab practices to schools
• Explore ways in which graduates take on literacy leadership roles in schools
• Understand how the clinic/literacy lab experience supports literacy leadership
• Investigate current instructional & assessment practices transferred, including national trends
• Understand the clinic/lab role in multiple paths to leadership
CONTEXT & THEORY
• Roles of literacy professionals changing (Bean et al., 2002)
• Coaching and leadership in the forefront (Walpole & McKenna, 2004)
• Little research on preparation of literacy professionals (Anders et al., 2000)
• Growing condemnation of teacher preparation (Darling-Hammond, 2000, 2006; Duncan, 2009)
• Leadership is a key component of educational reform (Middlebrooks, 2004)
• Teachers need support to navigate mandates, enhance skills as literacy leaders/coaches, & reflect on best practices (Ortlieb & Cheeks, 2013)
• Research needed on literacy professional preparation that leads to both effective teaching and effective leadership
• Training vs. teaching (Hoffman & Pearson, 2000): guided practice opportunities are essential
RESEARCH QUESTIONS
1. What transfers from the literacy lab to educational contexts?
2. How have trends, leadership, and disjunctures changed across time?
METHODOLOGY
Identical methodology across new & existing sites: Maryland (1 site), Virginia (2 sites)
Each researcher followed a 3-phase process:
• Phase 1: Initial screening interview; collection of an artifact to represent practice (graduate chosen); screening observation
• Phase 2: In-depth interview; targeted observation
• Phase 3: Follow-up/retrospective interview
DATA ANALYSIS
3 phases using the Constant Comparative Method (Glaser & Strauss, 1967; Strauss & Corbin, 1990)
• Phase 1: Identified instances where graduates focused on (a) aspects of clinic/lab that supported their development (Clinic Experience) and (b) current practices in classrooms that drew from participation in clinic (Transfer, Transformation)
• Phase 2: Grouped instances from Phase 1 into like categories within clinic experience and transfer; created codes for categories
• Phase 3: Confirmed/reconfirmed categories (Miles & Huberman, 1994); collapsed into broader themes
• Member checking
2012-2013 FINDINGS
• Text Selection/Text Types
• Program Fidelity versus Student Needs
• Beginning of Common Core Implementation
• Assessments
• Instructional Insights
• Confidence
• Flexibility & Masterful Reflection
PARTICIPANT QUOTES: Amy, Reading Specialist, Maryland
They took Bud, Not Buddy—which is a 6th grade text—and bumped it down to 4th grade. It seems as though they are just taking what was a higher grade curriculum and putting it down to a lower grade—and then they call that “rigor.”
My idea is to have our Guided Reading as the heart of our reading program here. Then, we will have accessible texts for all students on their instructional level. Because these CCSS texts are very frustrating for these students, so how are they progressing if they are frustrated?
PARTICIPANT QUOTES: Paula, Classroom Teacher, Grade 2, Virginia
As a beginning teacher last year, I felt totally overwhelmed. School let out at 2:00pm, but I never left before 6:00pm. Our school was on the “warned list” due to our performance and someone from outside [the district representative] was always coming in. I survived because of collaboration. There was this revolving door of new initiatives all the time. After the tutoring class, I knew I needed support from others who ‘understood’ and I reached out to the Title 1 Teacher & Reading Specialist.
For our curriculum, we implemented a balanced approach to literacy. We do whole group, small groups (guided reading & word study), word study and writing. I felt comfy with this format due to the approach we learned in clinic.
PARTICIPANT QUOTES: Miranda, Reading Specialist, Virginia
I decided to leave the classroom and move into a leadership role because of the mentoring and confidence I gained [from clinic]. We learned every type of assessment and intervention… Tutoring students helped me look at reading with diverse learners [Lengthy discussion naming different assessments learned]. It helps me to know how to do more rich assessments with students now.
I was always very interested in ESL students… our class allowed me to work one-on-one tutoring a student whose home language is Burmese and I also worked with a student with a disability. Both clinic experiences prepared me for how to work with small groups and prepared me for the variety of needs and challenges I face now as a Reading Specialist in this urban setting. After the clinic, I decided to go ahead and complete my ESL endorsement.
LONGITUDINAL REVIEW OF THE 30 CASES ACROSS 6 YEARS: COLLECTIVE CASE STUDY APPROACH
Approach:
1. Seek IRB approval
2. Define design (Merriam, 1988; Miles & Huberman, 1994; Barone, 2004):
• Particularistic: Cases are all focused on ‘Literacy Clinic’ transfer categories determined through previous research* (instruction, assessment, coaching, leadership, technology)
• Descriptive: Triangulation provides rich description of common trends
• Heuristic: Continued study of existing data will enrich understanding of what transfers from clinic across time
• Inductive: Data drives the understandings which are emerging
3. Examine existing “cases” data & findings to complete an in-depth analysis of trends (categories, coding); seek cases containing information-rich informants
4. Check for accuracy & misconceptions by consulting with key informants and/or researchers
*Derived from interviews of 28 additional clinic graduates
SUMMARIZING ACROSS SITES
Does clinic make a difference compared to other coursework approaches? A collective review of clinic graduates reports that the literacy clinic helped them in similar ways:
• Student-centered, differentiated instruction
• More of a focus on strengths & needs
• Variety of “assessment practices” incorporated in the classroom
• Collaboration & sharing with others
• Coaching/leadership opportunities
• Working through disjuncture/policy changes/mandates successfully, or deciding to change paths
• Advocacy for & implementation of research-based practices
• Developing deep and thoughtful beliefs about literacy
• More “masterful” teaching approach, habit of mind (reflective practice, asking why, seeking excellence)
IN CONCLUSION
“[Clinic] really puts it all together when you focus on one student—use data and plan intervention that matches—a real eye opener. Now when I analyze data I can see the big picture…Writing the practicum report actually…showed how to triangulate the data, helped me write a more cohesive report, helped me look at individuals as well as the classroom. It helped me recognize trends, organize—prioritize instruction and made me more confident so when I write a report or meet with a teacher, I really know what I am talking about and can now explain it in ‘real words’ and get down to what learning needs to take place.”
VIDEO PROTOCOLS FOR THE ASSESSMENT OF TEACHER KNOWLEDGE AND SKILL IN LITERACY ASSESSMENT AND INSTRUCTION
TEAM MEMBERS
Stephanie McAndrews, Southern Illinois University Edwardsville
Shadrack Msengi, Southern Illinois University Edwardsville
Tammy Ryan, Jacksonville University
Nancy Stevens, University of Wisconsin-Whitewater
Lee Ann Tysseling, Boise State University
JoAnne Vazzano, Northeastern Illinois University
PURPOSE/RATIONALE
1. Develop an authentic assessment to measure reading teacher/specialist candidates’ abilities in assessment and planning instruction
2. Provide a tool aligned to instructional pedagogy
3. Enhance multiple-choice format state certification exams
VIDEO FOR ASSESSING TEACHER KNOWLEDGE AND SKILLS: RESEARCH QUESTIONS
1. How can literacy faculty develop a protocol that can be used across the nation, when multiple forms and analyses of assessment are used?
2. How can we authentically assess teacher/specialist candidates’ ability to analyze student reading?
3. How can we develop a reliable and valid rubric as a tool to evaluate teacher knowledge and skill in assessing videos of a student’s reading?
EXPERIMENTAL DESIGN: FORMATIVE RESEARCH
• Intended to improve instructional theories, models, practices, and processes (Bradley & Reinking, 2011; Brown, 1992)
• Follows the characteristics of Reinking and Bradley’s (2007) formative design:
• Establish educational goals based in theory
• Implement an intervention (the protocol) to achieve the goals
• Collect data to identify factors enhancing or inhibiting achievement of the goals
• Modify the intervention based on unanticipated factors
• Note how the intervention changed
• Determine positive and negative unanticipated effects of the intervention (to be determined)
METHODOLOGY: VIDEO DEVELOPMENT
Video protocol instructions:
1. Authentic text
2. Introduce the passage
3. Student reads the passage
4. Request oral retelling
5. Probe hesitant reteller
6. No published assessment text
HUNGRY ANIMALS TEXT
Reading Recovery Level 11 (middle first grade)
GRADUATE STUDENT VIDEO RESPONSE
DECLARATION OF INDEPENDENCE: AMAZING DAYS OF ABBY HAYES
4.3 GE; Lexile 510; Guided Reading Level Q
METHODOLOGY: VIDEO RESPONSE PROTOCOL
METHODOLOGY: MIXED METHODS (QUALITATIVE)
• Rubric: researcher developed; researcher “answer keys” completed with comparison of all six independent responses
• Benchmarking: all six researchers used the rubric on one set of teacher papers; rubric adjusted to reflect experience in application to the teacher papers
• Rubric “tested” by partners on one set of papers; rubric again revised
• Revision: teams of two researchers used the rubric to score each teacher’s paper independently
• Discrepancies resolved
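The pair-scoring and discrepancy-resolution steps above imply an agreement check between raters. As a hedged sketch (not the study’s actual analysis, and with invented rater scores), Cohen’s kappa gives chance-corrected agreement between two raters applying the 1-5 rubric levels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same set of papers."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreements.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal distribution.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric levels from two raters on ten candidate papers.
a = [5, 4, 4, 3, 2, 5, 3, 3, 1, 4]
b = [5, 4, 3, 3, 2, 5, 3, 2, 1, 4]
print(round(cohens_kappa(a, b), 3))  # prints 0.744
```

A kappa near 0.7-0.8 is commonly read as substantial agreement; the slides do not report which reliability statistic the team used, so this is only one plausible choice.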
METHODOLOGY, CONTINUED: MIXED METHODS (QUANTITATIVE)
• Descriptive data presented by class sets and combined data set
• Correlations of subscores and totals in combined data set
• Data consolidation
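The quantitative steps above (descriptive data by class set, correlations of subscores with totals) can be sketched as follows; the nine candidate rows are taken from the beginning-clinical-course case reported later in this section, and the column labels follow the rubric categories:

```python
import numpy as np

# Columns: reading level, strengths, needs, recommendation, total score.
# Nine candidate rows from the beginning-clinical-course data set.
scores = np.array([
    [1, 2, 3, 2,  8],
    [1, 3, 2, 2,  8],
    [1, 3, 3, 2,  9],
    [1, 4, 3, 2, 10],
    [3, 3, 2, 2, 10],
    [2, 3, 3, 2, 10],
    [3, 3, 2, 2, 10],
    [3, 3, 4, 2, 12],
    [3, 3, 4, 3, 13],
])

subscales = ["reading level", "strengths", "needs", "recommendation"]
totals = scores[:, -1]

print("mean total:", round(totals.mean(), 2))
for i, name in enumerate(subscales):
    # Pearson correlation of each subscore with the total score.
    r = np.corrcoef(scores[:, i], totals)[0, 1]
    print(f"{name}: r = {r:.2f}")
```

Pearson’s r via `np.corrcoef` is one plausible choice; the slides do not specify which correlation statistic the team computed.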
PARTICIPANTS (CONTEXTS OF SITES)
• Southeastern United States (1 case)
• Administered at the beginning of the semester
• Undergraduate students in senior year, in the 4/5 sequence of reading endorsement courses
• Central United States (2 cases)
• Administered at the beginning of the semester
• Graduate students at the beginning of the program, some with no knowledge of IRIs and running records
• Graduate students during the first clinical course
• Western United States (4 cases)
• Administered at the beginning and end of the term (pre- and post-assessments)
• Undergraduate students in a Literacy Lab course
• Had completed two other literacy courses, including IRIs and running records
CURRENT PROTOCOL RUBRIC
Levels: Proficient (5), Near Proficient (4), Acceptable (3), Limited (2), Inadequate (1)

Reading Level
• 5: Instructional level with valid, strongly supported, appropriate reasons based on word accuracy and comprehension
• 4: Instructional level with appropriate reason for word accuracy and comprehension
• 3: Instructional level with valid but minimal or vague reason for word accuracy or comprehension
• 2: Instructional level with no reason
• 1: Independent or frustration level, or incorrect, unsupportable reasons

Strengths
• 5: Oral reading, including two or more of the following: word accuracy, self-correction, multiple cueing systems, self-monitoring, with supportive explanation; AND comprehension, including two or more of the following: literal meaning, background knowledge, some details, story sequence, with supportive explanation
• 4: Oral reading, including one of the following: word accuracy, self-correction, multiple cueing systems, self-monitoring, with supportive explanation; AND comprehension, including one of the following: literal meaning, background knowledge, some details, retelling, story sequence, main idea, with supportive explanation
• 3: Oral reading and comprehension mentioned; or oral reading or comprehension with an appropriate example or elaborated explanation
• 2: Oral reading or comprehension mentioned
• 1: No mention, or incorrect statements of oral reading or comprehension

Needs
• 5: Oral reading, including two or more of the following: slower pace, attention to phrasing, expression, more strategies for word identification and self-correction, multi-syllabic words, with supportive explanation; AND comprehension, including two of the following: literal, non-literal, inferential, or deeper meaning, with supportive explanation
• 4: Oral reading, including one of the following: slower pace, attention to phrasing, expression, more strategies for word identification and self-correction, multi-syllabic words, with supportive explanation; AND comprehension, including one of the following: literal, non-literal, inferential, or deeper meaning, with supportive explanation
• 3: Oral reading and comprehension mentioned; or oral reading or comprehension with an appropriate example or elaborated explanation
• 2: Oral reading or comprehension mentioned
• 1: No mention, or incorrect statements of oral reading or comprehension

Instructional Recommendations
• 5: Elaborated on two specific, appropriate oral reading AND two specific, appropriate comprehension strategies explicitly related to actual needs
• 4: A specific, appropriate oral reading strategy AND a specific, appropriate comprehension strategy explicitly related to actual needs are described
• 3: An oral reading AND a comprehension strategy related to actual needs are identified; or an oral reading OR comprehension strategy explicitly related to actual needs is described
• 2: An oral reading OR a comprehension strategy related to actual needs is identified
• 1: No strategy mentioned, incorrect statements of oral reading fluency or comprehension strategy, or not connected to actual needs
DATA CONSOLIDATION
[Histogram of total scores for the combined data set (N = 106); scores range from 4 to 16]
DATA ANALYSIS - CASE 1: SOUTHEASTERN UNITED STATES
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
9   1  1  1  2   5
15  3  1  1  1   6
1   1  3  2  1   7
2   1  3  2  1   7
8   1  3  2  1   7
10  1  3  2  1   7
3   5  3 (partial row in source)  8
14  1  3  1  3   8
5   3  3  2  1   9
11  1  2  3  3   9
12  3  2  2  2   9
6   3  3  3  1  10
13  2  4  3  1  10
16  1  3  4  2  10
7   4  3  3  2  12
4   3  5  3  4  15
[Histogram: number of teachers by total score]
DATA ANALYSIS - CASE 2: CENTRAL HA, EARLY PROGRAM
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
1  4  0  3 (one value missing in source)  8
2  3  4  0  2   9
3  1  4  3  2  10
4  1  4  4  2  11
5  3  3  3  2  11
6  1  2  3  1   7
7  2  4  3  2  11
8  1  3  3  1   8
9  1  4  3  0   8
[Histogram: number of candidates by total score]
DATA ANALYSIS - CASE 3: CENTRAL HA, BEGINNING CLINICAL
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
4  1  2  3  2   8
6  1  3  2  2   8
7  1  3  3  2   9
1  1  4  3  2  10
2  3  3  2  2  10
3  2  3  3  2  10
5  3  3  2  2  10
8  3  3  4  2  12
9  3  3  4  3  13
[Histogram: number of candidates by total score]
DATA ANALYSIS - CASE 4: WESTERN AH, PRE
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
21  1  1  1  1   4
23  1  1  1  1   4
16  1  2  1  1   5
1   1  2  2  1   6
9   3  1  1  1   6
8   1  1  3  3   8
22  1  4  2  1   8
3   1  3  3  2   9
13  1  3  3  2   9
15  1  3  3  2   9
4   1  3  3  3  10
10  3  3  3  1  10
11  1  3  3  3  10
17  4  3  2  1  10
19  1  3  3  3  10
20  1  3  3  3  10
2   5  1  3  3  12
18  3  3  3  3  12
5   4  4  4  1  13
7   3  4  3  3  13
14  4  3  3  3  13
12  4  4  3  3  14
6   5  4  4  3  16
[Histogram: number of candidates by total score]
DATA ANALYSIS - CASE 5: WESTERN AH, POST
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
6   1  1  3  2   7
8   1  2  3  2   8
9   3  1  3  1   8
16  1  2  3  2   8
3   4  2  2  1   9
10  1  3  3  2   9
19  1  4  2  2   9
18  4  1  3  2  10
2   1  1  4  5  11
14  3  3  2  3  11
1   3  3  3  3  12
20  5  4  3  2  14
5   4  4  4  4  16
[Histogram: number of candidates by total score]
DATA ANALYSIS - CASE 6: WESTERN HA, PRE
[Histogram: number of candidates by total score; scores range from 4 to 16]
DATA ANALYSIS - CASE 7: WESTERN AH, POST
Candidate  ReadLevel  Strengths  Needs  Recommend  Total
1   1  1  2  1   5
2   2  3  3  3  11
3   1  1  3  1   6
6   4  1  3  4  12
8   2  4  3  1  10
9   3  3  1  1   8
14  1  3  3  1   8
16  4  3  2  4  13
18  4  3  2  3  12
19  4  2  3  3  12
20  4  1  5  4  14
23  3  2  3  3  11
[Histogram: number of candidates by total score]
CONCLUSIONS
• Creating a reliable and useful rubric requires a commitment of significant time: inter-rater reliability, trial applications to candidate responses, multiple revisions
• Individual rubrics must be created for each video sample
• Current data resulted in “normal distributions” of scores
• Given this sort of development, it may be possible to take this approach to assessment “to scale”: latent semantic analysis could be used for initial machine scoring; readings of individual protocols may be necessary
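The latent semantic analysis suggestion above can be illustrated with a minimal bag-of-words SVD sketch. Everything here is hypothetical: the toy “answer key” and candidate responses are invented, raw term counts stand in for tf-idf weighting, and the latent space is only 2-dimensional; a production scorer would need a real response corpus:

```python
import numpy as np

def lsa_similarity(docs, k=2):
    """Rank documents by cosine similarity to docs[0] in a k-dim LSA space."""
    # Simple term-document count matrix (bag of words).
    vocab = sorted({w for d in docs for w in d.lower().split()})
    td = np.array([[d.lower().split().count(w) for d in docs] for w in vocab],
                  dtype=float)
    # Truncated SVD: keep the top-k latent dimensions.
    u, s, vt = np.linalg.svd(td, full_matrices=False)
    d = (np.diag(s[:k]) @ vt[:k]).T          # document vectors, shape (n_docs, k)
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    return d @ d[0]                          # cosine similarity to the key

# Hypothetical expert "answer key" plus two candidate responses.
key = "instructional level strengths in word accuracy needs in comprehension"
strong = "instructional level strong word accuracy comprehension needs support"
weak = "the student read the story aloud"
sims = lsa_similarity([key, strong, weak])
print(sims[1] > sims[2])  # the stronger response sits closer to the key
```

Real-world machine scoring would still need the human readings of individual protocols the slide mentions, if only to calibrate and audit the automated ranking.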
NEXT STEPS
• A performance task of this nature is very promising
• Capturing good-quality videos requires careful planning and recording
• Collect 4 new videos with the revised protocol (primary, intermediate, middle, and high school)
• Develop specific rubrics for each
• Administer it as a pre-test and post-test for undergraduate and graduate students
REFERENCES
Bradley, B., & Reinking, D. (2011). Revisiting the connection between research and practice using formative and design experiments. In N. Duke & M. Mallette (Eds.), Literacy research methodologies (2nd ed., pp. 188-212). New York, NY: Guilford Press.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141-178.
Creswell, J. W., & Plano Clark, V. (2011). Designing and conducting mixed methods research (3rd ed.). Thousand Oaks, CA: Sage.
Greene, J., Caracelli, V., & Graham, W. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.
Onwuegbuzie, A., & Mallette, M. (2011). Mixed research in literacy research. In N. Duke & M. Mallette (Eds.), Literacy research methodologies (2nd ed.). New York, NY: Guilford Press.
Reinking, D., & Bradley, B. A. (2008). On formative and design experiments: Approaches to language and literacy research. New York, NY: Teachers College Press.
VIDEOTAPED ANALYSIS FOR TEACHER REFLECTION
PARTICIPANTS
Terry Deeney, University of Rhode Island
Cheryl Dozier, University at Albany
Zubeyir Coban, University at Albany
Barb Laster, Towson University
Jeanne Cobb, Coastal Carolina University
Marcie Ellerbe, Coastal Carolina University
Debbie Gurvitz, National Louis University
Anne McGill-Franzen, University of Tennessee
Natalia Ward, University of Tennessee
Jennifer Lubke, University of Tennessee
Mary McVee, University at Buffalo
Ashlee Ebert Campbell, University at Buffalo
Liz Tynen, University at Buffalo
Erica Bowers, California State University, Fullerton
THEORETICAL FRAMING
• Dewey (1933): reflection is “active, persistent, and careful consideration of belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (p. 9)
• Schön (1983): analyzing and acting purposefully on a situation with the goal of changing it; a developmental process: reflection on action, reflection in action, reciprocal reflection in action
• Renewed interest in using video as a reflective tool (Grossman, 2005)
VIDEO REFLECTION AS A TOOL FOR IMPROVING TEACHING
• Research suggests video reflection can be an effective strategy to help teachers improve their teaching (Penny & Coe, 2004; Tripp & Rich, 2012)
• In our transfer/transformation study (Deeney et al., 2011), clinic participants named video reflection as a powerful tool for improving their practice
• Studies vary on ways in which reflection was structured (Tripp & Rich, 2012a, 2012b)
• Dimensions of video analysis (Tripp & Rich, 2012, p. 681)
PURPOSE/RATIONALE
• Systematic look at how videotaped reflection tasks are structured in clinics/labs across the country
• Analysis of collegial and instructor feedback across the country
• A beginning analysis of how clinic participants respond to video reflection tasks
RESEARCH QUESTIONS
• In what ways do clinics across the country have teachers use video to reflect on their own teaching?
• What practices do clinics/labs across the country use to facilitate reflection on videos of one’s own teaching?
• How has video-facilitated reflection affected teachers’ understanding of their own teaching?
METHODOLOGY
Seven sites in six states: New York (2 sites); South Carolina (1 site, 2 instructors); Tennessee; Rhode Island; Maryland; Illinois
DATA COLLECTION
Sites posted data to a common Google site:
• Video reflection assignment/task directions
• Video reflection rubrics or scoring guides
Some sites also posted student responses to the video reflection assignment.
DATA ANALYSIS
• Initial qualitative analysis of assignment tasks across sites to identify commonalities and differences: What did clinic/lab reflection tasks ask students to do? What format did reflection take? What kind of feedback/collaboration was involved?
• Compared to dimensions of reflection identified in Tripp & Rich (2012)
• Posted analysis to the Google site; site participants reviewed and revised as needed
• Identified additional information needed for subsequent analysis; participants provided information on the Google site
DIMENSIONS OF CLINIC/LAB VIDEO REFLECTION PROCESS
• Reflection Tasks: How do clinics/labs structure video reflection tasks?
• Guiding Reflection: What types of frameworks do clinics/labs use to guide teacher reflection?
• Individual/Collaborative Reflection: How do clinics/labs facilitate reflection?
• Video Length: What portion of the teaching session is videotaped/analyzed by teachers?
• Number of Reflections: How many video reflections do teachers complete?
• Instructor Feedback: How do clinics/labs engage others in individual teacher reflections?
DIMENSIONS OF REFLECTION TASKS
How do clinics/labs structure video reflection tasks?
• Written reflection format for all
• Pre-preparation: from low level to high level
• What to record: from teacher choice to instructor choice
• Transcription: from none to focused transcription
• Assessment: no formal assessment, scoring guides, or rubric
DIMENSIONS OF GUIDING REFLECTION
What types of analysis do clinics/labs use to guide teacher reflection? From less structured to more structured.
Analysis included: language, engagement, scaffolding, materials used/chosen, responsiveness, time.
GUIDING QUESTIONS/PROMPTS USED ACROSS SITES
• Analyzing language: What types of questions do I ask (open-ended, closed)?
• Does my language/do my prompts encourage student independence?
• What is going well? How do I know (evidence)? What would I change/do differently? Why? What do I need to think about in the future?
• How much time is spent reading? Writing? On word study?
• What types of prompts do I use primarily? Am I over-scaffolding?
• Who is doing the talking?
• How do I honor partially correct responses? Am I encouraging critical thinking?
• Is the text appropriate/at the appropriate level?
• Talk about the level of student engagement/teacher engagement.
• What did I learn about wait time?
• Why was this instruction needed? What explicit feedback am I providing?
• What did I learn from my language choices?
• Talk about the productiveness of your responses and your decision making.
VIDEO LENGTH
What portion of the teaching session is videotaped and analyzed by teachers? Ranges from a short segment, to the whole session, to portions of multiple sessions.
Reflection ranged from a single reflection of a single segment, to multiple reflections of a single segment, to multiple reflections of multiple segments.
NUMBER OF REFLECTIONS
How many video reflections do teachers complete? From 1 to 6 across sites.
INDIVIDUAL/COLLABORATIVE REFLECTION
In what ways do clinics/labs facilitate reflection?
• Frequency: from occasional to always
• Audience: one colleague, several colleagues, or whole class
• Setting: in class/in person, or online (e.g., Google community)
• Mode: oral conversations, written conversations, or oral/written/online
• Viewing guide: none, or student-constructed
• Instructor role: not part of collaborative reflection, or instructor facilitates
FACILITATING COLLEAGUE FEEDBACK
Frequency: from occasional to always. Guiding questions/prompts across sites:
• Three things you noticed about your colleague’s teaching/interactions (positive feedback)
• Three things your colleague could have done differently (suggestions on ways to improve practice, constructive criticism)
• How your colleague’s instruction connects to your own student
• How your colleague’s instruction connects to research/course readings
• Prompts/language your colleague used
• Where should this lesson go from here?
INSTRUCTOR FEEDBACK/RESPONSES
• Do instructors provide feedback? Never, sometimes, or always
• What types of feedback/responses are given by instructors? Oral, written, or both
• Feedback on teaching or reflection, or on teaching and reflection
CONCLUSIONS
Through video reflection, teachers were asked to slow down the process of their teaching and provide written and/or oral feedback on:
• their own teaching practices
• colleagues’ teaching
• student engagement
TEACHER RESPONSES TO VIDEOTAPED TASKS
Through videotaped reflections and viewings, teachers
Described multiple ways to approach teaching and learning (What went well? What would you do differently? Why?)
Named instructional practices and language choices of colleagues
Analyzed the intersections of teacher and student engagement
Identified practices for transfer to classroom or tutoring sessions
NEXT STEPS
• Does transcription improve the quality of teacher reflection? In what ways? How much is too much?
• What are the essential elements in reflective practice? (tensions)
• Considering contexts: online vs. on campus; short summer sessions vs. full semester; placement across the program; clinic transfer
• Reflecting on instructional practices: collegial reflection (PLCs)
VIDEOTAPED ANALYSIS FOR TEACHER REFLECTION
Literacy Research Association Dallas, TX
December 4, 2013
PARTICIPANTS Terry Deeney, University of Rhode Island Cheryl Dozier, University at Albany Zubeyir Coban, University at Albany Barb Laster, Towson University Jeanne Cobb, Coastal Carolina University Marcie Ellerbe, Coastal Carolina University Debbie Gurvitz, National Louis University Anne McGill-Franzen, University of Tennessee Natalia Ward, University of Tennessee Jennifer Lubke, University of Tennessee Mary McVee, University at Buffalo Ashlee Ebert Campbell, University at Buffalo Liz Tynen, University at Buffalo Erica Bowers, University of California at Fullerton
THEORETICAL FRAMING Dewey (1933) reflection is “active, persistent,
and careful consideration of belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it ends” (p. 9)
Schon (1983) analyzing and acting purposefully on a situation with the goal of changing it; developmental process Reflection on action Reflection in action Reciprocal reflection in action
• Renewed interest in using video as a reflective tool (Grossman, 2005)
VIDEO REFLECTION AS A TOOL FOR IMPROVING TEACHING Research suggests video reflection can
be an effective strategy to help teachers improve their teaching (Penny & Coe, 2004; Tripp & Rich, 2012)
In our transfer/transformation study (Deeney et al, 2011) clinic participants named video reflection as a powerful tool for improving their practice
Studies vary on ways in which reflection was structured [Tripp & Rich, 2012(a); 2012 (b)]
Dimensions of Video Analysis (Tripp & Rich, 2012, p. 681)
PURPOSE/RATIONALE Systematic look at how videotaped reflection
tasks are structured in clinics/labs across the country
Analysis of collegial and instructor feedback across the country
A beginning analysis of how clinic participants respond to video reflection tasks
RESEARCH QUESTIONS In what ways do clinics across the country
have teachers use video to reflect on their own teaching?
What practices do clinics/labs across the country use to facilitate reflection on videos of one’s own teaching?
How has video-facilitated reflection affected teachers’ understanding of their own teaching?
METHODOLOGYSeven sites in six states New York (2 sites) South Carolina (1 site, 2 instructors) Tennessee Rhode Island Maryland Illinois
DATA COLLECTION Sites posted data to a common Google site
Video reflection assignment/task directions Video reflection rubrics or scoring guidesSome sites also posted student responses to video reflection assignment
DATA ANALYSIS Initial qualitative analysis of assignment tasks
across sites to identify commonalities and differences What did clinic/lab reflection tasks ask students to do? What format did reflection take? What kind of feedback/collaboration was involved?
Compared to dimensions of reflection identified in Tripp & Rich (2012) Posted analysis to Google site Site participants reviewed and revised as needed
Identified additional information needed for subsequent analysis Participants provided information on Google site
DIMENSIONS OF CLINIC/LAB VIDEO REFLECTION PROCESS
Dimension Questions AnalyzedReflection Tasks How do clinics/labs structure video
reflection tasks?Guiding Reflection What types of frameworks do
clinics/labs use to guide teacher reflection?
Individual/Collaborative Reflection
How do clinic/labs facilitate reflection?
Video Length What portion of the teaching session is videotaped/analyzed by teachers?
Number of Reflections How many video reflections do teachers complete?
Instructor Feedback How do clinic/labs engage others in individual teacher reflections?
DIMENSIONS OF REFLECTION TASKS
Written reflection format for all
Low level of pre-preparation High level
Teacher choice of what to record
Instructor choice
No transcription Focused Transcription
No formal assessment Scoring Guides Rubric
How do clinics/labs structure video reflection tasks?
DIMENSIONS OF GUIDING REFLECTIONWhat types of analysis do clinics/labs use to
guide teacher reflection?
Less structured More structured
Analysis includedLanguageEngagementScaffoldingMaterials used/chosenResponsivenessTime
GUIDING QUESTIONS/PROMPTS USED ACROSS SITES
Analyzing language: What types of questions do I ask (open ended, closed)?
Does my language/Do my prompts encourage student independence?
What is going well? How do I know (evidence)?
What would I change/do differently? Why? What do I need to think about in the future?
How much time is spent reading? writing? on word study?
What types of prompts do I use primarily? Am I over-scaffolding?
Who is doing the talking?
How do I honor partially correct responses?
Am I encouraging critical thinking?
Is the text appropriate/at the appropriate level?
Talk about the level of student engagement/teacher engagement
What did I learn about wait time?
Why was this instruction needed?
What explicit feedback am I providing?
What do I learn from my language choices?
Talk about the productiveness of your responses and your decision making
VIDEO LENGTH
What portion of the teaching session is videotaped and analyzed by teachers?
Short segment <-> Whole session <-> Portions of multiple sessions
Single reflection of a single segment <-> Multiple reflections of a single segment <-> Multiple reflections of multiple segments
NUMBER OF REFLECTIONS
How many video reflections do teachers complete? Across sites, from 1 to 6.
INDIVIDUAL/COLLABORATIVE REFLECTION
In what ways do clinics/labs facilitate reflection?
Occasional <-> Always
One colleague <-> Several colleagues <-> Whole class
In class/in person <-> Online (e.g., Google community)
Oral conversations <-> Written conversations <-> Oral/written/online
No viewing guide <-> Student-constructed viewing guide
Instructor not part of collaborative reflection <-> Instructor facilitates
FACILITATING COLLEAGUE FEEDBACK
Occasional <-> Always
Guiding questions/prompts across sites:
Three things you noticed about your colleague's teaching/interactions (positive feedback)
Three things your colleague could have done differently (suggestions on ways to improve practice/constructive criticism)
How your colleague's instruction connects to your own student
How your colleague's instruction connects to research/course readings
Prompts/language your colleague used
Where should this lesson go from here?
INSTRUCTOR FEEDBACK/RESPONSES
Do instructors provide feedback?
Never Sometimes Always
What types of feedback/responses are given by instructors?
Oral Written Both
Feedback on teaching or reflection Feedback on teaching and reflection
CONCLUSIONS
Through video reflection, teachers were asked to slow down the process of their teaching and provide written and/or oral feedback on
their own teaching practices
colleagues' teaching
student engagement
TEACHER RESPONSES TO VIDEOTAPED TASKS
Through videotaped reflections and viewings, teachers
Described multiple ways to approach teaching and learning (What went well? What would you do differently? Why?)
Named instructional practices and language choices of colleagues
Analyzed the intersections of teacher and student engagement
Identified practices for transfer to classroom or tutoring sessions
NEXT STEPS
Does/In what ways does transcription improve the quality of teacher reflection? How much is too much?
What are the essential elements in reflective practice? (tensions)
Considering contexts: online/on campus; short summer sessions vs. full semester; placement across the program; clinic transfer
Reflecting on instructional practices: collegial reflection (PLCs)
USE OF IPADS IN READING CLINICS
Literacy Research Association Dallas, TX
December 4, 2013
RESEARCHERS
Judith Wilson, University of Nebraska-Lincoln (NE)
Guy Trainer, University of Nebraska-Lincoln (NE)
Lee Ann Tysseling, Boise State University (ID)
Melissa Stinnett, Western Illinois University (IL)
Gilda Martinez-Alba, Towson University (MD)
B. P. Laster, Towson University (MD)
Shelly Huggins, Towson University (MD)
Margie Curwen, Chapman University (CA)
Todd Cherner, Coastal Carolina University (SC)
Mary Applegate, St. Joseph's College (PA)
PURPOSE/RATIONALE
To explore how teachers/tutors, after receiving training, use iPads in multiple reading clinics across the nation, and to understand whether iPads are used for drill-and-practice activities or for more powerful instructional activities.
RESEARCH QUESTIONS
1. How do teachers/tutors in Reading Clinic transform their practice by using iPads?
2. What is the impact of training about iPad applications on teacher use?
3. In what ways do teachers and students in reading clinics use iPads?
RESEARCH DESIGN: MIXED METHODS
Similar data collection across sites in five states:
University of Nebraska (2 different clinics)
Towson (Maryland)
Boise State (Idaho)
Western Illinois
Coastal Carolina (South Carolina)
Identical pre- & post-surveys of teachers/tutors; some observations; collected artifacts (e.g., client reports; sampling during clinic)
DATA ANALYSIS: MIXED METHODS
Quantitative analysis of pre- & post-survey:
Answers given numerical values
Means calculated
ANOVA for determining growth
ES = effect size
Excel sheets that generated graphs for analyzing specific advantages or disadvantages of using iPads
Tallies of which apps were used
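The quantitative steps above (numeric coding of survey answers, means, a pre/post effect size) can be sketched as follows. The data are invented for illustration, and the formula shown is Cohen's d with a pooled standard deviation, which is an assumption: the slides report ES values but do not say which effect-size formula the Excel sheets used.

```python
# Minimal sketch of the quantitative pipeline: Likert answers are given
# numerical values, means are computed, and a pre/post effect size is
# derived. All data below are hypothetical.
from statistics import mean, stdev

# Survey scale: 4 = "I love it", 3 = "I like it", 2 = "It's OK I guess"
pre  = [2, 3, 2, 3, 3, 2, 4, 3, 2, 3]
post = [3, 4, 3, 3, 4, 3, 4, 4, 3, 3]

def cohens_d(a, b):
    """Effect size: mean difference over the pooled sample SD."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(b) - mean(a)) / pooled_sd

print(f"pre mean={mean(pre):.2f}, post mean={mean(post):.2f}")
print(f"effect size d={cohens_d(pre, post):.2f}")
```

With these invented responses the pre mean is 2.70, the post mean is 3.40, and d is about 1.16, which on common benchmarks would count as a large pre/post change.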
DATA ANALYSIS: MIXED METHODS
Qualitative analysis of uses/purposes: coded and analyzed all observations, artifacts, and reports of apps used. Data were analyzed in phases using the constant comparative method (Glaser & Strauss, 1967; Strauss & Corbin, 1990).
Phase #1: Site researchers independently coded the purpose of iPad use in their own data
Phase #2: Shared across researchers using the Google site; re-coded by non-site researchers into categories of use and emerging themes, and checked with site directors
Phase #3: Several researchers focused on macro-level CATEGORIES OF TECHNOLOGY USE, novice-established use, and other emerging themes
Cross-checking (inter-rater) at every stage
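The inter-rater cross-checking in the phases above could be quantified as in this minimal sketch. The codes and ratings are hypothetical, and the slides do not state which agreement statistic was used, so both raw percent agreement and Cohen's kappa (a common chance-corrected choice) are shown.

```python
# Hypothetical inter-rater check: two researchers independently code the
# same iPad-use instances; agreement is summarized two ways.
from collections import Counter

CODES = ["drill", "creation", "assessment", "reward"]  # invented code set

rater_a = ["drill", "drill", "creation", "reward", "drill", "assessment",
           "creation", "drill", "reward", "drill"]
rater_b = ["drill", "creation", "creation", "reward", "drill", "assessment",
           "creation", "drill", "drill", "drill"]

def percent_agreement(a, b):
    """Fraction of instances where both raters assigned the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(a)
    po = percent_agreement(a, b)                      # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[c] * cb[c] for c in CODES) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

print(f"agreement={percent_agreement(rater_a, rater_b):.0%}, "
      f"kappa={cohens_kappa(rater_a, rater_b):.2f}")
```

For these invented ratings the raters agree on 8 of 10 instances (80%), and kappa is about 0.70 after discounting chance agreement.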
FINDINGS: THE IPAD STORY ACROSS SITES Across five Clinics, 51 recommended apps and other
uses of the iPads were formally presented.
Although some professors were using the iPads for the first time themselves, they carefully thought through which apps to model and created themes.
At all Clinics… Critical examination of apps via rubrics, charts, online
review sites, and/or through discussion
A Rubric to Evaluate Apps: http://kathyschrock.net/pdf/ipad_app_rubric.pdf
Places to Research Apps: http://www.appitic.com/ http://www.uen.org/apps4edu/
Searchable App Review Site developed by Todd Cherner
appedreview.org
THE IMPACT OF TRAINING ON THE USE OF APPS
Coastal Carolina – An "App"etizers activity, used every other week, to discuss the pros/cons of different apps and their uses in the classroom; the professor used rubrics to help teachers critique apps.
Towson – Presentations by Advanced Clinicians – six key apps & other uses of the iPads at the beginning of Clinic. Strong correlation between the apps presented & the apps used. Teachers used the iPad for assessment, motivation, instruction.
Western Illinois – An Apple representative provided a demonstration of different apps in week six.
University of Nebraska – iPad themes, such as phonemic awareness, decoding/spelling, books, vocabulary, etc., were created. Teachers were encouraged to use the iPads during their warm up and cool down. Teachers filled out a chart to provide the name of an app reviewed, its description, and how it could be used for instruction.
Overall – Presenters provided a formalized way to help teachers think critically about apps and their uses for instruction.
HOW THE IPADS WERE USED Apps were modeled by the professors or advanced
clinicians; key apps were used at each site A total of 259 apps used by 53 teachers across sites
Teachers/tutors used apps with their clients, and showcased them to each other
Teachers who already had iPads were comfortable using them, especially if they had experience with iPads within their school setting
iPads were used with clients for skill and drill, reward/play, assessment, and for reading and writing instruction
THE STORY: SOME CONCLUSIONS Teachers needed to be willing to take risks, and try apps they
were not familiar with
• Those who were willing to take risks were motivated to continue trying more apps
A few professors commented that they would like to learn about more apps to demonstrate how to match an app with instruction needed
Overall – the professors noted motivation related to using apps for instruction, by both the teachers and the clients, and they are pleased to continue building their knowledge base about existing and upcoming apps to provide meaningful instruction.
QUESTION #1: DISPOSITIONS TOWARDS IPAD USE

IPAD USE OVERALL
[Bar chart: pre vs. post means on a 1-4 scale; 4 = I love it, 3 = I like it, 2 = It's OK I guess]
UNL: N=19, ES=.4
Towson: N=21, ES=.9
Western Illinois: N=17, ES=.3
Coastal Carolina: N=9, ES=1.0
Boise: N=10, ES=-.2
IPAD PERSONAL USE
[Bar chart: pre vs. post means on a 1-4 scale; 4 = I love it, 3 = I like it, 2 = It's OK I guess]
UNL: N=19, ES=.3
Towson: N=21, ES=.8
Western Illinois: N=17, ES=.1
IPAD USE PROFESSIONALLY
[Bar chart: pre vs. post means on a 1-4 scale; 4 = I love it, 3 = I like it, 2 = It's OK I guess]
UNL: N=19, ES=.5
Towson: N=21, ES=1.2
Western Illinois: N=17, ES=.2
IPAD USE IN PRACTICUM
[Bar chart: pre vs. post means on a 1-4 scale; 4 = I love it, 3 = I like it, 2 = It's OK I guess]
UNL: N=19, ES=.4
Towson: N=21, ES=1.3
Western Illinois: N=17, ES=.3
FINDINGS: PRE- AND POST-SURVEYS, QUESTION #1
How confident are you about using the iPad?
ILLINOIS – for personal use
4/15 (27%) felt MORE confident using the iPad by the end of the study.
2/15 (13%) felt LESS confident using the iPad.
10/15 (67%) felt the SAME amount of high confidence throughout the study.
1/15 (7%) felt the same amount of low confidence throughout the study.
FINDINGS: PRE- AND POST-SURVEYS, QUESTION #1
How confident are you about using the iPad?
ILLINOIS – for professional use
3/15 (20%) felt More confident using the iPad by the end of the study.
1/15 (7%) felt LESS confident using the iPad
10/15 (67%) felt the SAME Amount of high confidence throughout the study
2/15 (13%) felt the same amount of low confidence throughout the study.
1/15 (7%) felt this question is not applicable.
FINDINGS: PRE- AND POST-SURVEYS, QUESTION #1
How confident are you about using the iPad?
ILLINOIS – for practicum use
6/15 (40%) felt More confident using the iPad by the end of the study.
1/15 (7%) felt LESS confident using the iPad
8/15 (53%) felt the SAME Amount of high confidence throughout the study
2/15 (13%) felt the same amount of low confidence throughout the study.
QUESTION #2: HIGH POINTS/ADVANTAGES
QUESTIONS #2 & #3: ADVANTAGES/DISADVANTAGES
Dramatic increases from pre to post in the recognition of the pedagogical potential of the iPad and its potential for motivating and engaging students
Decreases in negative observations about familiarity with the physical features of the iPad
Even in the positive observations, less attention was paid to the iPad as a physical tool and greater attention to its broader uses
Q2: HIGH POINTS – COMBINED SITES
Legend: OP = Operation, Mechanics, Physical Features/Positive; NA = Not applicable; PU = Pedagogical Uses or Applications; PA = Personal Uses or Applications; M = User Motivation or Engagement; PR = Professional Uses (non-pedagogical)
[Bar chart: pre vs. post counts of responses by category (OP, NA, PU, PA, M, PR), N=57]
Q2: HIGH POINTS – WESTERN ILLINOIS, PRE/POST
[Bar chart: pre vs. post counts by category (OP, NA, PU, PA, M, PR), N=17]
Q2: HIGH POINTS – TOWSON, PRE/POST
[Bar chart: pre vs. post counts by category (OP, NA, PU, PA, M, PR), N=21]
Q2: HIGH POINTS – COASTAL CAROLINA, PRE/POST
[Bar chart: pre vs. post counts by category (OP, NA, PU, PA, M, PR), N=9]
Q2: HIGH POINTS – BOISE STATE, PRE/POST
[Bar chart: pre vs. post counts by category (OP, NA, PU, PA, M, PR), N=10]
QUESTION #3: LOW POINTS/DISADVANTAGES
FINDINGS Q3: WHAT ARE THE LOW POINTS IN USING IPADS?
LOW POINTS BY CATEGORY
[Bar chart: pre vs. post percentage of responses (0-100%) by category: Device Challenges, Apps-Instruction, Apps-Time]
QUESTION #4 USES OF THE IPAD
ACROSS SITES QUESTION #4: WHAT APPLICATIONS/WEBSITES/PROGRAMS HAVE YOU USED ON THE IPAD? PRE-TUTORING RESPONSES
76 teachers across five locations reported prior use of 126 distinct apps/websites/programs on the iPad
The application/websites/programs were cited as “used” by a teacher 190 times on the survey
The overall mean of apps used per teacher was 2.5; means by location were UNL (1.1), Boise State (1.4), Towson (1.76), Western Illinois (3.82), and Coastal Carolina (5.59)
The most common response was “no experience with the iPad” from 24 teachers.
The next most common responses were: “games” (10), Educreation (10), Facebook (8), Safari (8), email (6), Dictionary (6), iTunes (5), Edmodo (5), Pinterest (4), Popplet (4), Who Am I? (4)
Of those responses, Educreation, Popplet, and Who Am I? would most likely be used with students
ACROSS SITES QUESTION #4: WHAT APPLICATIONS/WEBSITES/PROGRAMS HAVE YOU USED ON THE IPAD? POST-TUTORING RESPONSES
76 teachers across five locations reported use of 229 distinct apps/websites/programs on the iPad
The applications/websites/programs were cited as “used” by a teacher 447 times on the survey
The overall mean of apps used per teacher was 5.88; means by location were Boise State (2.4), Western Illinois (5.18), Towson (5.24), UNL (6.47), and Coastal Carolina (11.33)
The most common responses were: Opposites (23), AudioNote (21), Dictionary (20), iCard Sort (17), Kid Doodle (15), Popplet (15), Who Am I? (11), Chicktionary (10), Educreation (10), Quizlet (8), Camera (7), Edmodo (7), Safari (7), Clock/timer (7)
Teacher descriptions of use show that all of these apps and tools were used with students
ACROSS SITES QUESTION #4: WHAT APPLICATIONS/WEBSITES/PROGRAMS HAVE YOU USED ON THE IPAD? POST-TUTORING RESPONSES
App uses identified by faculty were categorized into Writing, Spelling, Phonics, Reading, Vocabulary, Fluency, & Teacher Support
Writing: 4 writing pads (e.g., Kid Doodle), 2 multipage electronic books (Educreation, Story Bird), 4 support tools (Notes, Google Images, Dragon Dictation, Trading Cards)
Spelling: 1 word-sorting app (iCard Sort), 6 spelling games (e.g., Chicktionary, Word Zombie)
Phonics: 6 games and/or practice apps (e.g., ABC Magic, Phonics Awareness, Phonics TicTacToe)
Reading: 6 writing pads or multipage apps used for comprehension activities/reader's response (Popplet, Educreation, Trading Cards, Graphic Organizer, Idea Sketch, Quizlet); 3 support tools for comprehension (Safari, Google Images, Camera for videotaping story parts); 1 inferencing app (Who Am I?); 1 reading material app (Storia)
ACROSS SITES QUESTION #4: WHAT APPLICATIONS/WEBSITES/PROGRAMS HAVE YOU USED ON THE IPAD? POST-TUTORING RESPONSES (CONT’D)
Vocabulary: 5 engaging-with-words apps (Opposites, Mad Libs, Word to Word, iCard Sort, Word Dynamo), 2 reference tools (Dictionary, Visual Thesaurus), 1 multipage app used as a vocabulary notebook (Moleskin Virtual Notebook)
Fluency: 3 audio recorders (AudioNote, Dragon Dictation, QuickVoice), 1 timer (Clock/timer)
Teacher Support: Edmodo, Drop Box, Power Teacher, 3 Ring, Class Dojo, Facebook
ACROSS SITES QUESTION #4: WHAT APPLICATIONS/WEBSITES/PROGRAMS HAVE YOU USED ON THE IPAD? COMPARING PRE AND POST-TUTORING RESPONSES
The teachers gained knowledge of 103 new apps.
Teachers gained experience in using apps at more than twice the pre-tutoring level (prior use mean of 2.5 apps per teacher, post-tutoring mean of 5.88 apps per teacher)
Teachers shifted from use of primarily personal apps (Facebook) to educational apps (Opposites, AudioNote, Dictionary, iCard Sort)
Teachers used apps across all parts of the curriculum (and, in some cases, for assessment) and used apps flexibly in different instructional contexts.
MACRO-ANALYSIS: HOW TEACHERS USE THE IPAD
APPS: TO WHAT PURPOSE?
Previous categories:
Technologies for literacy assessment
Technologies to replace the instructor
Technology to support teacher-directed instruction
Assistive technology
Student-directed independent uses of technology
Possibilities and challenges of the emerging new literacies
Dubert, L., & Laster, B. (2011). Technology in practice: Educators trained in reading clinics/literacy labs. Journal of Reading Education, 36(2).
LIMITATIONS
Self-report data
 Difficult to determine extent of use
 Teachers may have "mis-named" apps
 Implementation may vary widely
 Teacher/student use not determined
 Innovative uses possible (e.g., Educreations)
Context may have limited implementation or type of use
 Some preloaded apps
 Instructor modeling/emphasis
 Availability of iPad (take home, own, stay in Lab/Clinic)
Survey questions may not have been completely "clear" (reliable)
 e.g., an instructor knows teachers used apps that were not mentioned (DropBox, BlackBoard, Video Camera)
DATA (324 APPS)

Instances   Type of App                Examples
101         Drill and Practice         Spelling, Sparklefish, Opposites
72          Writing/Creating Content   Pages, StoryBird, EverNote, TradingCards, Haiku Deck
44          Teacher Utility            Educreation, Timers, Edmodo, recorder, Nearpod, Class DoJo, DropBox
25          Other                      Pinterest, Facebook, Twitter, Bing/Google Searches
22          Story/Book Reading         Storia, Toy Story, Curious George, Farfaria, Bible, Books, Pearson e-text
17          Multiple Suite             PBS Kids, Reading A-Z
17          Draw/Play                  DrawSoFree, KidDoodle
16          Other Reading Activity     MadLibs, Who Am I?
13          Video/Audio                Video Star, YouTube, Camera/Video Recording
11          "Mindless" Games           TicTacToe, TempleRun, SubWay Surfer
12          Reference                  Dictionary, Visual Thesaurus
6           Graphics/Images            Camera
MACRO-LEVEL CONCLUSIONS
Drill and practice dominates
 May be an "easy" out or reflect cultural practices
Making progress in "agentive" uses (student-directed independent uses of technology; possibilities and challenges of emerging technologies)
 Content creation/video/audio
 Research/writing activities
Technologies to "replace the instructor" still popular at multiple sites
Much less evidence of assessment apps
FINAL CONCLUSIONS: IPADS IN SIX CLINICS
SUMMARIZING ACROSS SITES Sites vary in quantity & quality of use of
iPads, especially as they progress from novice to more established approaches to using this technology.
There was a correlation between the initial presentation of the uses of iPads and how the teachers/tutors used them.
Teachers who were willing to take risks were motivated to continue trying more apps.
Even without connectivity, iPads had helpful pedagogical uses.
We see clear progress in "agentive" uses, although drill-and-practice use still dominates.
SUMMARIZING ACROSS SITES
Teachers in Reading Clinic
 gained knowledge of more than 100 new apps.
 gained experience in using apps at more than twice the pre-tutoring level.
 shifted from use of primarily personal apps to educational apps.
 used apps flexibly in different instructional contexts.