ASSESSMENT HAPPENS!
AGENDA for August 20, 2015
Spring 2015 SAAC Members Shout Out!
They “Took a Shot” at Assessment (participants in Spring 2015 Tech Literacy Assessment)
CATS of the Month Share Out
Differentiated CATS Workshop
General Education Abilities Assessments: Critical Inquiry Results; Quantitative Literacy Recruitment
Feedback Workshop
Assessment Jeopardy!
Lunch
SAAC Committee 2014–2015 Shout Out!
Valerie Akuna, Becky Baranowski, Jennifer Brown, Kimberly Graber, Kathleen Iudicello, Kanina Kempton McDonald, Terry Meyer, Heather Muns, Bronwen Steele, Pete Turner, Roselyn Turner, Jim Waugh, Rene’ Willekins
They “Took a Shot” at Assessment!
Spring 2015 Tech Literacy Participants: Valerie Akuna, Becky Baranowski, Heather Muns, Pete Turner, Roselyn Turner, Pearl Williams
CATS of the Month Share Out
Becky Baranowski, May 2015 (Bringing Labs into Differential Equations)
Chuck Bell, April 2015 (Will Increasing the Number of Practical Examples of a Difficult Science Concept Improve Understanding?)
Norma Hernandez, March 2015 (No, You Can’t Use Your Book for the Quiz: Comparing Quiz Scores With and Without Class Materials)
Cecilia Rosales, February 2015 (Formative Assessment & Adaptive Teaching with Kahoot!)
Rachel Smith, January 2015 (Does Incorporating a Kinesthetic Assignment Improve Student Learning of a Complex Topic?)
CATS Workshop Time
Please self-select into one of two groups:
CATS Novices (Southside CTL): Learn about CATS through a brief introduction; focus on reviewing, rating, and commenting on existing CATS forms.
Experienced CATS Practitioners (Northside CTL): Discuss ideas for new CATS forms with others; work on creating new or following up on existing CATS forms.
Gen Ed Abilities – Critical Inquiry Results
Shout out to Jim Waugh and OPIE!
Critical Inquiry Components:
Observation / Question
Identifying Hypothesis / Explanation
Planning
Analysis
Conclusion / Solution
Overall Performance for 2014
The strongest area was Observation / Question: the student can identify a relevant, testable question prompted by an observation or scenario with no issues or errors.
The area with the greatest opportunity to improve was Analysis: the student uses appropriate tools or methods to thoroughly examine the data with no issues or errors.
Although there were minor differences between the averages for new freshmen and sophomores, none of these differences were statistically significant.
Overall Performance for 2014 (using a 4-point scale)

2014 Results

Component                              Excelling    Proficient   Approaching    Below         Mean
                                                                 Proficient     Proficient
                                       #     %      #     %      #     %        #     %
Observation / Question                 111   41%    126   46%    25    9%       11    4%      3.23
Identifying Hypothesis / Explanation   98    36%    131   48%    29    11%      15    5%      3.14
Planning                               95    35%    121   44%    44    16%      13    5%      3.09
Analysis                               92    34%    114   42%    50    18%      17    6%      3.03
Conclusion / Solution                  87    32%    133   49%    38    14%      15    5%      3.07
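The reported means are the weighted average of the 4-point scale over the rating counts (Excelling = 4 down to Below Proficient = 1). A minimal Python sketch of that calculation, using two rows of the 2014 counts (the scale assignment is inferred from the table, not stated in the slides):

```python
def weighted_mean(counts):
    """counts: (excelling, proficient, approaching, below) rating counts."""
    points = (4, 3, 2, 1)  # score assigned to each rating level
    return sum(p * c for p, c in zip(points, counts)) / sum(counts)

# Counts taken from the 2014 results table.
results = {
    "Observation / Question": (111, 126, 25, 11),
    "Analysis": (92, 114, 50, 17),
}
for name, counts in results.items():
    print(f"{name}: {weighted_mean(counts):.2f}")
# prints 3.23 for Observation / Question and 3.03 for Analysis,
# matching the table's Mean column
```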
Critical Inquiry: Limitations
The instrument was not tested for inter-rater reliability, i.e., the extent to which two or more individuals (coders or raters) agree.
Variability in the ratings among faculty members may exist.
When comparing groups for statistical significance, it is unclear whether observed differences are due to inter-rater reliability or not.
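Inter-rater reliability could be quantified with a chance-corrected agreement statistic such as Cohen's kappa, which compares observed agreement between two raters against the agreement expected by chance. A minimal sketch; the sample scores below are invented for illustration, not taken from the assessment data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored the same.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions per category.
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 4-point-scale scores from two faculty raters.
a = [4, 3, 3, 2, 4, 1]
b = [4, 3, 2, 2, 4, 1]
print(round(cohens_kappa(a, b), 3))  # prints 0.778
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance.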
Lessons Learned
Differences in the classes assessed may have compromised the ability to credibly report longitudinal results. The next Critical Inquiry assessment may benefit from a more homogeneous sample in 2017.
From Classroom Conversation (4/15):
Focus on Analysis
Incentivize repeat participants
More homogeneous sample comparisons
SAAC does more recruiting for participation
SAAC/CATS as a best-assessment-practices repository
Address inter-rater reliability
Quantitative Reasoning Assessment
Fall 2015: Choose one of your existing assignments or projects, or create a new one, that:
meets your course competencies,
requires a deliverable product from your students,
can be assessed according to the rubrics, and
counts toward students’ grades (“skin in the game”).
Timeline of QR Participation
August: Indicate your intent to participate
September: Forms, guidelines, directions, etc. distributed to participants
September – November: Implement the assessment
October – December: Email electronic results to OPIE
QR Suggestions
QR Rubric
Any Questions about QR?
Answer this (from an ASCD survey of over 1,200 teachers):
To encourage a growth mindset in students, what should be done by educators?
A. Use less summative testing; use formative testing instead.
B. Develop deeper rather than surface questions.
C. Reduce the time of teacher talking.
D. Provide meaningful student feedback.
E. Create flexible, not rigid, student groupings.
And the answer is . . .
Source: ASCD Mini-Brief “ED Pulse” Aug. 13, 2015
Feedback Conversation!
About Feedback
Purpose: to provide learners with information about the current state of their knowledge/performance and to guide them toward the learning goal:
Give them information on what they do and do not understand
Tell them whether their performance is going well or not
Show them how they should direct future performance
Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., & Norman, M.K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco: Jossey-Bass.
Feedback Analogies
Maze – no feedback, lost!
Diet – no feedback, weight loss uncertain!
“F” vs. “Not Yet”
Ineffective Feedback Strategies
A grade or numerical score without additional feedback
Too much feedback – overwhelms students and fails to communicate which aspects of performance deviate most from the goal
Not targeted – feedback does not relate back to the learning outcome of the assignment
Effective Feedback Strategies
In your groups: designate a note taker and a presenter.
What are various feedback strategies you use? Or: how do you provide your students with
• More feedback?
• More timely feedback?
• More useful feedback?
When called on, the presenter shares the group’s top strategies. SAAC will record, organize, and send them out to all participants.
And now . . . ASSESSMENT JEOPARDY!