Assessing Information Literacy with SAILS
Juliet Rumble, Reference & Instruction Librarian
Auburn University
Information Literacy: from outcomes to assessment
The “standard” definition of information literacy: “able to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information”
--ALA Presidential Committee on Information Literacy Final Report, 1989
ACRL Information Literacy Competency Standards for Higher Education, 2000
ACRL Objectives for Information Literacy Instruction, 2001
Needed: a standardized instrument to assess information literacy skills for the purposes of:
--programmatic improvement
--accountability
Purpose: develop an instrument for programmatic level assessment of information literacy skills that:
Is standardized
Contains items not specific to a particular institution or library
Is easily administered
Has been proven valid and reliable
Assesses at institutional level
Provides for both external and internal benchmarking
<www.projectsails.org>
Project SAILS: Standardized Assessment of Information Literacy Skills
Project Background
An initiative of Kent State University
Fall 2002: Recipient of 3-year IMLS grant to support development and testing of instrument
Spring 2003: Collaborative partnership with ARL for 3 testing phases
Summer 2005: End of Phase III testing
Fall 2005 to Summer 2006: Assessment of test instrument
Summer 2006: Roll out commercial instrument?
SAILS test instrument
Multiple choice test
Measures the performance of cohort groups, not individuals
Measurement model: Item Response Theory
ACRL’s Objectives for Information Literacy Instruction regrouped by SAILS team into 12 skill sets
Results reported at two levels of specificity:
--4 of 5 ACRL Info Lit Competency Standards
--12 skill sets (each analyzed separately by demographic group)
Objective: Identify test instrument to satisfy campus-wide assessment requirements.
1998 SACS Criteria for Accreditation --Section on information literacy (5.1.2.)
Office of Institutional Research and Assessment -- Oversees unit-level assessment activities -- Responsible for implementing a plan for assessment of student learning outcomes in general education
Background to Auburn’s involvement with SAILS
Objective: Use test data to help establish dialogue with campus stakeholders.
Information literacy not just a “library thing”
--cuts across disciplines
--cumulative: needs to be strategically integrated into the curriculum
Need to target curricular planning groups
--e.g. Core Curriculum Oversight Committee
Share results with groups responsible for core curricular outcomes
--English Freshman Comp program
--Freshman Year Experience courses
Auburn and SAILS (cont.)
Clearly defined learning outcomes must be paired with assessment tools that demonstrate whether learning occurred
--Are students learning what we think they’re learning?
Need assessment tool to identify areas of strength and weakness
--Where do we focus our efforts?
Auburn and SAILS (cont.)
Objective: Use test data to improve library’s instruction program.
What we learned from the SAILS data….
In all three testing phases, and on all standards and skill sets, the average Auburn student scored at about the same level as the average student from all institutions combined.
Our Phase One test population was the only group that we could be certain received library instruction. They were also the only group to perform higher than the national benchmark on all standards and skill sets. However, observed differences were not statistically significant.
Statistically significant results were reported for several cohort groups’ performance on specific skill sets for Phase III testing.
What the numbers don’t tell us….
The national “benchmarks” associated with standards and skill sets do not indicate “mastery” of information literacy
Scores are not based on success in actually performing tasks associated with learning outcomes (although they are intended to be predictors of success).
Auburn’s test results do not track development of cohort groups’ skills over time (no longitudinal study was conducted).
Despite demographics collected, there’s still a lot we don’t know about cohort groups that we’re comparing.
--E.g.: With the exception of Phase I, we don’t know whether test takers have received library instruction.
--E.g.: We don’t know how much (and what kind of) instruction in research skills students may have received in other classes.
Sharing resources and expertise with other campus groups charged with programmatic assessment is a key to success.
Programmatic assessment involves a serious commitment of time and money. Support must come, not only from individual faculty members and departments, but also from university administration.
The bottom line: Assessment, as well as information literacy skills, must be strategically integrated into the curriculum.
-- Need to cultivate a culture of evidence
What we learned about doing assessment
In the coming year, the SAILS team at Kent State will:
--Assess reliability and validity of test instrument
--Address other technical and administrative issues
--Consider: “Does the SAILS instrument measure information literacy skills in a useful way?”
At Auburn University Libraries:
--Identify patterns and trends in test data and share with librarian instructors & individual departments
--Consider data results as we discuss future directions for the instruction program
--Consider what other kinds of information we’d like to collect about student learning
--Reach out to Freshman Year Experience groups
What next?
Final Thoughts
We have other measurement instruments to choose from now….
--Educational Testing Service’s Information Communications Technology (ICT) Literacy Assessment is another national benchmarking tool.
--A number of regional and campus initiatives
The key question: Which assessment tool(s) provide us with information that best serves our user groups?

In all likelihood, we need a variety of different assessment tools.
Summative: addresses curricular- and program-level learning outcomes
Formative: addresses learning objectives for specific instruction contexts