The document describes Amy Gratz's work assessing instruction services at Mercer University's Jack Tarver Library from 2012 to 2015. It outlines efforts to gather student and faculty feedback, define student learning outcomes, design an assessment program, and implement various assessment methods on a four-year cycle. Challenges included low response rates on some surveys and difficulties piloting new tools such as pre- and post-tests. The assessment work helped identify strengths and areas for improvement in library instruction.
3. Spring/Summer 2012
Created new student feedback form
Continued peer observations
Your Class Level (circle one):
First-year   Sophomore   Junior   Senior   Graduate Student   Other
1. Please rate the amount of hands-on work time allotted in this session (circle one)
Too Much   About Right   Too Little
2. Please rate your confidence with using the tools demonstrated today
Confident   Somewhat Confident   Not Very Confident
3. How helpful will the skills and concepts discussed today be for your assignment (check one)?
□ Absolutely essential – I could not complete the assignment without them
□ Useful – they will make the assignment easier
□ Somewhat helpful – some of the skills/concepts will be helpful, but not all
□ Barely helpful – most of the skills/concepts I already knew or won’t use
□ Useless – I already knew everything, or the skills/concepts won’t help with my assignment
4. What concept from today’s class are you still working to understand?
5. What skills/concepts discussed today were already familiar to you?
6. What didn’t work well about today’s session, and how can we improve?
Other comments?
4. Fall 2012
Began article about assessment at Tarver Library¹
Delved into past practices at Tarver
Started in 2002
Primarily student feedback
Learned about best practices from the literature
¹For a full overview and references to specific articles and other literature: Gratz, A., & Olson, L. T. (2014). doi: 10.1080/10691316.2013.829371
5. Best Practices for Assessment
Must be tied to Library and University goals
Gilchrist & Oakleaf, 2012. An Essential Partner: The Librarian’s Role in Student Learning Assessment. http://www.learningoutcomeassessment.org/documents/LibraryLO_000.pdf
Should be done on multiple levels
Radcliff et al., 2007. A Practical Guide to Information Literacy Assessment for Academic Librarians. Westport, CT: Libraries Unlimited.
Should use multiple methods
Tancheva, Andrews, & Steinhart, 2007. Library instruction assessment in academic libraries. Public Services Quarterly, 3(1/2), 29-56. http://doi.org/10.1300/J295v03n01_03
Also recommended: Instruction & Program Design Through Assessment by Gilchrist and Zald, 2008
6. Problem with Attitude Surveys
“At most, it provides information about how the student perceives the librarian’s presentation… What [it] has not provided is any indication of whether the student participants have actually learned anything.”
Colborn, N. W., & Cordell, R. M. (1998). Moving from subjective to objective assessments of your instruction program. Reference Services Review, 26(3/4), 125-137. doi: 10.1108/00907329810307821
7. 2013-2014: Designing the Program
Mission Statement:
The Tarver Library Instruction Program supports the mission and curricula of Mercer University by teaching the information literacy skills essential for creating well-researched papers, presentations, and other projects, empowering all of our community members in their academic, professional, and personal lives.
8. 2013-2014: Designing the Program
Student Learning Outcomes (SLOs)
Upon degree completion students will:
Determine the nature and extent of the information needed.
Access needed information effectively and efficiently.
Evaluate information and its sources critically and investigate differing viewpoints.
Understand various economic and social issues surrounding the use of information, and access and use information ethically.
9. 2014-2015: Creating SLOs
Focused on our most commonly taught courses
Decided not to develop any for subject-area classes
11. Selected Finalized SLOs
INT 101 Instruction
Students identify appropriate academic sources
Students generate an individual list of applicable key search terms
Students access and use multidisciplinary resources to locate information
12. Summer 2015
Needed to get colleagues more interested in assessment
Adapted an instruction activity for use in a meeting
13. Instruction Program Assessment
“Ultimately, the goal of all instruction and assessment efforts is to engage in reflective practice” (Oakleaf)
How frequently should we use that assessment method?
Assign each method you’ve been given to one of the different cycles posted
Color-coded: Green = students; Yellow = faculty; Black = internal
Add your own if desired!
14.
15. Instruction Program Assessment Cycle, 4-Year Rotation
2015-2016
Full Year: Student in-class survey; Faculty in-class survey; Pilot pre/post tests
Fall Only: Student end-of-semester survey; Faculty end-of-semester survey
Spring Only: Preceptor focus groups
2016-2017
Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students)
Fall Only: Student end-of-semester survey; Faculty end-of-semester survey
Spring Only: Faculty interviews/focus group
2017-2018
Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students); HEDS Annual Survey; Peer observations
Fall Only: Faculty end-of-semester survey
2018-2019
Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students)
Fall Only: Student end-of-semester survey; Faculty end-of-semester survey
Spring Only: Student focus groups
16. 2015-2016 Successes
2015-2016 cycle:
Full Year: Student in-class survey; Faculty in-class survey; Pilot pre/post tests
Fall Only: Student end-of-semester survey; Faculty end-of-semester survey
Spring Only: Preceptor focus groups
Successes: Student in-class survey; Faculty end-of-semester survey; Focus group with preceptors
17. 2015-2016 Challenges
2015-2016 cycle:
Full Year: Student in-class survey; Faculty in-class survey; Pilot pre/post tests
Fall Only: Student end-of-semester survey; Faculty end-of-semester survey
Spring Only: Preceptor focus groups
Challenges: Faculty in-class survey; Pre/post tests; Student end-of-semester survey
18. Closing the Loop
Internal feedback
Informal sharing with faculty colleagues
19.
20.
21. Images Used
Slides 1 and 20: Mills. (2014). Tree of life. https://flic.kr/p/rcu88b
Slide 2: Mibby23. (2013). Looking back. https://flic.kr/p/iMJAfr
Slide 3: Cornwall, N. (2013). New life. https://flic.kr/p/ejeMh6
Slide 4: Accheri, C. (2014). Ta Prohm. https://flic.kr/p/oXzeiT
Slide 5: Mennerich, D. (2013). Bikaner IND – cenotaphs devikund sagar 04. https://flic.kr/p/jC6cQu
Slide 6: Jutte, T. (2013). National Geographic, Ter Apel Monastery, Groningen, Netherlands – 1551. https://flic.kr/p/jKjfTU
Slide 7: Tazewell, C. (2007). Spiral. https://flic.kr/p/3f5vcq
Slide 8: Delp, J. (1999). Mount Katahdin, Maine. https://flic.kr/p/5rXe7c
Slide 9: kc ma. (2016). Seedling. https://flic.kr/p/Eni9Rh
Slide 10: Holzman, L. Ladder. https://flic.kr/p/ApJ9
Slide 11: bambe1964. (2011). Scaffold. https://flic.kr/p/aB3wdc
Slide 12: Kleinfield, A. (2008). Girl runs up San Francisco’s 16th Avenue tiled steps. https://flic.kr/p/58rHHw
Slide 13: JogiBaer2. (2011). Post-It. https://flic.kr/p/9jp2qo
Slide 14: chaseiv57. (2012). 6172004275_b8dcba694d_b-1. https://flic.kr/p/dcjiJx
Slide 15: Cheng, B. (2014). Light tunnel. https://flic.kr/p/qPWPgZ
Slide 16: Scott, G. (2007). Positive thoughts, Mr. Glen. https://flic.kr/p/3RbWwn
Slide 17: Eric. (2005). Frustration. https://flic.kr/p/5Y57G
Slide 18: when_night_falls. (2013). Loop. https://flic.kr/p/ffi186
Slide 19: Clark, T. (2013). The open road. https://flic.kr/p/h6FC2E
Editor’s Notes
Introduction of self
Evolution of assessment program – last 4 years
Starting from a point of no knowledge
Told assessment needed
Started by looking back – peer observations, student feedback form
Asked colleagues about old form – not helpful
Created new form based on what I found at other libraries
Continued peer observations, no set method to follow, so also “new”
Simultaneously, started working with Lee Olson on article
Delved into past practices
Learned about best practices in field – strong influence from a few
First – Instruction assessment tied to library and university goals
Second – multiple levels, primarily one-shots and whole program for us
Third – multiple methods, focus groups, surveys
Attitude survey – what had been used at Tarver in past
Doesn’t focus on what students learned
I think that’s the most important thing
Revised tools and experimented with new methods
Focus on creating mission statement/goals for program
Developed alone, approval from colleagues
Moved on to Student Learning Outcomes – more collaborative
Four outcomes for undergraduates
based on ACRL competency standards
Just prior to hearing about framework – stuck with this with plans to change later
Followed up with smaller goals – scaffolding
Started with first-semester writing courses
Most common instruction request
Followed with rest of gen ed writing program
Subject-specific too specialized.
Long process since unfamiliar with SLOs
Last summer made final changes.
Mess – notes from last meeting
Moved from creating SLOs to designing assessment
Only way to know if we’re meeting the goals
Final form of scaffolding for one class
General – designed to be flexible for topic
Brief – room to cover additional goals
Learned colleagues weren’t interested in assessment on its own
Encouraged active involvement in planning process
Sense of ownership
Adapted instruction activity
Slide explaining activity
Stickers with methods/users
Indicate frequency – every semester, every year, every few years, revisit later, or never
Ended up with a mess
Interesting and useful approach
Condensed, conferred with colleagues
Settled on new assessment cycle
Streamlined version
4-year rotation, differing frequencies
Multiple methods
Different users each year
Balanced – avoid over-burdening users or ourselves
Implemented in fall – some successes
Student in-class survey – useful information, positive feedback
Faculty end-of-semester survey – good response rate, useful information for planning in Spring
Focus Group – interesting data, still discussing
Challenges
Faculty in-class survey a “bust” – little data
Pre/post tests not used
Student end-of-semester survey EXTREMELY low response rate
Closing the loop on last year
Internal feedback on what colleagues thought of plan
What changes we want
Sharing some data with non-library faculty
Definitely changes to implement!
Changes to tools themselves
Dropping faculty in-class survey
Possible changes to other forms
Discussions with wider set of library colleagues
Expanding implementation across campuses
Different student populations – different needs
Final point – assessment is always evolving
No end point or target
Pleased with work done so far
Always looking ahead
Students, libraries change – assessment must change with them