This document provides an outline for a presentation by Sheila Webber on evaluating the impact of information literacy education. It discusses the importance of considering context, including the university's strategies and the subject discipline, and presents examples of desired learning outcomes for information literacy drawn from interviews with academics in different disciplines. The document emphasizes that information literacy education is not carried out by the library alone, and that its impact cannot be evaluated by looking at a single training session: evaluation must consider longer-term outcomes such as independence of thought and critical thinking.
Information Literacy Outcomes by Discipline
1. Information Literacy in the
Curriculum - Real and
Realistic Aims
Sheila Webber
Department of Information Studies
University of Sheffield
March 2008
2. Outline
• Importance of context
• Context 1: the university’s strategy
• Context 2: the subject discipline
• Exercise: identifying outcomes and
indicators
• Feedback
3. Importance of context in
evaluating teaching and
investigating impact
4. Sharpe et al., 2006, highlighting the contextual nature of education:
“There appears to be little value in another review which asks ‘do blended approaches improve learning?’ and which will predictably give an answer ‘it depends’.” (p8)
(blended approaches = using both face-to-face and e-learning)
5. • Identify why you wish to assess impact
– Educational reasons
– Marketing reasons
– Administrative reasons
– Personal reasons
• Whose agenda are you driven by?
– Your own
– Your institution
– Your students
– Your academics
– Your profession
– Some idiot?
6. • Identify what the focus is, e.g.
– student learning of information literacy;
– student learning of another subject;
– improvement in some other aspects of the student experience (e.g.
confidence, presentation skills, employability)
– student perceptions of the class and institution (e.g. in the UK we may
make improvements with the aim of improving responses in the
National Student Survey)
– approaches to teaching (e.g. Problem Based Learning; Constructivist
approach)
– use of a particular method or channel (e.g. online tutorial)
– efficiency gains of some kind (e.g. doing things cheaper or quicker)
• … and what is the specific question to investigate?
7. • Where are you undertaking the study and who is
involved: what is the context of the study?
• How are you going to evaluate? This flows from the
answers to all the other questions
8. Learn from mistakes in current literature
• Too great a preference for quantitative & quasi-experimental
research designs
• Detailed investigation of student conceptions and experience
not common
• Pre/post tests described, but…
• … often not sufficient detail about what went on between tests
to make any sense of the results (it is not like measuring
effects of drugs on disease!)
• Lack of detail generally in describing course context & exactly
what happened
• May be questionable whether tests are really testing what they
say they test
• Also often gaps in describing aims and methods
9. Given, 2007: 20
“These criteria [for good-quality qualitative research] are no less rigorous than those used to assess quantitative data; they are simply different, and require different steps and measures to ensure quality data. These steps may include: prolonged engagement in the field; persistent observation; triangulation of methods; negative case analysis; peer debriefing; member checks; and many other techniques that are often used together.”
10. Context 1: the university’s
strategies and evaluation
process
11. My Department’s learning &
teaching evaluated through
• Key Performance indicators:
– Outcomes of the National Student Survey
– Sheffield’s Student Satisfaction Survey
– Student progression and attainment (pass rates, degree
classifications etc.)
– External and internal reports e.g. accreditation
• Other surveys and feedback from students
• Progress against our Departmental Learning
Teaching and Assessment Strategy & against stated
aims for our programmes
12. National Student Survey (a questionnaire)
has items such as
• Staff have made the subject interesting
• The course is intellectually stimulating
• The criteria used in marking have been clear
• I have been able to contact staff when I need to
• The course is well organised
• The library resources and services are good enough for my
needs
• The course has enabled me to present myself with
confidence
Anything that can demonstrably help improve these is popular with academics and the university!
13. Additionally, from the Characteristics of a Sheffield Graduate:
“4. Demonstrate the core capabilities and skills of information literacy, interacting confidently with the nature and structure of information in their subject and handling information in a professional and ethical manner;
5. Explore the history of and challenge the processes of knowledge creation, applying creativity, enterprise and innovation, to push against the boundaries of current practice”
http://www.shef.ac.uk/content/1/c6/04/83/65/lta-strategy2.pdf
14. This means that there are…
• Reasons for evaluating information literacy in relation to something else (success in a subject etc.), but also
• Reasons for evaluating the quality and impact of learning & teaching information literacy (rather than just evaluating it in terms of the impact it has on something else)
• You will want to do the latter anyway …
15. Context 2: the subject the
students are studying
• Three-year Arts & Humanities Research Council
(AHRC)-funded project (Nov 2002 - Nov 2005)
To explore UK academics’ conceptions of,
and pedagogy for, information literacy
• Sheila Webber; Bill Johnston; Stuart Boon
• Phenomenographic study: interviewing 20
academics in each of 4 disciplines to identify
variation in conceptions (visited 26 universities to
collect 80 interviews)
17. Will present …
• Conceptions of information literacy, as identified
through phenomenographic analysis
• Lists of desired learning outcomes for information
literacy
– This was one of the questions in the interview
– We coded up interview transcripts using text analysis
software (see the illustrative sketch after this slide)
– N.B. we coded every mention – some people mentioned
an outcome more than once
– In each case no. of interviewees = 20
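(Illustrative aside, not part of the original slides: assuming each transcript has already been coded into outcome labels, the per-outcome tallies shown in the “Times mentioned” charts on the following slides could be produced along these lines. The data and labels below are hypothetical.)

    # Minimal sketch with hypothetical data: tally coded mentions of desired
    # learning outcomes across interview transcripts. Every mention counts,
    # so one interviewee can contribute several counts for the same outcome.
    from collections import Counter

    # Each inner list holds the outcome codes assigned to one transcript.
    coded_transcripts = [
        ["Access information", "Critical thinking/analysis", "Access information"],
        ["Evaluation", "Critical thinking/analysis"],
        ["IT skills", "Evaluation", "Access information"],
    ]

    mention_counts = Counter(
        code for transcript in coded_transcripts for code in transcript
    )

    for outcome, count in mention_counts.most_common():
        print(f"{outcome}: {count}")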
18. Key point here is …
• There was variation within and between disciplines
• Can see (obviously) connection between conception and
desired outcomes for information literacy
• Some outcomes important to all, especially
– Being able to access information
– Evaluating information
– Critical thinking
• Some vary e.g. personal development (English);
Employability aspects (Marketing, Engineering)
19. Marketing: Information literacy as…
1. Accessing information quickly and easily to be aware of what’s
going on
2. Using IT to work with information efficiently and effectively
3. Possessing a set of information skills and applying them to the
task in hand
4. Using information literacy to solve real-world problems
5. Becoming critical thinkers
6. Becoming a confident, independent practitioner
20. Outcomes for IL - Marketing
[Bar chart: number of times each outcome was mentioned (x-axis “Times mentioned”, 0-25). Outcomes listed on the chart:]
Work ethic
Wider thinking
Use/applying information
Understanding info/role of info
Self-sufficiency
Self-awareness
Search skills/tools
Problem-solving skills
IT Skills
Information sources use
Information management
Getting them to think
Finding information sources
Evaluation
Disciplinary knowledge
Critical thinking/analysis
Creativity
Communication/presentation skills
Career/lifelong learning skills
Basic information/study skills
Access information
Copyright Boon, Johnston and Webber
21. English: Information literacy as…
1. Accessing and retrieving textual information
2. Using IT to access and retrieve information
3. Possessing basic research skills and knowing how
and when to use them
4. Becoming confident, autonomous learners and
critical thinkers
22. IL Outcomes - English
[Bar chart: number of times each outcome was mentioned (x-axis “Times mentioned”, 0-30). Outcomes listed on the chart:]
Assimilate information
Transferable skills
Self-awareness
See value of info
Presentation skills
Personal development
Produce academic output
Organise/ manage info
Learning to write
IT skills
IL skills
Evaluate information
Ethical use of info
Able to do research
Cultural awareness (of discipline)
Critical thinking
Confidence
Best education possible
Basic search skills
Awareness of research methods
Autonomous learning
Analysis of info
Access/ retrieval
Copyright Boon, Johnston and Webber
23. Civil Engineering: Information literacy as…
1. Accessing and retrieving data and information
2. Applying and using information
3. Analysis and sense making
4. Creating, and incorporating information into a professional
knowledge base e.g. “get them to the point that they can be
literate in their discipline and its wide, wider context….”
(CENG19)
24. Outcomes for IL - Civil Engineering
[Bar chart: number of times each outcome was mentioned (x-axis “Times mentioned”, 0-30). Outcomes listed on the chart:]
Work ethic/professionalism
Transferable skills
Reflexivity/self-awareness
Problem-solving skills
Personal development
Output
Modelling skills
IT skills
Info management
Evaluation
Disciplinary core/fundamentals
Critical thinking/analysis
Confidence
Communication skills
Career skills
Awareness of research methods
Awareness of information sources
Autonomous learning/independence of thought
Appreciate significance of information literacy
Access
Copyright Boon, Johnston and Webber
25. Chemistry: Information literacy as…
1. Accessing and searching chemical
information
2. Mastering a chemist's information skill set
3. Communicating scientific information
4. An essential part of the constitution/
construction/ creation of knowledge
26. Outcomes for IL - Chemistry
[Bar chart: number of times each outcome was mentioned (x-axis “Times mentioned”, 0-25). Outcomes listed on the chart:]
Work ethic
Wider thinking
Understanding
Transferable skills
To have fun
Self-sufficiency
Self-awareness/reflexivity
Problem-solving skills
Presentations
Personal development
Output
Numeracy skill
IT skills
Independence of thought
Evaluation
Disciplinary core knowledge
Database skills
Critical thinking/appraisal
Confidence
Awareness of research methods
Awareness of research literature
Applying skills/knowledge
Access info
Copyright Boon, Johnston and Webber
27. Observations
• Information literacy education is not just carried out
by the library
– Are you interested in looking at the impact of the
“library” or of IL?
– If looking at IL: try to establish the whole picture
• Some outcomes are obviously connected with
information activities (e.g. accessing information)
but many are not
28. How do you find out what the indicators
should be?
• Investigate this by talking to the people concerned
e.g. lecturers, students, senior managers
• Open questions about what they see as important
outcomes and impact
• Probing questions about how they can tell the
outcome is achieved
• Asking, for example, academics and students and
librarians and careers advisors
• So – this investigation is taking place to enable you
to decide what indicators to select
29. How do you find out what the indicators
might be?
• Need to evaluate which indicators are most important
& which are feasible
• Refer back to your motives to decide criteria for
“important”, “feasible”!
• Next stage is to collect and analyse data in relation to
these indicators
• The approach you take to gathering & analysing data
will depend on what the outcome & indicators are
30. For example ….
• Accessing information
• Independence of thought
• Confidence
• Critical thinking
• Being able to transfer the skills to new contexts
Most of these cannot be evaluated by looking at one
short training session
31. What are the outcomes that matter for
your institution & partners?
1. Think about one set of people you work with (e.g. academics
in one discipline). What are their key outcomes for their
students; ones that are directly about information literacy, or
which information literacy might contribute to?
2. Do you need to find out more about what the outcomes are?
How will you do this?
3. How will you evaluate achievement of those outcomes?
• What are your key questions?
• What indicators are appropriate?
• What research approaches will you take? What form will your
data take? How will you gather and analyse it?
32. 1. Get into groups
2. Think about the questions individually (5-10 minutes)
3. Discuss what the outcomes might be and/or how you
could find out what they are: note down your ideas on
overheads (15 minutes)
4. Discuss what indicators might be and how you might
gather and analyse information on them: note down (15
minutes)
5. Return for feedback. A few groups will be asked to
present.
34. References
• Bordonaro, K. and Richardson, G. (2004) “Scaffolding and reflection in
course-integrated library instruction.” Journal of academic librarianship,
30 (5), 391-401. Example of an article where aims, methods and process
of the evaluation are clearly described.
• Given, L. (2007) “Evidence-based practice and qualitative research: a
primer for library and information professionals.” Evidence based library
and information practice, 2 (1), 15-22.
http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/155/237
• Glenaffric Ltd. (2007) Six steps to effective evaluation: a handbook for
programme and project managers. Bath: JISC.
http://www.jisc.ac.uk/media/documents/programmes/digitisation/SixStepsHandbook.pdf
This is about project evaluation, but the sections on planning and data
gathering are useful.
35. References
• Mayes, T. (2006) LEX: The Learner Experience of e-Learning:
Methodology Report. JISC.
http://www.jisc.ac.uk/media/documents/programmes/elearningpedagogy/lex_method_final.pdf
Description of using Interpretative Phenomenological Analysis
“a method for exploring how participants make sense of their
own experiences”
• Pritchard, J., Stratford, R. and Hardy, C. (2004) Training
students to work in teams: why and how? York: LTSN
Psychology
http://www.psychology.heacademy.ac.uk/docs/pdf/p20040422_training_students_teams.pdf
Describes a training day and discusses in some detail the way
it was evaluated and issues to do with evaluation of
educational interventions.
36. References
• Sharpe, R. et al (2006) The undergraduate experience of
blended e-learning: a review of UK literature and practice.
York: Higher Education Academy.
http://www.heacademy.ac.uk/ourwork/research/litreviews/2005_06.
• Vezzosi, M. (2006) “Information literacy and action research:
an overview and some reflections.” New library world, 107
(7/8), 286-301.
• Zuber-Skerritt, O. (Ed.) (1996) New directions in action research.
London: Falmer Press.