1. Social science research methods for libraries
Diane M. Rasmussen Pennington PhD FHEA FRSA
Senior Lecturer in Information Science and Course Director, MSc ILS
Department of Computer and Information Sciences, University of Strathclyde
Twitter: @infogamerist; diane.pennington@strath.ac.uk
Supporting the social sciences:
A masterclass for library & information professionals
13 May 2019
2. Exercise 1 (Example adapted from Lisa Given, ALISE Academy 2013)
• You are interested in first-year university students’ use of library databases as information-seeking tools in their coursework.
– You want to learn more about what they “do” with the tools and what the tools mean to them as seekers of information.
– You are also interested in the challenges they face in using these tools.
• Write research questions that would address this topic.
6. Why should librarians do research?
“A professional field should have an
established and growing body of research
evidence to help support decisions and
knowledge within the field [italics added]”
(Koufogiannakis & Brettle, 2016, p. 29).
7. The case for doing research
“No Authorities had ascertained the public’s view on
library apps, suggesting a lack of forward planning. A
simple online questionnaire could be developed to
gauge patrons’ interest, producing valuable data to
inform future policy and implementation. Two-way
communication is a key feature of Library 2.0 and
3.0 and should be indigenous to library services”
(Kerr & Rasmussen Pennington, 2018, p. 245)
8. What is evidence or being evidence
based? (slide from Brettle, 2018)
• Being evidence based is about using
evidence:
– To help you make decisions about your
services
– To help (or influence) others to make
decisions about your services
• Can do this via research, local evidence,
user evidence
• Impact is about using evidence to
demonstrate the difference or change that
libraries can make but it can be difficult to
capture
• Build up an evidence base from rigorous
studies and routinely collected data
9. CILIP’s take on being evidence based
(slide from Brettle, 2018)
• “As the professional association for people working in
information and knowledge management and
libraries, we have to ensure that our own work and
decisions are evidence-based”
– Nick Poole, Chief Executive CILIP
• Report summarizing the research evidence
– Brettle, A. and Maden, M. (2016). What evidence is there to support the employment of professionally trained library, information and knowledge workers? A systematic scoping review of the evidence. London: CILIP. Available from www.cilip.org.uk/valueofLIKworkers
• Report scoping the requirements for the development
of a sector wide research/evidence base portal
– Dalton, P. and McNichol, S. (2018)
10. Research evidence is NOT:
• “How we done it good” (Wilson, 2013, p.
113)
• Opinions – “I think that people need/want
x”
• Results of committee meeting discussions
• Asking your 11-year-old son
11. Barriers to EBLIP (Booth, 2004; Wilson, 2016)
• Environment: rate of change, poor access to evidence base, cultural barriers
• Evidence: limitations of the evidence base, inappropriate research orientation
• Workplace: lack of time, lack of money, lack of infrastructure, lack of support
• Profession: leadership, lack of research culture, professional characteristics, communications issues, need for training, failure to implement results
12. Needs for and characteristics of a practitioner-researcher (Wilson, 2016)
• Lack of existing actionable research; ‘remote’ and slow academic ILS research
• Chance to increase profile, professionally and organisationally
• “a person engaged in the practice of a skilled profession who also conducts research” (p. 83)
• Focus on users, reflective practice
• Curiosity, interest in growth, belief in ability to develop, access to mentors, peers, and a supportive culture
13. How to collect, analyse, and use
research evidence
• Does it exist already? Do a literature review.
If not, you will need to create it.
• Write good research questions (Booth, 2006)
• Consider how the questions can best be
answered with appropriate methods
• Collect the research data
• Analyse it and answer your questions
• Implement and disseminate it
14. What is a “literature review”?
• Finding out what has been written about a topic
• Learning what methods have been used to study the topic
• Thinking about how existing studies connect to each other and to your work
• Reading and synthesising critically
• The written version should not be a “book report”
– e.g. “Pennington said this. Hall said this. Tait said this…”
• It should not only include work that supports your own views
on the topic
• It should evolve with your research
• Guidance for your conceptual framework; should include both
topical research and theoretical frameworks (theories)
• For help with your literature review:
Jesson, J. K., Matheson, L., & Lacey, F. M. (2011). Doing your
literature review: Traditional and systematic techniques.
Thousand Oaks, CA: Sage.
15. Always start with the problem statement
(Hernon & Schwartz, 2007)
• The 3 elements of a problem statement:
1. Lead-in: What’s going on here? What’s the general
premise?
2. Originality: What does this study do that others
haven’t done?
3. Justification/significance: Answer the “so what?”
question
• Can be as short as 3 sentences, or longer
• Elements should intertwine and “suggest an
unsettled or perplexing state” (p. 309)
• Literature review, RQs, and methodology stem
from the problem statement
16. What can we learn from qualitative
inquiry? Quantitative inquiry?
17. The ongoing 60-year debate in social science research (Matthews & Ross, 2010, p. 142)
• Different approaches lead to different types of data collection and analysis
18. Choosing your collection method
• Methods should always tie back to your RQs
• What kinds of data might you want to collect
in order to answer the following RQs?
– How can the main campus library be best
designed for undergrads?
– What do middle-aged women do on Facebook?
– What are seniors’ preferences for finding health
information?
20. Quantitative studies
• Count or otherwise mathematically measure
structured data
• Test a hypothesis or theory
• Claim to be objective
• Positivism: empirical facts are key
• Attempt at reducing or eliminating bias
• Some common quantitative methods in LIS:
– Citation analysis/bibliometrics
– Search query log analysis
– Simple statistics (mean, SD, chi-square, t-test); see the sketch after this list
– Quantitative content analysis
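As a rough illustration of the “simple statistics” above, here is a minimal Python sketch with invented data: hypothetical weekly database-search counts for two groups of first-year students, with scipy assumed to be available.

```python
# Hypothetical illustration only: weekly library database searches by
# two invented groups of first-year students (not real data).
import statistics
from scipy import stats

trained = [12, 15, 9, 14, 11, 13, 16, 10]  # attended a library workshop
untrained = [7, 9, 6, 11, 8, 5, 10, 9]     # did not attend

# Descriptive statistics: mean and standard deviation
print("Mean (trained):", statistics.mean(trained))
print("SD (trained):", round(statistics.stdev(trained), 2))

# Independent-samples t-test: is the difference between the group means
# larger than chance variation alone would suggest?
t_stat, p_value = stats.ttest_ind(trained, untrained)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p suggests a real difference
```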
21. Quantitative research questions, hypotheses, and studies
• The process is linear
• Research questions/problems are quite straightforward
– Can users’ tweets determine their mood?
• What are the variables you are testing?
• May or may not have a hypothesis
– A hypothesis states what the researcher predicts/is testing
– “Consistently negative sentiments expressed in users’ tweets are associated with negative events in users’ lives.”
• Collect and analyse data
• Does the data support the hypothesis?
– Note: we generally cannot prove anything if it is behavioural
• Write it up
• See the example provided (handout) and the sketch below
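To make the hypothesis step concrete, here is a hedged sketch of one way such a hypothesis could be tested: a chi-square test of association on an entirely invented contingency table (counts of users cross-classified by tweet sentiment and self-reported negative life events).

```python
# Invented counts only: users cross-classified by tweet sentiment
# and self-reported negative life events.
from scipy.stats import chi2_contingency

#                  negative event | no negative event
observed = [[34, 11],   # consistently negative tweets
            [15, 40]]   # other tweets

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p means the data are consistent with the hypothesised association;
# as noted above, it does not "prove" the hypothesis.
```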
22. Qualitative studies
• Are subjective; data is interpreted through the
researcher
• Data is semi-structured or unstructured
• Allow theory to emerge inductively from the data
• Can generate hypotheses, but do not test them
• Performed in context of the participants/data
• Can examine perceptions; meanings behind behaviors
• Triangulation is common: combine multiple data collection methods to gain multiple perspectives
– Observation and focus groups
– Quantitative survey and interviews
23. Qualitative research questions
• Ask “why”
• Must be answerable and operationalisable
• Are neutral
• Are not “yes/no” questions
• Leave room for unexpected answers
• Inform your methodology decisions: what are
the most appropriate data collection and
analysis methods to answer these questions?
24. What’s wrong with these RQs for
qualitative inquiry?
• Do first year students have library anxiety?
• What factors cause library anxiety to occur
in first year students?
• Why is library anxiety so hard for first year
students to overcome?
• What should be done to stop library
anxiety?
25. Elements of a conceptual framework
(Ravitch & Riggan, 2017, p. 9)
26. The (iterative!) qualitative research
process
• Write your problem statement
• Do a literature review
• Consider your conceptual/theoretical framework
• Write your research questions
• Determine your methodology
– Data collection methods
– Data analysis methods
• Collect your data
• Analyse your data
• Write it up
27. How to conduct a qualitative, one-on-one
interview that will get you good answers!
• In-depth, but how many participants do you
need?
• Build rapport
• Be aware of body language
• Question order and question wording are
important considerations
• Standardised/closed interviews minimise the
researcher’s influences, but…
• Semi-structured/open-ended interviews allow
for less structured exploration
28. Interview example: Mental health and wellness in massively multiplayer online role-playing games
• Different modalities led to different
responses
• Different researchers interviewed
differently with same interview guide
• Rapport was essential to authentic
access… too authentic?
29. Focus groups
• Originated in marketing; fairly new to
scholarly research
• Successful moderation is essential
• Make sure participants don’t feed off one another
• Transcription can be challenging
• Can work well at exploratory phases
30. Focus group examples
• Young adult online mental health access
– Truly ethical and authentic, or university-
approved ethical?
– “Let’s just say I hugged the phone”
• Online focus groups with Wordpress
– Do focus groups need to be held in person?
– How does anonymity influence authenticity?
31. Open-ended survey questions
• More participants possible, but do you need
deeper questioning?
• “What do you think about…”
• “Is there anything else you would like to say?”
• Can be difficult for people to respond with no
guidance
• Can also provide useful insights
32. Open-ended survey examples
• My PhD dissertation: news photographers
aren’t idiots about describing and archiving
pictures
• Canadian university students’ use of online
mental health information sources
– “I did not realize so many options are available”
but also
– “smells like dirty feet”
33. Autoethnography
• Write thoughts and experiences in field notes/journals
and analyze them
• Can produce intense self-reflections… or not much at
all
• The “struggle to find an authentic voice – authentic
first to me, then to others who know me, and finally to
those who do not know me” (Patton, 2001)
• Case study: “‘Pass on what you have learned, Luke’:
Exploring experiences of research assistantship”
– Ethics and its influence on authenticity, context
34. Observation
• What do you hope to learn?
• What is/is not observable?
• What about the surroundings?
• Obtrusive or unobtrusive, and why?
– What does this method tell us and not tell us?
• http://www.youtube.com/watch?v=w5pn48wzBuw
35. Qualitative data collection methods (that
don’t involve artificial human interactions)
• Archival/document analysis
– What is the source (provenance)?
– Primary or secondary?
• Physical trace measures
– Accretion: things added
– Erosion: things removed
• Can be highly naturalistic, but it depends
36. http://digital.library.unt.edu/ark:/67531/metadc86519/m1/17/
Source? Accretion? Erosion?
‘Although the individual pages of this manuscript are not numbered, the copyist indicated the number of measures for each section of music at its end on the right side of the page margin. The ink used from p.[42] to the end of the manuscript is considerably lighter than the one used for the t.p. and pp. [1-41].
A short doxology that reads, “Laudamus te Deo, e beata Maria Virgine, Finis,” appears at the bottom of the last page. Also, at the top of that page there is a handwritten entry in pencil concerning the composer, which reads, “Bisso - maestro di cappella del Duomo di Genova, del 1728 al 1755.”’
37. “Naturalistic” Internet data: Bloggers reconstructing
“authority”
In my genuine deep despair last week,
[partner said something encouraging]. I was
speechless. It made me cry. It slightly lifted
the fog surrounding my soul. [Blogger 10]
There is literature touting the benefits and
negatives of [a specific] therapy, as well as a
plethora of personal accounts to be found
on-line as to both. [Blogger 2]
38. What could influence participants’
responses?
• Incentives such as prizes or extra credit
• Self-selection and self-reporting
– Hawthorne effect
• The researcher’s experiences, beliefs, and
biases: what do we want to hear/what are we
hearing?
39. The researcher’s (inevitable) intrusion
• Our biases are inevitable because humans
design the instruments and we are biased,
but this is ok
• Use “objectivity” (how much?) and
triangulation
• Make sure that the results come from the
data, not your predispositions (Shenton,
2004)
• Intrusive methods – e.g. focus groups? (Morgan, 1997)
– What, if any, methods are naturalistic?
40. Some final thoughts on qualitative data
collection/design
• We must put quality measures in place, but quality
qualitative data is also felt
• We cannot apply positivist or quantitative models
to “prove” our work, but they can inform each other
• During collection, get “in tune” with your dataset
and your participants, whatever that means…
– Access through rapport
– Removed but connected
• Saturation will show itself in time
• Be yourself; your research is a reflection of you
41. Many approaches to qualitative
data analysis
• Many sources: transcripts, field notes,
texts, open-ended surveys…
• And many ways of looking at the sources:
discourse analysis, qualitative content
analysis, textual analysis,
phenomenography…
• But they all share some commonalities…
42. “What do I do with all this data?”
• Once you have it, it can feel overwhelming!
• The goal of analysis is to narrow it all down into
categories that represent the common themes
• It is never a straight, quick path if it’s done correctly
• Immerse yourself completely in it.
• Read, write, stay open!
• Procedure (also see Bloomberg & Volpe, 2012):
– Organise the data
– Make the codes
– Code the data
43. Organising the data
• Throughout collection: transcribe, take notes
(during and after), reflect and write write write!
• First organise data and notes by unit (interview
participant; focus group; text)
• How will you physically do the analysis? (NVivo?
Note cards? Post-its? Highlighters? Excel?)
• What analytic tradition will you use? (Grounded
theory/constant comparison? Discourse analysis?)
44. Generating categories
• Reviewing the data in detail and reflecting at every stage will give you a big picture of the main themes
• Inductive codes/categories, as in grounded theory,
emerge from the data
– Name the categories
– They will probably start narrower and broaden
– Expect iterations
• Other coding approaches can be drawn from
existing models or theories: see if the data fits the
model
45. Coding the data
• The act of putting your data into those codes/categories
• You can do it, or you can have one or two others do it
– Intercoder reliability (see the kappa sketch after this list)
• Coding options
– NVivo/other software
– Cutting it up and placing the quotes into piles
– Highlighting
– Sticky notes
– Or whatever works best for you
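As a hedged illustration of intercoder reliability, here is a minimal Python sketch computing Cohen’s kappa for two coders; the excerpts, category labels, and codings are all invented, and scikit-learn is assumed to be available.

```python
# Two hypothetical coders assign one of three invented categories
# ("anxiety", "confidence", "neutral") to the same ten interview excerpts.
from sklearn.metrics import cohen_kappa_score

coder_a = ["anxiety", "anxiety", "neutral", "confidence", "anxiety",
           "neutral", "confidence", "anxiety", "neutral", "confidence"]
coder_b = ["anxiety", "neutral", "neutral", "confidence", "anxiety",
           "neutral", "confidence", "confidence", "neutral", "confidence"]

# Kappa measures agreement beyond what chance alone would produce
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")
# A common rule of thumb treats kappa above roughly 0.6 as acceptable agreement.
```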
46. Reporting the results
• Explain your methodology and methods in
detail
• List each code as a heading
• Explain the rationale (stemming from your
data!) for the code, if inductive
• Provide supporting quotations that are in
context of your explanations
47. "I could be on Facebook by now": Insights from
Canadian youth on online mental health information
resources
Constructivist grounded theory: “discovered reality”
• Social media: popular, problematic
• Little agreement on online credibility
– “You can’t consult with the website to know if it is legit or not.”
• Poor mental health literacy
– “Others don’t want to be around” people with mental illness
• Mistrust and stigma
– "It can be difficult to share with people because you are afraid of being rejected when issues
are shared."
• Keep it simple
– “If I’m trying to find information for my friend, I don’t want to play
a game."
48. Mixed methods
• Mixed methods research combines qualitative and quantitative
methods
• Triangulation: using multiple data gathering methods can help
validate results and add insights
• Difficult to do in practice: time, culture, ethics…
• Quantitative and qualitative research are complementary… but
– Our “polarized views of reality” (Ma, 2012, p. 1859) in the epistemology
of each approach are difficult to reconcile
• Qualitative before quantitative
• Qualitative after quantitative
• Qualitative and quantitative together
• News photography image retrieval practices: Locus of control in two
contexts (my mixed methods PhD dissertation, 2006)
– The Pulitzer Prize-winning photographer, and sand
– Interviews triangulated the quantitative and qualitative survey data
49. Exercise 2: Back to the students
• Refine the research questions
• How would you design a quantitative study to research them?
• How would you design a qualitative study to research them?
• How would you design a mixed methods study to research them?
• Consider both data collection and analysis
50. EBLIP10 – 17-19 June, 2019
University of Strathclyde iSchool!
51. References and further reading
• Booth, A. (2004). Barriers and facilitators to evidence-based library and information practice: An international perspective. Perspectives in International Librarianship, 1.
• Booth, A. (2006). Clear and present questions: Formulating questions for evidence based practice. Library Hi Tech, 24(3), 355-368.
• Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399.
• Kerr, A., & Rasmussen Pennington, D. (2018). Public library mobile apps in Scotland: Views from the local authorities and the public. Library Hi Tech, 36(2), 237-251.
• Koufogiannakis, D., & Brettle, A. (Eds.). (2016). Being evidence based in library and information practice. London: Facet.
• Nicholson, S. (2006). Approaching librarianship from the data: Using bibliomining for evidence-based librarianship. Library Hi Tech, 24(3), 369-375.
• Wilson, V. (2013). Formalized curiosity: Reflecting on the librarian practitioner-researcher. Evidence Based Library and Information Practice, 8(1), 111-117.
• Wilson, V. (2016). Practitioner-researchers and EBLIP. In D. Koufogiannakis & A. Brettle (Eds.), Being evidence based in library and information practice (pp. 81-91). London: Facet.
• http://www.lirgjournal.org.uk – LIR, LIRG’s journal
• https://vle.cilip.org.uk – for accessing Intro to Research Skills
• https://pksb.cilip.org.uk – for assessing your own skills
Editor's Notes
Introduce myself and LIRG
CILIP is committed to evidence based practice – evidenced in a range of ways. Most recently – outcome of the research portal meeting