"In Search of Patterns at the Desk: An Analysis of Reference Question Tracking Logs" is being presented at the 4th QQML 2012 International Conference in Limerick, Ireland.
1. In Search of Patterns at the Desk:
An Analysis of Reference Question
Tracking Logs
2. Loyola Marymount University
Private Catholic university in Los Angeles, CA
5,600+ undergraduates and 1,900+ graduate students
William H. Hannon Library opened in 2009
3. Reference Service at LMU
24/5 Information Desk staffed by students, library staff, and outsourced staff
Desk encounters recorded using the Gimlet question-tracking system
14,210 volumes in the print Reference Collection
Over 200 electronic databases
4. Purpose of Study
Content of questions (subject, difficulty level)
Content of answers (characteristics of
sources used, accuracy)
Patterns (by patron type, service provider,
subject, or time)
Develop reference question tagging scheme
5. Methodology
Content analysis of LMU reference questions, Fall 2010/Spring 2011 academic year
Excel file data dump; deleted all non-reference questions and questions not asked at the Info Desk
6. Methodology
Took free-text Q&A fields and recoded them into “Reference Tag,” “School/College,” “Subject,” “Exact Source,” and “Quality”
New fields finalized after several rounds of 50-question sample calibrations and “norming sessions” by 3 coders
7. Old Reference Tags (Beginning)
1 Citation Style
2 External Web Page
3 Known Item
4 Reference Book
5 Referral
6 Reserves
7 Retrieval
8 Search Construction
9 Topic Source
8. Final Revised Reference Tags
1 Catalog Use & Lookup
2 Database Help & Use
3 External Web Page
4 Internal Web Page
5 Reference Book (print)
6 Referral
7 Reserves
8 Retrieval
9 Other
11. Methodology
• Sampled from 3,422 total questions
• Random 20% sample from all questions at difficulty levels 1-3 on the READ (Reference Effort Assessment Data) Scale
• All questions included from levels 4-6
• Total sample size = 931 questions
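The sampling scheme above can be sketched in Python. This is purely illustrative: the record structure and the random difficulty levels are hypothetical stand-ins, not the study's actual data.

```python
import random

random.seed(42)  # reproducible draw

# Hypothetical records; the real dataset held 3,422 desk questions,
# each rated 1-6 on the READ (Reference Effort Assessment Data) Scale.
questions = [{"id": i, "read_level": random.randint(1, 6)} for i in range(3422)]

easy = [q for q in questions if q["read_level"] <= 3]  # READ levels 1-3
hard = [q for q in questions if q["read_level"] >= 4]  # READ levels 4-6

# Random 20% sample of the easy questions, plus every hard question.
sample = random.sample(easy, k=round(0.20 * len(easy))) + hard
```

Over-sampling the rare, hard questions this way keeps them analyzable while the common, easy ones are represented by a thinner slice.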
12. Methodology
Analyzed sample in SPSS to look at frequencies and relationships
Examined standardized residuals for significance
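The standardized-residual check can be sketched as follows (the study used SPSS; the contingency table here is a hypothetical example, not the study's counts):

```python
import math

# Hypothetical 2x2 contingency table:
# rows = service provider (student, librarian), columns = question tag.
observed = [[30, 10],
            [20, 40]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

def expected(i, j):
    """Expected count for cell (i, j) under independence."""
    return row_totals[i] * col_totals[j] / grand_total

# Standardized residual: (observed - expected) / sqrt(expected).
# Cells beyond roughly +/-2 flag where a significant chi-square comes from.
residuals = [[(observed[i][j] - expected(i, j)) / math.sqrt(expected(i, j))
              for j in range(len(col_totals))]
             for i in range(len(row_totals))]
```

Residuals localize a relationship: a significant chi-square says the table as a whole departs from independence, while the individual residuals show which cells drive it.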
25. Accuracy: Student Worker Versus Librarian
Database Use & Lookup: students recommend more general sources (ProQuest, Academic Search Complete, OneSearch) versus subject-specific ones
37. Colleges with Longer Questions (16+ min.)
College                     Number of Questions
Business                    48
Communication & Fine Arts   21
Education                   6
Law                         5
Liberal Arts                87
Science                     10
38. Colleges with Higher Difficulty Levels (Above “3”)
College                     Times
Business                    63
Communication & Fine Arts   27
Education                   14
Film & Television           6
Liberal Arts                148
Science                     14
39. College of Liberal Arts: Subjects with Higher Difficulty (Above “3”)
Subject       Times
English       29
History       17
Philosophy    5
Psychology    15
Sociology     5
Theology      23
40. Patterns: Fall Versus Spring
CFA semester total questions: Fall 32, Spring 17
Subject       Fall   Spring
English       29     11
Psychology    18     7
41. More Business Questions On Monday
Day of Week   Number of Questions
Sunday        7
Monday        23
Tuesday       18
Wednesday     15
Thursday      8
Friday        10
Saturday      6
42. More Theology Questions On Tuesday
Day of Week   Number of Questions
Sunday        5
Monday        11
Tuesday       23
Wednesday     13
Thursday      10
Friday        10
43. Limitations of Study
Interdisciplinary questions could not be easily categorized by subject
Despite “norming” sessions, coders coded independently, so no interrater reliability was measured
Small sample size (20%) for the first three difficulty levels
Dependent on desk staff to accurately record all stats
44. Key Findings: Collections
Print reference collection used in only 5.9% of all questions
Small group of sources used to answer the majority of reference questions: 29 unique reference titles, just 0.2% of all possible titles
46. Key Findings: Collections
24% of all reference questions required an internal web page (LibGuide, etc.) as a source
50% of all reference questions required the library catalog as a source
41% of all reference questions required a database as a source
47. Key Findings: Staffing
More difficult/longer reference questions Oct.-Nov. and Mar.-Apr.; less difficult and shorter in Sept.
Mon.-Wed. between 2-6pm, the desk should be double-staffed with librarian expertise available; Sat. is lighter
49. Key Findings: Databases
Good candidates for database workshops, based on frequency and difficulty:
Business Source Complete
LexisNexis
JSTOR
ProQuest (vendor)
EBSCO (vendor; show Academic Search Complete and PsycINFO)
50. Key Findings: Subjects
Subject areas we serve the most at the Desk (based on difficulty/volume):
Business
English
Psychology
Theology
History
Education
51. Key Findings: Methodology
For the reference tagging scheme, a source-based approach worked better than a strategy-based one
52. Thank You to the Other Coders
Alexander Justice
Reference Librarian/Ref. Collection
Development Coordinator
Loyola Marymount University, Los Angeles
Email: ajustice@lmu.edu
Andrew Toot
Overnight Information Desk Supervisor
LAC/Loyola Marymount University, Los Angeles
Email: andrewtoot@gmail.com