1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
Collaborative Information Retrieval: Concepts, Models and
Evaluation
Lynda Tamine
Paul Sabatier University
IRIT, Toulouse - France
Laure Soulier
Pierre and Marie Curie University
LIP6, Paris - France
April 10, 2016
1 / 111
1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
OVERVIEW OF THE RESEARCH AREA
© [Shah, 2012]
• Publications
Papers in several conferences (SIGIR, CIKM, ECIR, CHI, CSCW,...) and journals (IP&M,
JASIST, JIR, IEEE, ...)
Books on "Collaborative Information Seeking" [Morris and Teevan, 2009, Shah, 2012, Hansen et al., 2015]
Special issues on "Collaborative Information Seeking" (IP&M, 2010; IEEE, 2014)
• Workshops and Tutorials
Collaborative Information Behavior: GROUP 2009
Collaborative Information Seeking: GROUP 2010, CSCW 2010, ASIST 2011 and CSCW 2013
Collaborative Information Retrieval: JCDL 2008 and CIKM 2011
Evaluation in Collaborative Information Retrieval: CIKM 2015
COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]
• On which occasions do you collaborate?
Collaboration purposes
Task                   Frequency
Travel planning        27.5%
Online shopping        25.7%
Bibliographic search   20.2%
Technical search       16.5%
Fact-finding           16.5%
Social event planning  12.8%
Health search           6.4%
Application domains
Domain           Example
Medical          Physician/Patient - Physician/Nurse
Digital library  Librarians/Customers
E-Discovery      Fee-earners/Customers - Contact reviewer/Lead counsel
Academic         Groups of students
COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]
• How do you collaborate?
How often? Group size? Collaborative settings?
[Figure: survey results, 22%, 11.9%, 66.1%]
OUTLINE
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
5. Discussion
PLAN
1. Collaboration and Information Retrieval
Users and Information Retrieval
The notion of collaboration
Collaboration paradigms
Collaborative search approaches
Collaborative search interfaces
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
5. Discussion
AD-HOC INFORMATION RETRIEVAL
LET'S START WITH WHAT YOU ALREADY KNOW...
• Ranking documents with respect to a query
• How?
Term weighting/Document scoring [Robertson and Walker, 1994, Salton, 1971]
Query Expansion/Reformulation [Rocchio, 1971]
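As a refresher, term-weighting schemes such as BM25 score a document by summing per-term contributions of the query terms. A minimal sketch over a toy corpus (the documents and parameter values here are illustrative, not from the tutorial):

```python
import math

# Toy corpus: each document is a bag of words (illustrative data).
docs = [
    "collaborative search with shared queries".split(),
    "query expansion for web search".split(),
    "collaborative filtering of user ratings".split(),
]

k1, b = 1.2, 0.75                              # common BM25 defaults
N = len(docs)
avgdl = sum(len(d) for d in docs) / N          # average document length
df = {}                                        # document frequency per term
for d in docs:
    for t in set(d):
        df[t] = df.get(t, 0) + 1

def bm25(query, doc):
    """Sum of per-term BM25 contributions for one document."""
    score = 0.0
    for t in query:
        if t not in df:
            continue
        idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
        tf = doc.count(t)
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

# Rank the corpus for a two-term query.
ranked = sorted(docs, key=lambda d: bm25(["collaborative", "search"], d), reverse=True)
```

The document matching both query terms naturally ranks first; length normalization (the `b` term) damps the advantage of long documents.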
USERS AND INFORMATION RETRIEVAL
LET'S START WITH WHAT YOU ALREADY KNOW...
• Personalized IR [Kraft et al., 2005, Gauch et al., 2003, Liu et al., 2004]
Tailoring search results to the user's context, preferences and interests
How? Modeling the user's profile; integrating the user's context and preferences into document scoring
• Collaborative filtering [Resnick et al., 1994]
Recommending search results using the ratings/preferences of other users
How? Inferring a user's own preferences from other users' preferences; personalizing search results
• Social Information Retrieval [Amer-Yahia et al., 2007, Pal and Counts, 2011]
Exploiting social media platforms to retrieve documents/users...
How? Social network analysis (graph structure, information diffusion, ...); integrating social-based features into document relevance scoring
Let's take a more in-depth look at...
Collaborative Information Retrieval
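The collaborative-filtering idea above (inferring a user's preferences from other users' preferences) can be sketched as user-based neighborhood prediction; the rating matrix below is invented for illustration:

```python
import math

# Hypothetical user -> {item: rating} matrix (invented data).
ratings = {
    "ana":  {"d1": 5, "d2": 3, "d3": 4},
    "bob":  {"d1": 4, "d2": 2, "d3": 5, "d4": 4},
    "carl": {"d1": 1, "d3": 2, "d4": 5},
}

def cosine(u, v):
    """Cosine similarity of two users over their co-rated items."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in common)
    du = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    dv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return num / (du * dv)

def predict(user, item):
    """Similarity-weighted average of the neighbors' ratings for item."""
    neigh = [(cosine(user, v), r[item])
             for v, r in ratings.items() if v != user and item in r]
    sim_sum = sum(s for s, _ in neigh)
    return sum(s * r for s, r in neigh) / sim_sum if sim_sum else 0.0

# "ana" never rated d4; her predicted rating leans toward her closest neighbor.
pred = predict("ana", "d4")
```

Production systems add mean-centering, neighborhood truncation and matrix factorization, but the core mechanism is this weighted borrowing of other users' judgments.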
THE NOTION OF COLLABORATION
DEFINITION
Definition
"A process through which parties who see different aspects of a problem can constructively explore
their differences and search for solutions that go beyond their own limited vision of what is possible."
[Gray, 1989]
Definition
"Collaboration is a process in which autonomous actors interact through formal and informal
negotiation, jointly creating rules and structures governing their relationships and ways to act or
decide on the issues that brought them together; it is a process involving shared norms and mutually
beneficial interactions." [Thomson and Perry, 2006]
THE NOTION OF COLLABORATION
THE 5WS OF COLLABORATION [MORRIS AND TEEVAN, 2009, SHAH, 2010]
What?
Tasks: complex, exploratory or fact-finding tasks, ...
Application domains: bibliographic, medical, e-Discovery, academic search
Why?
Shared interests
Insufficient knowledge
Mutually beneficial goals
Division of labor
Who?
Groups vs. Communities
When?
Synchronous vs. Asynchronous
Where?
Co-located vs. Remote
How?
Crowdsourcing
Implicit vs. Explicit intent
User mediation
System mediation
THE NOTION OF COLLABORATION
COLLABORATIVE INFORMATION RETRIEVAL (CIR) [FOSTER, 2006, GOLOVCHINSKY ET AL., 2009]
THE NOTION OF COLLABORATION
COMPARING CIR WITH OTHER IR APPROACHES
Exercise
How do you think CIR differs from Personalized IR, Collaborative Filtering, or Social IR?
• User (unique/group)
• Personalization (yes/no)
• Collaboration (implicit/explicit)
• Concurrency (synchronous/asynchronous)
• Collaboration benefit (symmetric/asymmetric)
• Communication (yes/no)
• ...
THE NOTION OF COLLABORATION
COMPARING CIR WITH OTHER IR APPROACHES
Exercise
How do you think that CIR differs from Personalized IR, Collaborative Filtering, or Social IR?
Dimensions compared across Perso. IR, Collab. Filtering, Social IR and Collab. IR
(the per-approach values were shown as a table figure on the original slide):
• User: unique vs. group
• Personalization: no vs. yes
• Collaboration: implicit vs. explicit
• Concurrency: synchronous vs. asynchronous
• Benefit: symmetric vs. asymmetric
• Communication: no vs. yes
• Information usage: information exchange, information retrieval, information synthesis, sensemaking
COLLABORATION PARADIGMS [FOLEY AND SMEATON, 2010, KELLY AND PAYNE, 2013, SHAH AND MARCHIONINI, 2010]
Division of labor
• Role-based division of labor
• Document-based division of labor
Sharing of knowledge
• Communication and a shared workspace
• Ranking based on relevance judgements
Awareness
• Collaborators' actions
• Collaborators' context
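Document-based division of labor, for instance, can be implemented by partitioning a shared ranked list so that no two collaborators review the same document; a hypothetical round-robin sketch:

```python
def split_results(ranked_docs, collaborators):
    """Round-robin partition of a ranked result list so that each
    document is reviewed by exactly one collaborator."""
    assignment = {c: [] for c in collaborators}
    for rank, doc in enumerate(ranked_docs):
        # Alternate assignees down the ranking so everyone sees
        # a mix of high- and low-ranked documents.
        assignment[collaborators[rank % len(collaborators)]].append(doc)
    return assignment

# Hypothetical ranked list split between two collaborators.
parts = split_results(["d1", "d2", "d3", "d4", "d5"], ["alice", "bob"])
```

Role-based variants would instead route documents by collaborator expertise or assigned role rather than by rank position.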
COLLABORATIVE INFORMATION RETRIEVAL
COLLABORATIVE SEARCH SESSION
STRUCTURE OF THE COLLABORATIVE SEARCH SESSIONS
• The 3 phases of the social search model [Evans and Chi, 2010]
STRUCTURE OF THE COLLABORATIVE SEARCH SESSIONS
• The 3 phases of the collaborators' behavioral model [Karunakaran et al., 2013]
COLLABORATIVE SEARCH APPROACHES [JOHO ET AL., 2009]
• "Development of new IR models that can take collaboration into account in retrieval."
• "Leverage IR techniques such as relevance feedback, clustering, profiling, and data fusion to support collaborative search while using conventional IR models."
• "Develop search interfaces that allow people to perform search tasks in collaboration."
COLLABORATIVE SEARCH INTERFACES
What could be collaborative in search interfaces [Shah, 2012, Thomson and Perry, 2006]:
• Communication tools for defining search strategies and users' roles, as well as sharing relevant information [Golovchinsky et al., 2011, Kelly and Payne, 2013]
• Awareness tools for reporting collaborators' actions [Diriye and Golovchinsky, 2012, Rodriguez Perez et al., 2011]
• Individual and shared workspaces to ensure mutually beneficial goals
• Algorithmic mediation to monitor collaborators' actions
• User-driven collaborative interfaces
Collaborators fully active
Collaboration support through devices (interactive tabletops) or tools (web interfaces)
• System-mediated collaborative interfaces
Collaborators partially active
Collaboration support through algorithmic mediation (e.g., document distribution according to roles or not)
COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES
• Coagmento [Shah and González-Ibáñez, 2011a]
COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES
• CoFox [Rodriguez Perez et al., 2011]
Other interfaces: [Erickson, 2010], [Vivian and Dinet, 2008]...
COLLABORATIVE SEARCH INTERFACES
USER-DRIVEN COLLABORATIVE INTERFACES
• TeamSearch [Morris et al., 2006]
Other interfaces: Fischlar-DiamondTouch [Smeaton et al., 2006], WeSearch [Morris et al., 2010]...
COLLABORATIVE SEARCH INTERFACES
SYSTEM-MEDIATED COLLABORATIVE INTERFACES
• Cerchiamo [Golovchinsky et al., 2008]
COLLABORATIVE SEARCH INTERFACES
SYSTEM-MEDIATED COLLABORATIVE INTERFACES
• Querium [Diriye and Golovchinsky, 2012]
PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
Challenges and issues
Understanding Collaborative IR
Overview
System-mediated CIR models
User-driven and system-mediated CIR models
Roadmap
3. Evaluation
4. Challenges ahead
5. Discussion
CHALLENGES
• Conceptual models of IR:
Static IR: system-based IR, does not learn from users
e.g. VSM [Salton, 1971], BM25 [Robertson et al., 1995], LM [Ponte and Croft, 1998], PageRank and HITS [Brin and Page, 1998]
Interactive IR: exploiting feedback from users
e.g. Rocchio [Rocchio, 1971], relevance-based LM [Lavrenko and Croft, 2001]
Dynamic IR: learning dynamically from past user-system interactions and predicting future ones
e.g. iPRP [Fuhr, 2008], interactive exploratory search [Jin et al., 2013]
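As an illustration of the interactive family, Rocchio reformulates the query by moving its vector toward judged-relevant documents and away from non-relevant ones; a sketch with invented example weights:

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio reformulation: q' = alpha*q + beta*mean(rel) - gamma*mean(nonrel).
    Queries and documents are sparse term -> weight dicts."""
    terms = set(query)
    for d in relevant + nonrelevant:
        terms |= set(d)
    new_q = {}
    for t in terms:
        w = alpha * query.get(t, 0.0)
        if relevant:
            w += beta * sum(d.get(t, 0.0) for d in relevant) / len(relevant)
        if nonrelevant:
            w -= gamma * sum(d.get(t, 0.0) for d in nonrelevant) / len(nonrelevant)
        new_q[t] = max(w, 0.0)   # negative weights are commonly clipped to zero
    return new_q

q = {"jaguar": 1.0}
rel = [{"jaguar": 1.0, "cat": 1.0}]      # user judged an animal page relevant
nonrel = [{"jaguar": 1.0, "car": 1.0}]   # and a car page non-relevant
q2 = rocchio(q, rel, nonrel)
```

After feedback the reformulated query gains weight on "cat" and drops "car", disambiguating the original one-word query.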
CHALLENGES
1 Learning from user and user-user past interactions
2 Adaptation to multi-faceted and multi-user contexts: skills, expertise, role, etc.
3 Aggregating relevant information nuggets
4 Supporting synchronous vs. asynchronous coordination
5 Modeling collaboration paradigms: division of labor, sharing of knowledge
6 Optimizing the search cost: balance in work (search) and group benefit (task outcome)
EMPIRICAL UNDERSTANDING OF CIR
Objectives
1 Investigating user behavior and search patterns
Search processes [Shah and González-Ibáñez, 2010, Yue et al., 2014]
Search tactics and practices [Hansen and Järvelin, 2005, Morris, 2008, Morris, 2013, Amershi and Morris, 2008, Tao and Tombros, 2013, Capra, 2013]
Role assignment [Imazu et al., 2011, Tamine and Soulier, 2015]
2 Studying the impact of collaborative search settings on performance
Impact of collaboration on search performance [Shah and González-Ibáñez, 2011b, González-Ibáñez et al., 2013]
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• Study objective: Testing the applicability of Kuhlthau's model of the information seeking process in a collaborative information seeking situation [Shah and González-Ibáñez, 2010]
Stage         Feeling (Affective)                     Thoughts (Cognitive)  Actions
Initiation    Uncertainty                             General/Vague
Selection     Optimism
Exploration   Confusion, Frustration, Doubt                                 Seeking relevant information
Formulation   Clarity                                 Narrowed, Clearer
Collection    Sense of direction, Confidence          Increased interest    Seeking relevant or focused information
Presentation  Relief, Satisfaction or disappointment  Clearer or focused
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• Study objective: Testing the applicability of Kuhlthau's model in collaborative information seeking situations [Shah and González-Ibáñez, 2010]
Participants: 42 dyads, students or university employees who had already done collaborative work together
System: Coagmento (http://www.coagmento.org/)
Sessions: two sessions (S1, S2) running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) time-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
Tasks: simulated work tasks.
e.g. Task 1: Economic recession
"A leading newspaper has hired your team to create a comprehensive report on the causes and consequences of the current economic recession in the US. As a part of your contract, you are required to collect all the relevant information from any available online sources that you can find. ... Your report on this topic should address the following issues: reasons behind this recession, effects on some major areas, such as health-care, home ownership, and financial sector (stock market), unemployment statistics over a period of time, proposal execution, and effects of the economy simulation plan, and people's opinions and reactions on economy's downfall"
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• (Main) Study results:
The stages of Kuhlthau's model map to collaborative search tasks
• Initiation: number of chat messages at the stage and between stages
• Selection: number of chat messages discussing the strategy
• Exploration: number of search queries
• Formulation: number of visited webpages
• Collection: number of collected webpages
• Presentation: number of moving actions for organizing collected snippets
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES
• Study objective: Analyzing query (re)formulations and related term sources based on participants' actions [Yue et al., 2014]
Participants: 20 dyads, students who already knew each other
System: Collabsearch
Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) time-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
Tasks: (T1) academic literature search, (T2) travel planning
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES
• (Main) Study results:
Individual action-based query reformulation (V, S, Q): no (significant) new findings
Collaborative action-based query reformulation (SP, QP, C):
The influence of communication (C) is task-dependent.
The influence of collaborators' queries (QP) is significantly higher than that of one's own previous queries (Q).
Collaborators' workspace (SP) has less influence than one's own workspace (S).
• V: percentage of queries for which participants viewed results and at least one term originated from at least one page
• S: percentage of queries for which participants saved results and at least one term originated from at least one page
• Q: percentage of queries with at least one term overlapping with previous queries
• SP: percentage of queries for which at least one term originated from collaborators' workspace
• QP: percentage of queries for which at least one term originated from collaborators' previous queries
• C: percentage of queries for which at least one term originated from collaborators' communication
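Metrics such as QP above reduce to term-overlap bookkeeping over timestamped query logs; a hypothetical sketch of the "share of queries borrowing a term from the partner's earlier queries" computation (the log data is invented):

```python
def pct_borrowing(own_queries, partner_queries):
    """Share (in %) of one's queries containing at least one term that
    appeared earlier in the partner's queries. Each query is a
    (timestamp, set_of_terms) pair."""
    hits = total = 0
    for ts, terms in own_queries:
        earlier = set()
        for pts, pterms in partner_queries:
            if pts < ts:
                earlier |= pterms
        if not earlier:
            continue                 # no partner query to borrow from yet
        total += 1
        if terms & earlier:
            hits += 1
    return 100.0 * hits / total if total else 0.0

# Invented two-person query log for a travel-planning session.
mine = [(2, {"hotel", "paris"}), (5, {"museum", "tickets"}), (8, {"paris", "metro"})]
partner = [(1, {"paris", "flights"}), (4, {"louvre", "hours"})]
share = pct_borrowing(mine, partner)
```

The other measures (V, S, SP, C) follow the same pattern with a different source set (viewed pages, saved pages, workspace, chat messages) in place of the partner's queries.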
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• Study objective: Understanding differences in users' behavior in role-oriented and non-role-oriented collaborative search sessions
Participants: 75 dyads, students who already knew each other
Settings: 25 dyads without roles, 50 dyads with roles (25 PM roles, 25 GS roles)
System: open-source Coagmento plugin
Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) time-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
Tasks: three (3) exploratory search tasks, topics from the Interactive TREC track (http://trec.nist.gov/data/t8i/t8i.html)
Tamine, L. and Soulier, L. (2015). Understanding the impact of the role factor in collaborative information retrieval. In Proceedings of the ACM International on Conference on Information and Knowledge Management, CIKM '15, pages 43-52.
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• (Main) Study results
Users with assigned roles behave significantly differently from users without roles
Mean (s.d.)
            npq         dt           nf            qn            ql          qo          nbm
W/Role GS   1.71(1.06)  9.99(3.37)   58.52(27.13)  65.91(31.54)  4.64(1.11)  0.44(0.18)  20(14.50)
  IGDiffp   -0.52       -3.47***     1.30***       2.09***       1.16***     0.14***     2.23***
W/Role PM   1.88(1.53)  10.47(3.11)  56.31(27.95)  56.31(27.95)  2.79(0.70)  0.39(0.08)  15(12.88)
  IGDiffp   0.24***     1.45***      -2.42***      -1.69***      0.06***     -0.23***    0.05***
W/oRole     2.09(1.01)  13.16(3.92)  24.13(12.81)  43.58(16.28)  3.67(0.67)  0.45(0.10)  19(11.34)
  p-value/GS: *** *** *** *** *** ***
  p-value/PM: *** *** *** *** *** *** *
W/Role vs. W/oRole (ANOVA p-val.): ** *** ** *
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• (Main) Study results
Early and high levels of coordination among participants without roles
Role drift for participants with the PM role
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE
• Study objective: Evaluating the synergistic effect of collaboration in information seeking [Shah and González-Ibáñez, 2011b]
Participants: 70 participants, 10 as single users, 30 as dyads
Settings: C1 (single users), C2 (artificially formed teams), C3 (co-located teams, different computers), C4 (co-located teams, same computer), C5 (remotely located teams)
System: Coagmento
Session: one session running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) time-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
Tasks: one exploratory search task, topic "gulf oil spill"
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE
• (Main) Study results
Value of remote collaboration when the task has clear independent components
Remotely located teams able to leverage real interactions leading to synergic collaboration
Cognitive load in a collaborative setting not significantly higher than in an individual one
40 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
EMPIRICAL UNDERSTANDING OF CIR
Lessons learned
• Small-group (critical mass) collaborative search is a common practice despite the lack of
specific tools
• The whole is greater than the sum of its parts
• Collaborative search behavior differs from individual search behavior while some
phases of theoretical models of individual search are still valid for collaborative search
• Algorithmic mediation lowers the coordination cost
• Roles structure the collaboration but do not guarantee performance improvement in
comparison to no roles
Design implications: revisit IR models and techniques
• Back to the axiomatic relevance hypothesis (Fang et al. 2011)
• Role as a novel variable in the IR models?
• Learning to rank from user-system, user-user interactions within multi-session search
tasks?
41 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
OVERVIEW OF IR MODELS AND TECHNIQUES
DESIGNING COLLABORATIVE IR MODELS: A YOUNG RESEARCH AREA
42 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
OVERVIEW OF IR MODELS AND TECHNIQUES
Collaborative IR models are based on algorithmic mediation:
Systems re-use users’ search activity data to mediate the search
• Data?
Click-through data, queries, viewed results, result rankings, ...
User-user communication
• Mediation?
Routing/suggesting/enhancing the queries
Building personalized document rankings
Automatically setting up the division of labor
43 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
OVERVIEW OF IR MODELS AND TECHNIQUES
Notations
Notation Description
d Document
q Query
u_j User j
g Collaborative group
t_i Term i
RSV(d, q) Relevance Status Value given (d, q)
N Document collection size
n_i Number of documents in the collection in which term t_i occurs
R Number of relevant documents in the collection
R_{u_j} Number of relevant documents in the collection for user u_j
r_i^{u_j} Number of relevant documents of user u_j in which term t_i occurs
44 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER GROUP-BASED MEDIATION
• Enhancing collaborative search with users’ context
[Morris et al., 2008, Foley and Smeaton, 2009a, Han et al., 2016]
Division of labor: dividing the work by non-overlapping browsing
Sharing of knowledge: exploiting personal relevance judgments, user’s authority
45 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: GROUPIZATION, SMART SPLITTING, GROUP-HIGHLIGHTING [MORRIS ET AL., 2008]
• Hypothesis setting: one or a few synchronous search query(ies)
• 3 approaches
Smart splitting: splitting top ranked web results using a round-robin technique,
personalized-splitting of remaining results (document ranking level)
Groupization: reusing individual personalization techniques towards groups (document ranking
level)
Hit Highlighting: highlighting user’s keywords (document browsing level)
46 / 111
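As a minimal illustration of the splitting idea above, here is a sketch (hypothetical helper name, not the authors' implementation) of a round-robin division of top-ranked results among collaborators:

```python
def round_robin_split(results, n_users):
    """Distribute a ranked result list across n_users collaborators,
    assigning the result at rank i to user i mod n_users."""
    buckets = [[] for _ in range(n_users)]
    for rank, doc in enumerate(results):
        buckets[rank % n_users].append(doc)
    return buckets

# Top-6 results split between 2 collaborators:
splits = round_robin_split(["d1", "d2", "d3", "d4", "d5", "d6"], 2)
print(splits)  # [['d1', 'd3', 'd5'], ['d2', 'd4', 'd6']]
```

In smart splitting, only the top-ranked results are divided this way; the remaining results are split in a personalized fashion.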
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008]
Personalizing the document ranking: use the revisited BM25 weighting scheme
[Teevan et al., 2005]
RSV(d, q, u_j) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u_j)   (1)

w_{BM25}(t_i, u_j) = \log \frac{(r_i^{u_j} + 0.5)(N' - n_i' - R_{u_j} + r_i^{u_j} + 0.5)}{(n_i' - r_i^{u_j} + 0.5)(R_{u_j} - r_i^{u_j} + 0.5)}   (2)

where the corpus statistics are augmented with the user's relevance judgments:

N' = N + R_{u_j}   (3)

n_i' = n_i + r_i^{u_j}   (4)
47 / 111
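Equations (1)-(4) can be read as a small scoring routine; the sketch below (illustrative names, not the authors' code) computes the personalized weight with the user-augmented corpus statistics:

```python
import math

def w_bm25_personalized(n_i, N, r_i, R_u):
    """Eqs. (2)-(4): BM25 term weight for user u_j, with corpus statistics
    augmented by the user's R_u judged-relevant documents (r_i of which
    contain the term)."""
    N_aug = N + R_u      # eq. (3)
    n_aug = n_i + r_i    # eq. (4)
    num = (r_i + 0.5) * (N_aug - n_aug - R_u + r_i + 0.5)
    den = (n_aug - r_i + 0.5) * (R_u - r_i + 0.5)
    return math.log(num / den)

def rsv(doc_terms, query_terms, df, N, judged, R_u):
    """Eq. (1): sum the personalized weights over terms shared by d and q.
    df maps a term to its document frequency n_i; judged maps a term to r_i."""
    return sum(w_bm25_personalized(df[t], N, judged.get(t, 0), R_u)
               for t in set(doc_terms) & set(query_terms))

w = w_bm25_personalized(n_i=100, N=1000, r_i=3, R_u=5)
```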
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008]
Example
Smart-splitting according to personalized scores.
48 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]
• Hypothesis setting: multiple independent synchronous search queries
• Collaborative relevance feedback: sharing collaborator’s explicit relevance judgments
Aggregate the partial user relevance scores
Compute the user’s authority weighting
49 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]
• A: Combining inputs of the RF process
puw_o(t_i) = \sum_{u=0}^{U-1} r_i^u \, w_{BM25}(t_i)   (5)

w_{BM25}(t_i) = \log \frac{\left(\sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R_u}\right)\left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R_u}\right)}{\left(\sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R_u}\right)\left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R_u}\right)}   (6)

\sum_{u=0}^{U-1} \alpha_u = 1   (7)
50 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]
• B: Combining outputs of the RF process
crw_o(t_i) = \sum_{u=0}^{U-1} \alpha_u \, w_{BM25}(t_i, u)   (8)

w_{BM25}(t_i, u) = \log \frac{\frac{r_i^u}{R_u}\left(1 - \frac{n_i - r_i^u}{N - R_u}\right)}{\frac{n_i - r_i^u}{N - R_u}\left(1 - \frac{r_i^u}{R_u}\right)}   (9)

• C: Combining outputs of the ranking process

RSV(d, q) = \sum_{u=0}^{U-1} \alpha_u \, RSV(d, q, u)   (10)

RSV(d, q, u) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u)   (11)
51 / 111
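Variant C is a weighted CombSUM over the collaborators' individual rankings; a minimal sketch (illustrative names) of eq. (10):

```python
def fuse_rankings(user_scores, alphas):
    """Eq. (10): RSV(d, q) = sum_u alpha_u * RSV(d, q, u).
    user_scores holds one {doc: score} dict per collaborator."""
    fused = {}
    for alpha, scores in zip(alphas, user_scores):
        for doc, s in scores.items():
            fused[doc] = fused.get(doc, 0.0) + alpha * s
    return sorted(fused, key=fused.get, reverse=True)

ranking = fuse_rankings([{"a": 1.0, "b": 0.5}, {"a": 0.2, "b": 0.9}],
                        alphas=[0.5, 0.5])
print(ranking)  # ['b', 'a']
```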
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]
• Exploit a 3-dimensional context:
Individual search history HQU (queries, results, bookmarks, etc.)
Collaborative group HCL: collaborators' search history (queries, results, bookmarks, etc.)
Collaboration HCH: collaboration chat behavior (communication)
52 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]
1 Building a document ranking RSV(q, d) and generating Rank(d)
2 Building the document language model θd
3 Building the context language model θHx
p(t_i|H_x) = \frac{1}{K} \sum_{k=1}^{K} p(t_i|X_k)   (12)

p(t_i|X_k) = \frac{n(t_i, X_k)}{|X_k|}   (13)

4 Computing the KL-divergence between θ_{H_x} and θ_d

D(\theta_d, \theta_{H_x}) = - \sum_{t_i} p(t_i|\theta_d) \log p(t_i|H_x)   (14)
5 Learning to rank using pairwise features (Rank(d), D(θd, θHx))
53 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION
Enhancing collaborative search with user’s role
[Pickens et al., 2008, Shah et al., 2010, Soulier et al., 2014b]
• Division of labor: dividing the work based on users' role peculiarities
• Sharing of knowledge: splitting the search results
54 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]
• Prospector/Miner as functional roles supported by algorithms:
Prospector: ”..opens new fields for exploration into a data collection..”.
→ Draws ideas from algorithmically suggested query terms
Miner: ”..ensures that rich veins of information are explored...”.
→ Refines the search by judging highly ranked (unseen) documents
• Collaborative system architecture:
Algorithmic layer: functions combining users' search activities to produce outcomes
fitted to roles (queries, document rankings).
Regulator layer: captures inputs (search activities), calls the appropriate functions
of the algorithmic layer, and routes the outputs of the algorithmic layer to the
appropriate role (user).
55 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]
• Prospector function: The highly-relevant terms are suggested based on:
Score(t_i) = \sum_{L_q \in L} w_r(L_q) \, w_f(L_q) \, rlf(t_i; L_q)   (15)

rlf(t_i; L_q): number of documents in L_q in which t_i occurs.

• Miner function: The unseen documents are queued according to

RSV(q, d) = \sum_{L_q \in L} w_r(L_q) \, w_f(L_q) \, borda(d; L_q)   (16)

w_r(L_q) = \frac{|seen \in L_q|}{|L_q|}   (17)

w_f(L_q) = \frac{|rel \in L_q|}{|seen \in L_q|}   (18)
56 / 111
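A sketch of the Miner's Borda fusion of eq. (16) (illustrative names; the weights w_r and w_f are assumed to be supplied per query list as in eqs. (17)-(18)):

```python
def borda(doc, ranking):
    """Borda score: a document near the top of a list earns more points."""
    return len(ranking) - ranking.index(doc) if doc in ranking else 0

def miner_rsv(doc, lists, w_r, w_f):
    """Eq. (16): weighted Borda fusion over the per-query lists L_q."""
    return sum(w_r[q] * w_f[q] * borda(doc, lst) for q, lst in lists.items())

lists = {"q1": ["a", "b"], "q2": ["b", "a"]}
score = miner_rsv("a", lists, w_r={"q1": 1, "q2": 1}, w_f={"q1": 1, "q2": 0.5})
print(score)  # 2.5
```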
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: GATHERER AND SURVEYOR [SHAH ET AL., 2010]
• Gatherer/Surveyor as functional roles supported by algorithms:
Gatherer: ”..scan results of joint search activity to discover most immediately relevant documents..”.
Surveyor: ”..browse a wider diversity of information to get a better understanding of the collection
being searched...”.
• Main functions:
Merging: merging (e.g., CombSUM) the document rankings of collaborators
Splitting: routing the appropriate documents according to roles (e.g., k-means
clustering). High precision for the Gatherer, high diversity for the Surveyor
57 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE
Domain expert/Domain novice as knowledge-based roles supported by algorithms:
• Domain expert: ”..represent problems at deep structural levels and are generally interested in
discovering new associations among different aspects of items, or in delineating the advances in
a research focus surrounding the query topic..”.
• Domain novice: ”..represent problems in terms of surface or superficial aspects and are
generally interested in enhancing their learning about the general query topic..”.
Soulier, L., Tamine, L., and Bahsoun, W. (2014b). On domain
expertise-based roles in collaborative information retrieval.
Information Processing & Management (IP&M), 50(5):752-774.
58 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two step algorithm:
1 Role-based document relevance scoring
P^k(d|u_j, q) \propto P^k(u_j|d) \cdot P^k(d|q)   (19)

P(q|\theta_d) \propto \prod_{(t_i, w_{iq}) \in q} [\lambda P(t_i|\theta_d) + (1 - \lambda) P(t_i|\theta_C)]^{w_{iq}}   (20)

P^k(u_j|d) \propto P(\pi(u_j)^k|\theta_d) \propto \prod_{(t_i, w_{ij}^k) \in \pi(u_j)^k} [\lambda_{dj}^k P(t_i|\theta_d) + (1 - \lambda_{dj}^k) P(t_i|\theta_C)]^{w_{ij}^k}   (21)
59 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two step algorithm:
1 Role-based document relevance scoring: parameter smoothing using evidence from
novelty and specificity

\lambda_{dj}^k = \frac{Nov(d, D(u_j)^k) \cdot Spec(d)^\beta}{\max_{d' \in D} Nov(d', D(u_j)^k) \cdot Spec(d')^\beta}   (22)

with \beta = \begin{cases} 1 & \text{if } u_j \text{ is an expert} \\ -1 & \text{if } u_j \text{ is a novice} \end{cases}

Novelty:
Nov(d, D(u_j)^k) = \min_{d' \in D(u_j)^k} d(d, d')   (23)

Specificity:
Spec(d) = avg_{t_i \in d} \, spec(t_i) = avg_{t_i \in d} \left( \frac{-\log(n_{t_i}/N)}{\alpha} \right)   (24)
60 / 111
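The novelty and specificity evidence of eqs. (22)-(24) can be sketched as follows (illustrative names; the distance function and document vectors are left abstract):

```python
import math

def spec(doc_terms, df, N, alpha=3.0):
    """Eq. (24): average term specificity -log(n_t / N) / alpha."""
    return sum(-math.log(df[t] / N) / alpha for t in doc_terms) / len(doc_terms)

def novelty(doc_vec, seen_vecs, dist):
    """Eq. (23): distance to the closest document the user has already seen."""
    return min(dist(doc_vec, v) for v in seen_vecs)

def smoothing_weight(nov, sp, max_score, beta):
    """Eq. (22): beta = +1 for an expert (reward specific documents),
    beta = -1 for a novice (reward general ones)."""
    return (nov * sp ** beta) / max_score

s = spec(["a", "b"], df={"a": 8, "b": 5}, N=10)
n = novelty(5, [1, 4, 9], dist=lambda x, y: abs(x - y))
lam = smoothing_weight(n, s, max_score=2.0, beta=1)
```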
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two step algorithm:
2 Document allocation to collaborators
Classification based on the Expectation-Maximization (EM) algorithm
E-step: document probability of belonging to the collaborator's class

P(R_j = 1|x_{dj}^k) = \frac{\alpha_j^k \cdot \phi_j^k(x_{dj}^k)}{\alpha_j^k \cdot \phi_j^k(x_{dj}^k) + (1 - \alpha_j^k) \cdot \psi_j^k(x_{dj}^k)}   (25)

M-step: parameter updating and likelihood estimation
Document allocation to collaborators by comparison of document ranks within collaborators'
lists

r_{jj'}^k(d, \delta_j^k, \delta_{j'}^k) = \begin{cases} 1 & \text{if } rank(d, \delta_j^k) < rank(d, \delta_{j'}^k) \\ 0 & \text{otherwise} \end{cases}   (26)
Division of labor: displaying distinct document lists between collaborators
61 / 111
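The rank-comparison rule of eq. (26), which drives the final division of labor, can be sketched as follows (illustrative names):

```python
def goes_to_first(doc, list_j, list_jprime):
    """Eq. (26): 1 if collaborator u_j ranks the document higher (smaller
    rank value) than u_j' in their personalized lists, else 0."""
    return 1 if list_j.index(doc) < list_jprime.index(doc) else 0

def divide_labor(docs, list_j, list_jprime):
    """Division of labor: two disjoint display lists, one per collaborator."""
    for_j = [d for d in docs if goes_to_first(d, list_j, list_jprime)]
    for_jprime = [d for d in docs if d not in for_j]
    return for_j, for_jprime

print(divide_labor(["a", "b"], ["a", "b"], ["b", "a"]))  # (['a'], ['b'])
```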
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example
Applying the Expert/Novice CIR model
Let’s consider:
• A collaborative search session with two users u1 (expert) and u2 (novice).
• A shared information need I modeled through a query q.
• A collection of 10 documents and their associated relevance score with respect to the
shared information need I.
t1 t2 t3 t4
q 1 0 1 0
d1 2 3 1 1
d2 0 0 5 3
d3 2 1 7 6
d4 4 1 0 0
d5 2 0 0 0
d6 3 0 0 0
d7 7 1 1 1
d8 3 3 3 3
d9 1 4 5 0
d10 0 0 4 0
Weighting vectors of documents and query:
q = (0.5, 0, 0.5, 0) ;
d1 = (0.29, 0.43, 0.14, 0.14)
d2 = (0, 0, 0.63, 0.37)
d3 = (0.12, 0.06, 0.44, 0.28)
d4 = (0.8, 0.2, 0, 0)
d5 = (1, 0, 0, 0)
d6 = (0.3, 0, 0, 0.7)
d7 = (0.7, 0.1, 0.1, 0.1)
d8 = (0.25, 0.25, 0.25, 0.25)
d9 = (0.1, 0.4, 0.5, 0)
d10 = (0, 0, 1, 0).
Users' profiles: π(u_1)^0 = π(u_2)^0 = q
62 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example
Applying the Expert/Novice CIR model
RSV(q, d) rank(d) Spec(d)
d1 0.24 2 0.19
d2 0.02 7 0.23
d3 0.17 3 0.19
d4 0.03 6 0.15
d5 0.01 9 0.1
d6 0.02 8 0.1
d7 0.10 4 0.19
d8 0.31 1 0.19
d9 0.09 5 0.16
d10 0.01 10 0.15
• The document specificity is estimated with α = 3 (if a term has a collection
frequency equal to 1, −log(1/10) = 2.30):

Spec(d_1) = \frac{1}{4} \left( \frac{-\log(8/10)}{3} + \frac{-\log(6/10)}{3} + \frac{-\log(7/10)}{3} + \frac{-\log(5/10)}{3} \right) = 0.19

Spec(d_2) = 0.23, Spec(d_3) = 0.19, Spec(d_4) = 0.15, Spec(d_5) = 0.1, Spec(d_6) = 0.1,
Spec(d_7) = 0.19, Spec(d_8) = 0.19, Spec(d_9) = 0.16, Spec(d_{10}) = 0.15
• Iteration 0: Distributing the top 6 documents to users: the 3 most specific to the
expert and the 3 least specific to the novice.

Expert u_1: l^0(u_1, D_{ns}^0) = \{d_8, d_1, d_3\}
Novice u_2: l^0(u_2, D_{ns}^0) = \{d_7, d_9, d_4\}
63 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
SYSTEM-MEDIATED CIR MODELS
ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example
Applying the Expert/Novice CIR model
• Iteration 1. Let's consider that user u_2 selected document d_4 (D(u_1)^1 = \{d_4, d_5\}).

Building the user's profile:

\pi(u_1)^1 = (0.5, 0, 0.5, 0)

\pi(u_2)^1 = (\frac{0.5 + 0.8}{2}, \frac{0.2}{2}, \frac{0.5}{2}, 0) = (0.65, 0.1, 0.25, 0)

Estimating the document relevance with respect to collaborators. For user u_1:

P^1(d_1|u_1) = P^1(d_1|q) \cdot P^1(u_1|d_1) = 0.24 \cdot 0.22 = 0.05

P^1(d_1|q) = 0.24 since the user's profile has not evolved.

\lambda_{11}^1 = \frac{1 \cdot 0.19}{0.23} = 0.85, where 0.19 expresses the specificity of document d_1, 1 is the document novelty score, and 0.23 the normalization score.

P^1(u_1|d_1) = (0.85 \cdot \frac{2}{7} + 0.15 \cdot \frac{24}{84})^{0.5} \cdot (0.85 \cdot \frac{3}{7} + 0.15 \cdot \frac{13}{84})^{0} \cdot (0.85 \cdot \frac{1}{7} + 0.15 \cdot \frac{26}{84})^{0.5} \cdot (0.85 \cdot \frac{1}{7} + 0.15 \cdot \frac{21}{84})^{0} = 0.22

The normalized document scores for each collaborator are the following:

P^1(d|u_1) P^1(d|u_2)
d_1 0.23 0.28
d_2 0 0.03
d_3 0.16 0.11
d_5 0.01 0.01
d_6 0.03 0.02
d_7 0.12 0.14
d_8 0.34 0.34
d_9 0.10 0.06
d_{10} 0.01 0.01

65 / 111
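The product in this worked example can be checked numerically; a minimal sketch with λ = 0.85, the term statistics of d_1 (document length 7, collection length 84), and the profile π(u_1)^1 = (0.5, 0, 0.5, 0) used as exponents:

```python
# Term probabilities of d1 under the document model (counts / 7)
# and under the collection model (collection counts / 84).
p_doc = [2 / 7, 3 / 7, 1 / 7, 1 / 7]
p_col = [24 / 84, 13 / 84, 26 / 84, 21 / 84]
profile = [0.5, 0.0, 0.5, 0.0]   # pi(u1)^1, used as exponents
lam = 0.85                       # smoothing weight lambda^1_11

p = 1.0
for w, pd, pc in zip(profile, p_doc, p_col):
    p *= (lam * pd + (1 - lam) * pc) ** w   # eq. (21)
print(round(p, 2))  # 0.22
```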
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH
Soulier, L., Shah, C., and Tamine, L. (2014a). User-driven
System-mediated Collaborative Information Retrieval. In
Proceedings of the Annual International SIGIR Conference on
Research and Development in Information Retrieval, SIGIR '14,
pages 485-494. ACM.
66 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• Identifying users' search behavior differences: estimating the significance of
differences using the Kolmogorov-Smirnov test
• Characterizing users’ role
67 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• Categorizing users' roles \hat{R}_u:

\hat{R}_u = \underset{R_{1,2}}{argmin} \; ||F^{R_{1,2}} \ominus C_{u_1,u_2}^{(t_l)}||   (27)

subject to: \forall (f_j, f_k) \in K^{R_{1,2}}, \; F^{R_{1,2}}(f_j, f_k) - C_{u_1,u_2}^{(t_l)}(f_j, f_k) > -1

where \ominus is defined as:

(F^{R_{1,2}} \ominus C_{u_1,u_2}^{(t_l)})(f_j, f_k) = \begin{cases} F^{R_{1,2}}(f_j, f_k) - C_{u_1,u_2}^{(t_l)}(f_j, f_k) & \text{if } F^{R_{1,2}}(f_j, f_k) \in \{-1, 1\} \\ 0 & \text{otherwise} \end{cases}
• Personalizing the search: [Pickens et al., 2008, Shah, 2011]...
68 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• User’s roles modeled through patterns
Intuition
Number of visited documents
Number of submitted queries
Negative correlation
Role pattern PR1,2
Search feature kernel KR1,2
Search feature-based correlation matrix FR1,2
F
R1,2
=



1 if positively correlated
−1 if negatively correlated
0 otherwise
69 / 111
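A sketch of how such a pattern matrix might be built from observed correlations (the ±1 coding follows the definition above; the significance threshold is an illustrative assumption, not taken from the paper):

```python
def pattern_matrix(corr, kernel, threshold=0.5):
    """Code each kernel feature pair as +1 / -1 / 0 depending on whether the
    correlation is (strongly) positive, (strongly) negative, or neither.
    threshold is an illustrative cut-off."""
    F = {}
    for fj, fk in kernel:
        c = corr[(fj, fk)]
        F[(fj, fk)] = 1 if c > threshold else (-1 if c < -threshold else 0)
    return F

F = pattern_matrix({("Nq", "Np"): -0.82}, kernel=[("Nq", "Np")])
print(F)  # {('Nq', 'Np'): -1}
```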
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example
Mining role of collaborators
A collaborative search session implies two users u1 and u2 aiming at identifying
information dealing with "global warming". We present the search actions of
collaborators for the first 5 minutes of the session.
u t actions additional information
u2 0 submitted query “global warming”
u1 1 submitted query “global warming”
u2 8 document d1: visited comment: “interesting”
u2 12 document d2: visited
u2 17 document d3: visited rated: 4/5
u2 19 document d4: visited
u1 30 submitted query “greenhouse effect”
u1 60 submitted query “global warming definition”
u1 63 document d20: visited rated: 3/5
u1 70 submitted query “global warming protection”
u1 75 document d21: visited
u2 100 document d5: visited rated: 5/5
u2 110 document d6: visited rated: 4/5
u2 120 document d7: visited
u1 130 submitted query “gas emission”
u1 132 document d22: visited rated: 4/5
u2 150 document d8: visited
u2 160 document d9: visited
u2 170 document d10: visited
u2 200 document d11: visited comment: “great”
u2 220 document d12: visited
u2 240 document d13: visited
u1 245 submitted query “global warming world protection”
u1 250 submitted query “causes temperature changes”
u1 298 submitted query “global warming world politics”
70 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example
Mining role of collaborators: matching with role patterns
• Role patterns
Roles of reader-querier
F^{R_{read,querier}} = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}, \quad K^{R_{read,querier}} = \{(N_q, N_p)\}

Role: (S_{u_1}^{(t_l)}, S_{u_2}^{(t_l)}, R_{read,querier}) \to \{(reader, querier), (querier, reader)\}

(S_{u_1}^{(t_l)}, S_{u_2}^{(t_l)}, R_{read,querier}) \to \begin{cases} (reader, querier) & \text{if } S_{u_1}^{(t_l)}(t_l, N_p) > S_{u_2}^{(t_l)}(t_l, N_p) \\ (querier, reader) & \text{otherwise} \end{cases}

Role of judge-querier

F^{R_{judge,querier}} = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}, \quad K^{R_{judge,querier}} = \{(N_q, N_c)\}

Role: (S_{u_1}^{(t_l)}, S_{u_2}^{(t_l)}, R_{judge,querier}) \to \{(judge, querier), (querier, judge)\}

(S_{u_1}^{(t_l)}, S_{u_2}^{(t_l)}, R_{judge,querier}) \to \begin{cases} (judge, querier) & \text{if } S_{u_1}^{(t_l)}(t_l, N_c) > S_{u_2}^{(t_l)}(t_l, N_c) \\ (querier, judge) & \text{otherwise} \end{cases}
71 / 111
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example
Mining role of collaborators
• Track users' behavior every 60 seconds
• F = {Nq, Nd, Nc, Nr}, respectively, number of queries, documents, comments, ratings.
• Users’ search behavior
S
(300)
u1
=





3 0 0 0
4 2 0 1
5 3 0 2
5 3 0 2
8 3 0 2





S
(300)
u2
=





1 4 1 1
1 7 1 3
1 10 1 3
1 13 2 3
1 13 2 3





• Collaborators’ search differences (matrix and Kolmogorov-Smirnov test)
∆
(300)
u1,u2
=





2 −4 −1 −1
3 −5 −1 −2
4 −7 −1 −1
4 −10 −2 −1
7 −10 −2 −1





- Number of queries : p
(tl)
u1,u2
(Nq) = 0.01348
- Number of pages : p
(tl)
u1,u2
(Nd) = 0.01348
- Number of comments : p
(tl)
u1,u2
(Nc) = 0.01348
- Number of ratings : p
(tl)
u1,u2
(Nr) = 0.08152
72 / 111
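The difference matrix above is simply the element-wise difference of the two cumulative behavior matrices, which can be checked directly:

```python
# Cumulative search features per minute (rows) for Nq, Nd, Nc, Nr (columns).
S_u1 = [[3, 0, 0, 0], [4, 2, 0, 1], [5, 3, 0, 2], [5, 3, 0, 2], [8, 3, 0, 2]]
S_u2 = [[1, 4, 1, 1], [1, 7, 1, 3], [1, 10, 1, 3], [1, 13, 2, 3], [1, 13, 2, 3]]

# Delta = S_u1 - S_u2, element-wise.
delta = [[a - b for a, b in zip(r1, r2)] for r1, r2 in zip(S_u1, S_u2)]
print(delta[0])  # [2, -4, -1, -1]
```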
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
USER-DRIVEN SYSTEM-MEDIATED CIR MODELS
MINE USERS’ ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example
Mining role of collaborators: matching with role patterns
• Collaborators’ search action complementarity: correlation matrix between search
differences
C_{u_1,u_2}^{(300)} = \begin{pmatrix} 1 & -0.8186713 & -0.731925 & 0 \\ -0.8186713 & 1 & 0.9211324 & 0 \\ -0.731925 & 0.9211324 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}

• Role mining: comparing the role pattern with the sub-matrix of collaborators'
behaviors

Role of reader-querier: the masked difference matrix is

F^{R_{read,querier}} \ominus C_{u_1,u_2}^{(300)} = \begin{pmatrix} 0 & -1 - (-0.8186713) \\ -1 - (-0.8186713) & 0 \end{pmatrix} = \begin{pmatrix} 0 & -0.1813287 \\ -0.1813287 & 0 \end{pmatrix}

whose Frobenius norm equals \sqrt{2 \times 0.1813287^2} = 0.2564.

Role of judge-querier: the masked difference matrix is

F^{R_{judge,querier}} \ominus C_{u_1,u_2}^{(300)} = \begin{pmatrix} 0 & -1 - (-0.731925) \\ -1 - (-0.731925) & 0 \end{pmatrix} = \begin{pmatrix} 0 & -0.268075 \\ -0.268075 & 0 \end{pmatrix}

whose Frobenius norm equals \sqrt{2 \times 0.268075^2} = 0.3791.

→ Collaborators act as reader/querier, with u_1 labeled as querier and u_2 as reader
(highest N_p).
73 / 111
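The pattern-matching step can be sketched as follows; the ⊖ operator zeroes cells where the pattern is 0, and the role pattern with the smallest Frobenius distance wins:

```python
import math

def ominus(F, C):
    """Masked difference of eq. (27): only the +/-1 cells of F contribute."""
    return [[F[i][j] - C[i][j] if F[i][j] in (-1, 1) else 0.0
             for j in range(len(F))] for i in range(len(F))]

def frobenius(M):
    return math.sqrt(sum(x * x for row in M for x in row))

F = [[1, -1], [-1, 1]]  # shared by both role patterns
# Sub-matrices of C restricted to each role's feature kernel.
C_read = [[1, -0.8186713], [-0.8186713, 1]]   # queries vs. visited pages
C_judge = [[1, -0.731925], [-0.731925, 1]]    # queries vs. comments

d_read = frobenius(ominus(F, C_read))
d_judge = frobenius(ominus(F, C_judge))
assert d_read < d_judge  # reader/querier is the best-matching pattern
```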
1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion
OVERVIEW OF IR MODELS AND TECHNIQUES
[Figure: positioning of CIR models ([Foley and Smeaton, 2009a]; [Morris et al., 2008] "smart-splitting" and "groupization"; [Pickens et al., 2008]; [Shah et al., 2010]; [Soulier et al., IP&M 2014b]; [Soulier et al., SIGIR 2014a]) along three dimensions: relevance (collective vs. individual), evidence source (feedback, interest, expertise, behavior, role), and collaboration paradigm (division of labor vs. sharing of knowledge).]
74 / 111
1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion
PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
Evaluation challenges
Protocols
Metrics and ground truth
Baselines
Tools and datasets
4. Challenges ahead
5. Discussion
75 / 111
1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion
EVALUATION CHALLENGES
• Learning from user and user-user past interactions
• Adaptation to multi-faceted and multi-user contexts: skills, expertise, role, etc.
• Aggregating relevant information nuggets
→ Evaluating the collective relevance
• Supporting synchronous vs. asynchronous coordination
• Modeling collaboration paradigms: division of labor, sharing of knowledge
• Optimizing search cost: balance in work (search) and group benefit (task outcome)
→ Measuring the collaborative effectiveness
76 / 111
1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion
PROTOCOLS
CATEGORIES OF PROTOCOLS
• Standard evaluation frameworks
Without humans: batch-based evaluation (TREC, INEX, CLEF, ...)
With humans in the process (recommended)
c [Dumais, 2014]
78 / 111
1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion
PROTOCOLS
CATEGORIES OF PROTOCOLS
• Standard evaluation frameworks
Without humans: batch-based evaluation (TREC, INEX, CLEF, ...)
With humans in the process (recommended)
• CIR-adapted evaluation frameworks
79 / 111
PROTOCOLS
BATCH: COLLABORATION SIMULATION [MORRIS ET AL., 2008, SHAH ET AL., 2010]
• Real users formulating queries w.r.t. the shared information need
15 individual users asked to list queries they would associate to 10 TREC topics. Then, pairs
of collaborators are randomly built [Shah et al., 2010]
10 groups of 3 participants asked to list collaboratively 6 queries related to the information
need [Morris et al., 2008]
• Simulating the collaborative rankings on the participants’ queries
Advantages:
• Larger number of experimental tests (parameter tuning, more baselines, ...)
• Less costly and less time-consuming than user studies
Limitations:
• Limited manifestation of the collaborative aspects
• Does not span a whole collaborative search session
• Difficult to assess the generalization of findings
80 / 111
PROTOCOLS
LOG-STUDY: COLLABORATION SIMULATION [FOLEY AND SMEATON, 2009A, SOULIER ET AL., 2014B]
• Individual search logs (from user studies or official benchmarks)
• Chronological synchronization of individual search actions
• Simulating the collaborative rankings on the users’ queries
Advantages:
• Modeling of a collaborative session
• Larger number of experimental tests (parameter tuning, more baselines, ...)
• Less costly and less time-consuming than user studies
Limitations:
• No manifestation of the collaborative aspects
• Difficult to assess the generalization of findings
82 / 111
PROTOCOLS
LOG-STUDIES: COLLABORATIVE SEARCH LOGS [SOULIER ET AL., 2014A]
• Real logs of collaborative search sessions
• CIR ranking model launched on the participant queries
Advantages:
• A step forward to realistic collaborative
scenarios
• Queries resulting from a collaborative
search process
Limitations:
• Costly and time-consuming, unless data are already available
• Only implicit feedback on the retrieved document lists
83 / 111
PROTOCOLS
USER-STUDIES [PICKENS ET AL., 2008]
• Real users performing the collaborative task
• CIR models launched in real time in response to users’ actions
Advantages:
• One of the most realistic scenarios (instead of panels)
Limitations:
• Costly and time-consuming
• Controlled tasks in a laboratory setting
84 / 111
METRICS
CATEGORIES OF METRICS
Evaluation Objectives in collaborative search
• Measuring the retrieval effectiveness of the ranking models
• Measuring the search effectiveness of the collaborative groups
• Measuring collaborators’ satisfaction and cognitive effort
• Analyzing collaborators’ behavior
• User-driven metrics/indicators aiming at evaluating:
The collaborators’ awareness and satisfaction [Aneiros and Morris, 2003, Smyth et al., 2005]
The cognitive effort
The search outcomes
• System-oriented metrics/indicators aiming at evaluating:
The retrieval effectiveness of the ranking models
The enforcement of the collaborative paradigms by the ranking models (division of labor)
The collaborative relevance of documents (→ ground truth)
85 / 111
METRICS
USER-DRIVEN METRICS
• Search log analysis
Behavioral analysis: collaborators’ actions [Tamine and Soulier, 2015]
Feature Description
npq Average number of visited pages by query
dt Average time spent between two visited pages
nf Average number of relevance feedback information (snippets, annotations
& bookmarks)
qn Average number of submitted queries
ql Average number of query tokens
qo Average ratio of shared tokens among successive queries
nbm Average number of exchanged messages within the search groups
Behavioral analysis: communication channels [González-Ibáñez et al., 2013, Strijbos et al., 2004]
86 / 111
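As an illustration of the query-based features above (qn, ql, qo), here is a minimal sketch (not the tutorial's code); it assumes the "ratio of shared tokens among successive queries" (qo) is measured as Jaccard similarity between consecutive query token sets, a definition the table does not fix:

```python
def query_features(queries):
    """Compute qn (number of queries), ql (avg. tokens per query) and
    qo (avg. token overlap between successive queries) for one session."""
    tokenized = [q.lower().split() for q in queries]
    qn = len(tokenized)
    ql = sum(len(t) for t in tokenized) / qn
    # Jaccard overlap between each pair of successive queries (assumption)
    overlaps = [len(set(a) & set(b)) / len(set(a) | set(b))
                for a, b in zip(tokenized, tokenized[1:])]
    qo = sum(overlaps) / len(overlaps) if overlaps else 0.0
    return qn, ql, qo

qn, ql, qo = query_features([
    "collaborative search",
    "collaborative information retrieval",
    "retrieval models",
])
```

For this toy log, qn = 3, ql ≈ 2.33 and qo = 0.25 (each successive pair shares one token out of four distinct ones).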
METRICS
USER-DRIVEN METRICS
• Search log analysis
Behavioral analysis: collaborators’ actions and communication channels
Search outcomes [Shah, 2014]
Evidence sources: visited documents, relevant documents, dwell time, number of visits
Metric | Description
(Unique) Coverage | number of (unique) visited webpages
Likelihood of discovery | number-of-visits-based IDF metric
(Unique) Useful pages | (unique) number of useful pages (visited more than 30 seconds)
Precision | number of distinct relevant and visited pages over the number of distinct visited pages
Recall | number of distinct relevant and visited pages over the number of distinct relevant pages
F-measure | combination of precision and recall
87 / 111
METRICS
USER-DRIVEN METRICS
Exercise
Estimating the search outcome effectiveness of a collaborative search session (Coverage, Relevant
Coverage, Precision, Recall, F-measure).
• Let’s consider:
a collaborative search session involving two users u1 and u2 aiming at solving an information need I.
During the session, u1 selected the following documents: {d1, d2, d6, d9, d17, d20}
During the session, u2 selected the following documents: {d3, d4, d5, d6, d7}
a collection of 20 documents D = {di ; i = 1, ..., 20},
a ground truth for the information need I: GTI = {d2, d6, d15}
• Evaluation metrics:
UniqueCoverage(g) = {d1, d2, d3, d4, d5, d6, d7, d9, d17, d20}
RelevantCoverage(g) = {d2, d6}
Precision(g) = 2/10 = 0.2
Recall(g) = 2/3 ≈ 0.67
F-measure(g) = (2 · 0.2 · 0.67) / (0.2 + 0.67) ≈ 0.31
88 / 111
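The exercise can be checked with a few lines of Python (a sketch: set union and intersection implement the group-level coverage definitions above):

```python
# Documents selected by each collaborator, and the ground truth for need I
u1 = {"d1", "d2", "d6", "d9", "d17", "d20"}
u2 = {"d3", "d4", "d5", "d6", "d7"}
gt = {"d2", "d6", "d15"}

coverage = u1 | u2              # unique pages covered by the group (d6 counted once)
rel_coverage = coverage & gt    # relevant pages among them
precision = len(rel_coverage) / len(coverage)
recall = len(rel_coverage) / len(gt)
f_measure = 2 * precision * recall / (precision + recall)
```

This yields Precision = 0.2, Recall ≈ 0.67 and F-measure ≈ 0.31.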
METRICS
USER-DRIVEN METRICS
• Questionnaires and interviews
The NASA-TLX instrument: measuring the cognitive effort
Satisfaction interviews [Shah and González-Ibáñez, 2011a, Tamine and Soulier, 2015]
Question | Answer type
Have you already participated in such a user study? If yes, please describe it. | Free-answer
What do you think about this collaborative manner of seeking information? | Free-answer
What was the level of difficulty of the task? | a) Easy (Not difficult) b) Moderately difficult c) Difficult
What was task difficulty related to? | Free-answer
Could you say that the collaborative system supports your search? | a) Yes b) Not totally c) Not at all
How could we improve this system? | Free-answer
89 / 111
METRICS
SYSTEM-ORIENTED METRICS [SOULIER ET AL., 2014A]
• The precision Prec@R(g) at rank R of a collaborative group g:
Prec@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} Prec@R(g)(t) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{RelCov@R(g)(t)}{Cov@R(g)(t)}   (28)
• The recall Recall@R(g) at rank R of group g:
Recall@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} Recall@R(g)(t) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{RelCov@R(g)(t)}{|RelDoc|}   (29)
• The F-measure Fsyn@R(g) at rank R of a collaborative group g:
Fsyn@R(g) = \frac{1}{|T(g)|} \sum_{t=1}^{|T(g)|} \frac{2 \cdot Prec@R(g)(t) \cdot Recall@R(g)(t)}{Prec@R(g)(t) + Recall@R(g)(t)}   (30)
90 / 111
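A minimal sketch of equations (28)-(30) in Python (not the authors' implementation; it assumes the per-timestamp RelCov@R and Cov@R counts have already been extracted from the session log and are passed in directly):

```python
def group_metrics(per_timestamp, n_relevant):
    """per_timestamp: one (RelCov@R, Cov@R) count pair per timestamp t of the
    session T(g); n_relevant: |RelDoc|. Returns session-averaged (P, R, F)."""
    n = len(per_timestamp)
    precisions = [rel / cov if cov else 0.0 for rel, cov in per_timestamp]   # eq. (28)
    recalls = [rel / n_relevant for rel, _ in per_timestamp]                 # eq. (29)
    f_scores = [2 * p * r / (p + r) if p + r else 0.0                        # eq. (30)
                for p, r in zip(precisions, recalls)]
    return sum(precisions) / n, sum(recalls) / n, sum(f_scores) / n

# Toy session: two timestamps, 3 relevant documents in the ground truth
p, r, f = group_metrics([(1, 5), (2, 4)], n_relevant=3)
```

Note that the F-measure is averaged per timestamp, not computed from the averaged precision and recall.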
METRICS
SYSTEM-ORIENTED METRICS AND GROUND TRUTH
Example
Estimating the retrieval effectiveness of the rankings of CIR models (Coverage, Relevant Coverage,
Precision, Recall, F-measure).
Ground truth GTI = {d2, d6, d15}
Query Document ranking
q1 d1, d2, d3
q2 d2, d8, d14
q3 d17, d3, d8
q4 d9, d15, d2
q5 d1, d5, d3
q6 d20, d3, d1
q7 d5, d2, d4
91 / 111
METRICS
SYSTEM-ORIENTED METRICS AND GROUND TRUTH
Example
Estimating the retrieval effectiveness of the rankings of CIR models.
Evaluation metrics:
Query pairs | Coverage | Relevant Coverage | Precision | Recall | F-measure
q1-q2 | d1, d2, d3, d8, d14 | d2 | 1/5 | 1/3 | 0.25
q2-q3 | d2, d8, d14, d17, d3 | d2 | 1/5 | 1/3 | 0.25
q3-q4 | d17, d3, d8, d9, d15 | d15 | 1/5 | 1/3 | 0.25
q3-q7 | d17, d3, d8, d5, d2, d4 | d2 | 1/6 | 1/3 | 0.22
q5-q7 | d1, d3, d5, d2, d4 | - | 0 | 0 | 0
q6-q7 | d20, d3, d1, d5, d2, d4 | d2 | 1/6 | 1/3 | 0.22
Average | | | 0.16 | 0.28 | 0.20
92 / 111
METRICS
GROUND TRUTH
• Evidence sources:
From relevance assessments [Morris et al., 2008]
From individual search logs [Foley and Smeaton, 2009b, Soulier et al., 2014b]
From collaborative search logs [Shah and González-Ibáñez, 2011b, Soulier et al., 2014a]
• Importance of considering an agreement level of at least two users (belonging to
different groups?) [Shah and González-Ibáñez, 2011b, Soulier et al., 2014a]
93 / 111
BASELINES
• Benefit of the collaboration
Individual models: BM25, LM, ...
Search logs of individual search sessions
• Collaboration optimization through algorithmic mediation
User-driven approach with collaborative interfaces
• Benefit of roles
Role-based vs. No-role CIR models [Foley and Smeaton, 2009b, Morris et al., 2008]
Dynamic vs. predefined CIR models [Pickens et al., 2008, Shah et al., 2010]
• ...
94 / 111
TOOLS AND DATASETS
• Simulation-based evaluation
TREC Interactive dataset [Over, 2001]
Other available search logs (TREC, CLEF, proprietary, ...)
• Log-studies
Collaborative dataset [Tamine and Soulier, 2015]
• User-studies
Open-source Coagmento plugin [Shah and González-Ibáñez, 2011a]:
http://www.coagmento.org/collaboraty.php
95 / 111
PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
Theoretical foundations of CIR
Empirical evaluation of CIR
Open ideas
5. Discussion
96 / 111
THEORETICAL FOUNDATIONS OF CIR
• Towards a novel probabilistic framework of relevance for CIR
What is a ”good ranking” with regard to the expected synergic effect of collaboration?
• Dynamic IR models for CIR
How to optimize long-term gains over multiple users, user-user interactions, user-system
interactions and multi-search sessions?
How to formalize the division of labor through the evolution of users’ information needs over
time?
• Towards an axiomatic approach of relevance for CIR
Are IR heuristics similar to CIR heuristics?
Can relevance towards a group be modeled by a set of formally defined constraints on a
retrieval function?
97 / 111
EVALUATION OF CIR
• Multiple facets of system performance
Should we measure the performance in terms of gain per time, effort gain per user,
effectiveness of outcomes, or all of these as a whole?
How do we delineate the performance of the system from the performance and interaction of
the users?
• Robust experiments for CIR
Should experimental evaluation protocol be task-dependent?
Are simulated work tasks used in IIR reasonable scenarios for evaluating CIR?
How to build data collections allowing reproducible experiments and handling robust
statistical tests?
98 / 111
OPEN IDEAS
• Multi-level CIR [Htun et al., 2015]
Non-uniform information access within the group
Application domains: legal, military, ...
• Collaborative group building
Task-based group building (information search, synthesis, sense-making,
question-answering...)
Leveraging users’ knowledge, collaboration abilities, information need perception
• Socio-collaborative IR [Morris, 2013]
Web search vs. social networking [Oeldorf-Hirsch et al., 2014]
Leveraging the crowd to solve a user’s information need
99 / 111
PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
5. Discussion
100 / 111
DISCUSSION
101 / 111
REFERENCES I
Amer-Yahia, S., Benedikt, M., and Bohannon, P. (2007).
Challenges in Searching Online Communities.
IEEE Data Engineering Bulletin, 30(2):23–31.
Amershi, S. and Morris, M. R. (2008).
CoSearch: a system for co-located collaborative web search.
In Proceedings of the Conference on Human Factors in Computing Systems, CHI ’08, pages 1647–1656. ACM.
Aneiros, M. and Morris, M. R. (2003).
Foundation of unconstrained collaborative web browsing with awareness.
In Proceedings of the International Conference on Web Intelligence, WI ’02, pages 8–25. ACM/IEEE.
Brin, S. and Page, L. (1998).
The Anatomy of a Large-scale Hypertextual Web Search Engine.
Computer Networks and ISDN Systems, 30(1-7):107–117.
Capra, R. (2013).
Information Seeking and Sharing in Design Teams.
In Proceedings of the ASIS&T Annual Meeting, ASIS&T ’13, pages 239–247. American Society for Information Science.
Diriye, A. and Golovchinsky, G. (2012).
Querium: A session-based collaborative search system.
In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’12, pages 583–584. Springer.
Dumais, S. T. (2014).
Putting searchers into search.
In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 1–2.
102 / 111
REFERENCES II
Erickson, T. (2010).
A Social Proxy for Collective Search.
In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’10. ACM.
Evans, B. M. and Chi, E. H. (2010).
An elaborated model of social search.
Information Processing & Management (IP&M), 46(6):656–678.
Foley, C. and Smeaton, A. F. (2009a).
Evaluation of Coordination Techniques in Synchronous Collaborative Information Retrieval.
CoRR, abs/0908.0.
Foley, C. and Smeaton, A. F. (2009b).
Synchronous Collaborative Information Retrieval: Techniques and Evaluation.
In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’09, pages 42–53. Springer.
Foley, C. and Smeaton, A. F. (2010).
Division of Labour and Sharing of Knowledge for Synchronous Collaborative Information Retrieval.
Information Processing & Management (IP&M), 46(6):762–772.
Foley, C., Smeaton, A. F., and Jones., G. (2008).
Collaborative and Social Information Retrieval and Access: Techniques for Improved User Modeling, chapter Combining.
IGI Global.
Foster, J. (2006).
Collaborative information seeking and retrieval.
Annual Review of Information Science & Technology (ARIST), 40(1):329–356.
103 / 111
REFERENCES III
Fuhr, N. (2008).
A probability ranking principle for interactive information retrieval.
Information Retrieval, 11(3):251–265.
Gauch, S., Chaffee, J., and Pretschner, A. (2003).
Ontology-based Personalized Search and Browsing.
Web Intelligence and Agent Systems (WIAS), 1(3-4):219–234.
Golovchinsky, G., Adcock, J., Pickens, J., Qvarfordt, P., and Back, M. (2008).
Cerchiamo: a collaborative exploratory search tool.
Proceedings of the Demo in Computer Supported Cooperative Work.
Golovchinsky, G., Diriye, A., and Pickens, J. (2011).
Designing for Collaboration in Information Seeking.
Proceedings of the ASIS&T Annual Meeting.
Golovchinsky, G., Pickens, J., and Back, M. (2009).
A Taxonomy of Collaboration in Online Information Seeking.
In Proceedings of the International Workshop on Collaborative Information Retrieval, CIR ’09.
González-Ibáñez, R., Haseki, M., and Shah, C. (2013).
Let’s search together, but not too close! An analysis of communication and performance in collaborative information seeking.
Information Processing & Management (IP&M), 49(5):1165–1179.
Gray, B. (1989).
Collaborating: finding common ground for multiparty problems.
Jossey Bass Business and Management Series. Jossey-Bass.
104 / 111
REFERENCES IV
Han, S., He, D., Yue, Z., and Jiang, J. (2016).
Contextual support for collaborative information retrieval.
In Proceedings of the International ACM SIGIR Conference on Human Information Interaction and Retrieval.
Hansen, P. and Järvelin, K. (2005).
Collaborative information retrieval in an information-intensive domain.
Information Processing & Management (IP&M), 41(5):1101–1119.
Hansen, P., Shah, C., and Klas, C.-P. (2015).
Collaborative information seeking. best practices, new domains and new thoughts.
Htun, N. N., Halvey, M., and Baillie, L. (2015).
Towards quantifying the impact of non-uniform information access in collaborative information retrieval.
In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 843–846.
Imazu, M., Nakayama, S.-i., and Joho, H. (2011).
Effect of Explicit Roles on Collaborative Search in Travel Planning Task.
In Proceedings of the Asia Information Retrieval Societies Conference, AIRS ’11, pages 205–214. Springer.
Jin, X., Sloan, M., and Wang, J. (2013).
Interactive Exploratory Search for Multi Page Search Results.
In Proceedings of the International Conference on World Wide Web, WWW ’13, pages 655–666. ACM.
Joho, H., Hannah, D., and Jose, J. (2009).
Revisiting IR Techniques for Collaborative Search Strategies.
In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’09, pages 66–77. Springer.
Karunakaran, A., Reddy, M. C., and Spence, P. R. (2013).
Toward a model of collaborative information behavior in organizations.
Journal of the Association for Information Science and Technology (JASIST), 64(12):2437–2451.
105 / 111
REFERENCES V
Kelly, R. and Payne, S. J. (2013).
Division of labour in collaborative information seeking: Current approaches and future directions.
In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’13. ACM.
Kraft, R., Maghoul, F., and Chang, C. C. (2005).
Y!Q: Contextual Search at the Point of Inspiration.
In Proceedings of the Conference on Information and Knowledge Management, CIKM ’05, pages 816–823. ACM.
Lavrenko, V. and Croft, W. B. (2001).
Relevance based language models.
In Proceedings of the Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’01, pages
120–127. ACM.
Liu, F., Yu, C., and Meng, W. (2004).
Personalized Web Search For Improving Retrieval Effectiveness.
IEEE Transactions on Knowledge and Data Engineering (TKDE), 16(1):28–40.
Morris, M. R. (2008).
A survey of collaborative web search practices.
In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’08, pages 1657–1660. ACM.
Morris, M. R. (2013).
Collaborative Search Revisited.
In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’13, pages 1181–1192. ACM.
Morris, M. R., Lombardo, J., and Wigdor, D. (2010).
WeSearch: supporting collaborative search and sensemaking on a tabletop display.
In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’10, pages 401–410. ACM.
106 / 111
REFERENCES VI
Morris, M. R., Paepcke, A., and Winograd, T. (2006).
TeamSearch: Comparing Techniques for Co-Present Collaborative Search of Digital Media.
In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop ’06, pages 97–104. IEEE Computer
Society.
Morris, M. R. and Teevan, J. (2009).
Collaborative Web Search: Who, What, Where, When, and Why.
Synthesis Lectures on Information Concepts, Retrieval, and Services. Morgan & Claypool Publishers.
Morris, M. R., Teevan, J., and Bush, S. (2008).
Collaborative Web Search with Personalization: Groupization, Smart Splitting, and Group Hit-highlighting.
In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’08, pages 481–484. ACM.
Oeldorf-Hirsch, A., Hecht, B., Morris, M. R., Teevan, J., and Gergle, D. (2014).
To Search or to Ask: The Routing of Information Needs Between Traditional Search Engines and Social Networks.
In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’14, pages 16–27. ACM.
Over, P. (2001).
The TREC interactive track: an annotated bibliography.
Information Processing & Management (IP&M), 37(3):369–381.
Pal, A. and Counts, S. (2011).
Identifying topical authorities in microblogs.
In Proceedings of the Conference on Web Search and Data Mining, WSDM ’11, pages 45–54. ACM.
Pickens, J., Golovchinsky, G., Shah, C., Qvarfordt, P., and Back, M. (2008).
Algorithmic Mediation for Collaborative Exploratory Search.
In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’08, pages 315–322.
ACM.
107 / 111
REFERENCES VII
Ponte, J. M. and Croft, W. B. (1998).
A language modeling approach to information retrieval.
In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’98, pages 275–281.
ACM.
Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., and Riedl, J. (1994).
GroupLens: An Open Architecture for Collaborative Filtering of Netnews.
In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’94, pages 175–186. ACM.
Robertson, S. E. and Walker, S. (1994).
Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval.
In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’94, pages 232–241.
ACM.
Robertson, S. E., Walker, S., Jones, S., Hancock-Beaulieu, M., and Gatford, M. (1995).
Okapi at TREC-3.
In Proceedings of the Text retrieval conference-3 (TREC-3), TREC ’95, pages 109–126.
Rocchio, J. J., editor (1971).
Relevance Feedback in Information Retrieval.
Prentice Hall.
Rodriguez Perez, J. A., Whiting, S., and Jose, J. M. (2011).
CoFox: A visual collaborative browser.
In Proceedings of the International Workshop on Collaborative Information Retrieval, CIKM ’11. ACM.
Salton, G. (1971).
A comparison between manual and automatic indexing method.
Journal of American Documentation, 2(1):61–71.
108 / 111
REFERENCES VIII
Shah, C. (2010).
Working in Collaboration - What, Why, and How?
In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’10. ACM.
Shah, C. (2011).
A framework for supporting user-centric collaborative information seeking.
Number 2 in SIGIR ’11, page 88. ACM.
Shah, C. (2012).
Collaborative Information Seeking - The Art and Science of Making the Whole Greater than the Sum of All.
pages I–XXI, 1–185.
Shah, C. (2014).
Evaluating collaborative information seeking - synthesis, suggestions, and structure.
Journal of Information Science (JIS), 40(4):460–475.
Shah, C. and González-Ibáñez, R. (2010).
Exploring Information Seeking Processes in Collaborative Search Tasks.
In Proceedings of the ASIS&T Annual Meeting, ASIS&T ’10, pages 60:1–60:10. American Society for Information Science.
Shah, C. and González-Ibáñez, R. (2011a).
Coagmento - A System for Supporting Collaborative Information Seeking.
In Demo in Proceedings of Association for Information Science and Technology Annual Meeting, ASIST ’12, pages 9–12.
Shah, C. and González-Ibáñez, R. (2011b).
Evaluating the Synergic Effect of Collaboration in Information Seeking.
In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’11, pages 913–922.
ACM.
109 / 111
REFERENCES IX
Shah, C. and Marchionini, G. (2010).
Awareness in collaborative information seeking.
Journal of the Association for Information Science and Technology (JASIST), 61(10):1970–1986.
Shah, C., Pickens, J., and Golovchinsky, G. (2010).
Role-based results redistribution for collaborative information retrieval.
Information Processing & Management (IP&M), 46(6):773–781.
Smeaton, A. F., Foley, C., Gurrin, C., Lee, H., and McGivney, S. (2006).
Collaborative Searching for Video Using the Fischlar System and a DiamondTouch Table.
In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop ’06, pages 151–159. IEEE Computer
Society.
Smyth, B., Balfe, E., Boydell, O., Bradley, K., Briggs, P., Coyle, M., and Freyne, J. (2005).
A live-user evaluation of collaborative web search.
In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI ’05, pages 1419–1424.
Soulier, L., Shah, C., and Tamine, L. (2014a).
User-driven System-mediated Collaborative Information Retrieval.
In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’14, pages 485–494.
ACM.
Soulier, L., Tamine, L., and Bahsoun, W. (2014b).
On domain expertise-based roles in collaborative information retrieval.
Information Processing & Management (IP&M), 50(5):752–774.
Strijbos, J.-W., Martens, R. L., Jochems, W. M. G., and Broers, N. J. (2004).
The Effect of Functional Roles on Group Efficiency. Using Multilevel Modeling and Content Analysis to Investigate Computer-Supported
Collaboration in Small Groups.
Journal of Information Science (JIS), 35(2):195–229.
110 / 111
  • 5. COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]
    • On which occasion do you collaborate?
    Collaboration purposes
    Task: Frequency
    Travel planning: 27.5%
    Online shopping: 25.7%
    Bibliographic search: 20.2%
    Technical search: 16.5%
    Fact-finding: 16.5%
    Social event planning: 12.8%
    Health search: 6.4%
    Application domains
    Domain: Example
    Medical: Physician/Patient - Physician/Nurse
    Digital library: Librarians/Customers
    E-Discovery: Fee-earners/Customers - Contact reviewer/Lead counsel
    Academic: Groups of students
    3 / 111
  • 7. COLLABORATION IN A FEW NUMBERS [MORRIS, 2008, MORRIS, 2013]
    • How do you collaborate?
    How often? Group size? Collaborative settings?
    [pie chart: 22% / 11.9% / 66.1%]
    4 / 111
OUTLINE
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
5. Discussion
PLAN
1. Collaboration and Information Retrieval
   Users and Information Retrieval
   The notion of collaboration
   Collaboration paradigms
   Collaborative search approaches
   Collaborative search interfaces
2. Collaborative IR techniques and models
3. Evaluation
4. Challenges ahead
5. Discussion
AD-HOC INFORMATION RETRIEVAL
LET'S START BY WHAT YOU ALREADY KNOW...
• Ranking documents with respect to a query
• How?
  Term weighting / Document scoring [Robertson and Walker, 1994, Salton, 1971]
  Query Expansion / Reformulation [Rocchio, 1971]
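The term-weighting family cited above can be made concrete with a minimal BM25 scorer. This is a sketch for illustration only: the toy corpus, whitespace tokenization, and the parameter defaults k1=1.2 and b=0.75 are assumptions, not material from the tutorial.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc, corpus, k1=1.2, b=0.75):
    """Score one tokenized document against a bag-of-words query
    with BM25-style term weighting over a toy in-memory corpus."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)  # document frequency
        if df == 0:
            continue
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)  # smoothed IDF
        # term-frequency saturation with document-length normalization
        norm = tf[term] * (k1 + 1) / (tf[term] + k1 * (1 - b + b * len(doc) / avgdl))
        score += idf * norm
    return score

corpus = [
    "collaborative information retrieval".split(),
    "information retrieval models".split(),
    "travel planning tips".split(),
]
ranking = sorted(corpus,
                 key=lambda d: bm25_score(["collaborative", "retrieval"], d, corpus),
                 reverse=True)
```

Here the document containing both query terms is ranked first, which is the behaviour the term-weighting/document-scoring bullet describes.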
USERS AND INFORMATION RETRIEVAL
LET'S START BY WHAT YOU ALREADY KNOW...
• Personalized IR [Kraft et al., 2005, Gauch et al., 2003, Liu et al., 2004]
  Personalizing search results to the user's context, preferences and interests
  How? Modeling the user's profile; integrating the user's context and preferences within the document scoring
• Collaborative filtering [Resnick et al., 1994]
  Recommending search results using ratings/preferences of other users
  How? Inferring a user's own preferences from other users' preferences; personalizing search results
• Social Information Retrieval [Amer-Yahia et al., 2007, Pal and Counts, 2011]
  Exploiting social media platforms to retrieve documents/users...
  How? Social network analysis (graph structure, information diffusion, ...); integrating social-based features within the document relevance scoring

Let's have a more in-depth look at... Collaborative Information Retrieval
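The collaborative-filtering idea above, inferring a user's preferences from other users' ratings, can be sketched with a minimal user-based predictor in the spirit of [Resnick et al., 1994]. The rating matrix, the cosine neighborhood, and all names here are illustrative assumptions.

```python
import math

# Hypothetical user-item rating matrix (toy data for illustration).
ratings = {
    "alice": {"d1": 5, "d2": 3, "d3": 4},
    "bob":   {"d1": 4, "d2": 2, "d3": 5, "d4": 4},
    "carol": {"d2": 5, "d4": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den

def predict(user, item):
    """Predict a rating as a similarity-weighted average over the
    other users who rated the item (user-based filtering)."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        sim = cosine(ratings[user], r)
        num += sim * r[item]
        den += abs(sim)
    return num / den if den else 0.0
```

For instance, `predict("alice", "d4")` falls between carol's low and bob's high rating, pulled toward bob because alice's rating vector is more similar to his.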
THE NOTION OF COLLABORATION: DEFINITIONS

Definition: "A process through which parties who see different aspects of a problem can constructively explore their differences and search for solutions that go beyond their own limited vision of what is possible." [Gray, 1989]

Definition: "Collaboration is a process in which autonomous actors interact through formal and informal negotiation, jointly creating rules and structures governing their relationships and ways to act or decide on the issues that brought them together; it is a process involving shared norms and mutually beneficial interactions." [Thomson and Perry, 2006]
THE NOTION OF COLLABORATION: THE 5WS OF COLLABORATION [MORRIS AND TEEVAN, 2009, SHAH, 2010]
What? Tasks: complex, exploratory or fact-finding tasks, ...
      Application domains: bibliographic, medical, e-Discovery, academic search
Why? Shared interests; insufficient knowledge; mutually beneficial goals; division of labor
Who? Groups vs. Communities
When? Synchronous vs. Asynchronous
Where? Colocated vs. Remote
How? Crowdsourcing; implicit vs. explicit intent; user mediation; system mediation
THE NOTION OF COLLABORATION: COLLABORATIVE INFORMATION RETRIEVAL (CIR) [FOSTER, 2006, GOLOVCHINSKY ET AL., 2009]
THE NOTION OF COLLABORATION: COMPARING CIR WITH OTHER IR APPROACHES

Exercise: How do you think CIR differs from Personalized IR, Collaborative Filtering, or Social IR?
• User (unique/group)
• Personalization (yes/no)
• Collaboration (implicit/explicit)
• Concurrency (collocated/remote)
• Collaboration benefit (symmetric/asymmetric)
• Communication (yes/no)
• ...
THE NOTION OF COLLABORATION: COMPARING CIR WITH OTHER IR APPROACHES

Contrasting Personalized IR, Collaborative Filtering and Social IR with Collaborative IR:
User: unique vs. group
Personalization: no vs. yes
Collaboration: implicit vs. explicit
Concurrency: synchronous vs. asynchronous
Benefit: symmetric vs. asymmetric
Communication: no vs. yes
Information usage: information exchange, information retrieval, information synthesis, sensemaking
COLLABORATION PARADIGMS [FOLEY AND SMEATON, 2010, KELLY AND PAYNE, 2013, SHAH AND MARCHIONINI, 2010]
Division of labor
• Role-based division of labor
• Document-based division of labor
Sharing of knowledge
• Communication and shared workspace
• Ranking based on relevance judgements
Awareness
• Collaborators' actions
• Collaborators' context
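Document-based division of labor, the second bullet under the first paradigm, can be mediated algorithmically by splitting a ranked list into disjoint slices, one per collaborator. A minimal round-robin sketch (the helper name and the toy data are hypothetical):

```python
def divide_labor(ranked_docs, collaborators):
    """Round-robin split of a ranked result list so that each
    collaborator reviews a disjoint slice of the documents."""
    assignment = {c: [] for c in collaborators}
    for i, doc in enumerate(ranked_docs):
        assignment[collaborators[i % len(collaborators)]].append(doc)
    return assignment

split = divide_labor(["d1", "d2", "d3", "d4", "d5"], ["u1", "u2"])
# u1 reviews d1, d3, d5 and u2 reviews d2, d4: full, non-overlapping
# coverage of the list, avoiding duplicated review effort.
```

Round-robin is only one policy; role-based division of labor would instead route documents according to each collaborator's role.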
COLLABORATIVE INFORMATION RETRIEVAL: COLLABORATIVE SEARCH SESSION
STRUCTURE OF COLLABORATIVE SEARCH SESSIONS
• The 3 phases of the social search model [Evans and Chi, 2010]
• The 3 phases of the collaborators' behavioral model [Karunakaran et al., 2013]
COLLABORATIVE SEARCH APPROACHES [JOHO ET AL., 2009]
• "Development of new IR models that can take collaboration into account in retrieval."
• "Leverage IR techniques such as relevance feedback, clustering, profiling, and data fusion to support collaborative search while using conventional IR models."
• "Develop search interfaces that allow people to perform search tasks in collaboration."
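The data-fusion technique named in the second approach can be sketched with CombSUM-style merging of collaborators' scored result lists: each collaborator's run contributes its score for a document, and documents found by several collaborators rise in the fused ranking. Document ids and scores are made up for illustration.

```python
from collections import defaultdict

def combsum(runs):
    """Fuse several scored result lists (one per collaborator) by
    summing per-document scores, CombSUM-style; returns the merged
    ranking in decreasing fused-score order."""
    fused = defaultdict(float)
    for run in runs:
        for doc, score in run.items():
            fused[doc] += score
    return sorted(fused, key=fused.get, reverse=True)

merged = combsum([
    {"d1": 0.9, "d2": 0.4},  # collaborator A's run
    {"d2": 0.8, "d3": 0.3},  # collaborator B's run
])
# d2 is boosted to the top because both collaborators retrieved it.
```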
COLLABORATIVE SEARCH INTERFACES
What could be collaborative in search interfaces [Shah, 2012, Thomson and Perry, 2006]:
• Communication tools for defining search strategies and users' roles, as well as for sharing relevant information [Golovchinsky et al., 2011, Kelly and Payne, 2013]
• Awareness tools for reporting collaborators' actions [Diriye and Golovchinsky, 2012, Rodriguez Perez et al., 2011]
• Individual and shared workspaces to ensure mutually beneficial goals
• Algorithmic mediation to monitor collaborators' actions

Two families of collaborative interfaces:
• User-driven collaborative interfaces: collaborators fully active; collaboration support through devices (interactive tabletop) or tools (web interfaces)
• System-mediated collaborative interfaces: collaborators partially active; collaboration support through algorithmic mediation (e.g., document distribution according to roles or not)
COLLABORATIVE SEARCH INTERFACES: USER-DRIVEN COLLABORATIVE INTERFACES
• Coagmento [Shah and González-Ibáñez, 2011a]
• CoFox [Rodriguez Perez et al., 2011] (other interfaces: [Erickson, 2010], [Vivian and Dinet, 2008], ...)
• TeamSearch [Morris et al., 2006] (other interfaces: Fischlar-DiamondTouch [Smeaton et al., 2006], WeSearch [Morris et al., 2010], ...)
COLLABORATIVE SEARCH INTERFACES: SYSTEM-MEDIATED COLLABORATIVE INTERFACES
• Cerchiamo [Golovchinsky et al., 2008]
• Querium [Diriye and Golovchinsky, 2012]
PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
   Challenges and issues
   Understanding Collaborative IR
   Overview
   System-mediated CIR models
   User-driven system-mediated CIR models
   Roadmap
3. Evaluation
4. Challenges ahead
5. Discussion
CHALLENGES
• Conceptual models of IR:
  Static IR: system-based IR, does not learn from users
    e.g. VSM [Salton, 1971], BM25 [Robertson et al., 1995], LM [Ponte and Croft, 1998], PageRank and HITS [Brin and Page, 1998]
  Interactive IR: exploiting feedback from users
    e.g. Rocchio [Rocchio, 1971], relevance-based LM [Lavrenko and Croft, 2001]
  Dynamic IR: learning dynamically from past user-system interactions and predicting future ones
    e.g. iPRP [Fuhr, 2008], interactive exploratory search [Jin et al., 2013]
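The interactive-IR line above hinges on relevance feedback such as Rocchio's reformulation: the query vector is moved toward the centroid of judged-relevant documents and away from non-relevant ones. A minimal sketch over sparse term-weight vectors; the toy vectors and the coefficient values alpha, beta, gamma are illustrative assumptions.

```python
from collections import defaultdict

def rocchio(query_vec, relevant, nonrelevant,
            alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio query reformulation over sparse term->weight dicts:
    new_q = alpha*q + beta*centroid(rel) - gamma*centroid(nonrel),
    keeping only positively weighted terms."""
    new_q = defaultdict(float)
    for t, w in query_vec.items():
        new_q[t] += alpha * w
    for docs, coeff in ((relevant, beta), (nonrelevant, -gamma)):
        if not docs:
            continue
        for d in docs:
            for t, w in d.items():
                new_q[t] += coeff * w / len(docs)
    return {t: w for t, w in new_q.items() if w > 0}

q = rocchio({"recession": 1.0},
            relevant=[{"recession": 0.8, "unemployment": 0.6}],
            nonrelevant=[{"travel": 0.9}])
# The reformulated query gains "unemployment" from the relevant
# document and drops terms pushed negative by non-relevant feedback.
```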
CHALLENGES
1. Learning from user and user-user past interactions
2. Adaptation to multi-faceted and multi-user contexts: skills, expertise, role, etc.
3. Aggregating relevant information nuggets
4. Supporting synchronous vs. asynchronous coordination
5. Modeling collaboration paradigms: division of labor, sharing of knowledge
6. Optimizing the search cost: balancing work (search) and group benefit (task outcome)
EMPIRICAL UNDERSTANDING OF CIR
Objectives
1. Investigating user behavior and search patterns
   Search processes [Shah and González-Ibáñez, 2010, Yue et al., 2014]
   Search tactics and practices [Hansen and Järvelin, 2005, Morris, 2008, Morris, 2013, Amershi and Morris, 2008, Tao and Tombros, 2013, Capra, 2013]
   Role assignment [Imazu et al., 2011, Tamine and Soulier, 2015]
2. Studying the impact of collaborative search settings on performance
   Impact of collaboration on search performance [Shah and González-Ibáñez, 2011b, González-Ibáñez et al., 2013]
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• Study objective: testing the feasibility of Kuhlthau's model of the information seeking process in a collaborative information seeking situation [Shah and González-Ibáñez, 2010]

Stage | Feeling (Affective) | Thoughts (Cognitive) | Actions
Initiation | Uncertainty | General/Vague |
Selection | Optimism | |
Exploration | Confusion, Frustration, Doubt | | Seeking relevant information
Formulation | Clarity | Narrowed, Clearer |
Collection | Sense of direction, Confidence | Increased interest | Seeking relevant or focused information
Presentation | Relief, Satisfaction or disappointment | Clearer or focused |
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• Study objective: testing the feasibility of Kuhlthau's model in collaborative information seeking situations [Shah and González-Ibáñez, 2010]
  Participants: 42 dyads, students or university employees who had already done collaborative work together
  System: Coagmento (http://www.coagmento.org/)
  Sessions: two sessions (S1, S2) running in 7 main phases: (1) tutorial on the system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  Tasks: simulated work tasks, e.g. Task 1: Economic recession
  "A leading newspaper has hired your team to create a comprehensive report on the causes and consequences of the current economic recession in the US. As a part of your contract, you are required to collect all the relevant information from any available online sources that you can find. ... Your report on this topic should address the following issues: reasons behind this recession, effects on some major areas, such as health-care, home ownership, and financial sector (stock market), unemployment statistics over a period of time, proposal execution, and effects of the economy simulation plan, and people's opinions and reactions on economy's downfall"
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING COLLABORATIVE SEARCH PROCESSES
• (Main) study results: the stages of Kuhlthau's model map to collaborative tasks
  Initiation: number of chat messages at the stage and between stages
  Selection: number of chat messages discussing the strategy
  Exploration: number of search queries
  Formulation: number of visited webpages
  Collection: number of collected webpages
  Presentation: number of moving actions for organizing collected snippets
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES
• Study objective: analyzing query (re)formulations and related term sources based on participants' actions [Yue et al., 2014]
  Participants: 20 dyads, students who already knew each other in advance
  System: Collabsearch
  Session: one session running in 7 main phases: (1) tutorial on system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  Tasks: (T1) academic literature search, (T2) travel planning
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EXPLORING SEARCH TACTICS AND PRACTICES
• (Main) study results:
  Individual action-based query reformulation (V, S, Q): no (significant) new findings
  Collaborative action-based query reformulation (SP, QP, C): the influence of communication (C) is task-dependent; the influence of collaborators' queries (QP) is significantly higher than that of one's own previous queries (Q); less influence of the collaborators' workspace (SP) than of one's own workspace (S)
Measures:
• V: percentage of queries with at least one term originating from a results page the participant viewed
• S: percentage of queries with at least one term originating from a page the participant saved
• Q: percentage of queries with at least one overlapping term with previous queries
• SP: percentage of queries for which at least one term originated from the collaborators' workspace
• QP: percentage of queries for which at least one term originated from the collaborators' previous queries
• C: percentage of queries for which at least one term originated from the collaborators' communication
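Measures like Q above are computed directly from a session's query log. A minimal sketch of the own-query term-overlap percentage; whitespace tokenization and the log format are assumptions made for illustration.

```python
def overlap_percentage(queries):
    """Percentage of queries sharing at least one term with any
    earlier query in the same session (a Q-style measure)."""
    seen, hits = set(), 0
    for q in queries:
        terms = set(q.lower().split())
        if terms & seen:
            hits += 1
        seen |= terms
    return 100.0 * hits / len(queries) if queries else 0.0

pct = overlap_percentage(["economic recession", "recession causes", "oil spill"])
# Only the second query reuses an earlier term ("recession"),
# so one query out of three counts toward the percentage.
```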
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• Study objective: understanding differences in users' behavior between role-oriented and non-role-oriented collaborative search sessions [Tamine and Soulier, 2015]
  Participants: 75 dyads, students who already knew each other
  Settings: 25 dyads without roles, 50 dyads with roles (25 PM roles, 25 GS roles)
  System: open-source Coagmento plugin
  Session: one session running in 7 main phases: (1) tutorial on system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  Tasks: three exploratory search tasks, topics from the Interactive TREC track (http://trec.nist.gov/data/t8i/t8i.html)

Tamine, L. and Soulier, L. (2015). Understanding the impact of the role factor in collaborative information retrieval. In Proceedings of the ACM International Conference on Information and Knowledge Management, CIKM '15, pages 43-52.
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• (Main) study results: users with assigned roles behave significantly differently than users without roles

Mean (s.d.) | npq | dt | nf | qn | ql | qo | nbm
W/Role GS Group | 1.71 (1.06) | 9.99 (3.37) | 58.52 (27.13) | 65.91 (31.54) | 4.64 (1.11) | 0.44 (0.18) | 20 (14.50)
IGDiffp | -0.52 | -3.47*** | 1.30*** | 2.09*** | 1.16*** | 0.14*** | 2.23***
W/Role PM Group | 1.88 (1.53) | 10.47 (3.11) | 56.31 (27.95) | 56.31 (27.95) | 2.79 (0.70) | 0.39 (0.08) | 15 (12.88)
IGDiffp | 0.24*** | 1.45*** | -2.42*** | -1.69*** | 0.06*** | -0.23*** | 0.05***
W/oRole Group | 2.09 (1.01) | 13.16 (3.92) | 24.13 (12.81) | 43.58 (16.28) | 3.67 (0.67) | 0.45 (0.10) | 19 (11.34)
p-value/GS: *** *** *** *** *** ***
p-value/PM: *** *** *** *** *** *** *
W/Role vs. W/oRole ANOVA p-val.: ** *** ** *
EMPIRICAL UNDERSTANDING OF CIR
GOAL: STUDYING ROLE ASSIGNMENT
• (Main) study results
  Early and high level of coordination for participants without roles
  Role drift for participants with the PM role
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE
• Study objective: evaluating the synergic effect of collaboration in information seeking [Shah and González-Ibáñez, 2011b]
  Participants: 70 participants, 10 as single users, 30 as dyads
  Settings: C1 (single users), C2 (artificially formed teams), C3 (co-located teams, different computers), C4 (co-located teams, same computer), C5 (remotely located teams)
  System: Coagmento
  Session: one session running in 7 main phases: (1) tutorial on system, (2) demographic questionnaire, (3) task description, (4) timely-bounded task achievement, (5) post-questionnaire, (6) report compilation, (7) questionnaire and interview
  Tasks: one exploratory search task, topic "gulf oil spill"
EMPIRICAL UNDERSTANDING OF CIR
GOAL: EVALUATING THE IMPACT OF COLLABORATION ON SEARCH PERFORMANCE
• (Main) study results
  Value of remote collaboration when the task has clear independent components
  Remotely located teams are able to leverage real interactions, leading to synergic collaboration
  Cognitive load in a collaborative setting is not significantly higher than in an individual one
  • 72. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion EMPIRICAL UNDERSTANDING OF CIR
Lessons learned
• Small-group (critical mass) collaborative search is a common practice despite the lack of specific tools
• The whole is greater than the sum of its parts
• Collaborative search behavior differs from individual search behavior, although some phases of theoretical models of individual search remain valid for collaborative search
• Algorithmic mediation lowers the coordination cost
• Roles structure the collaboration but do not guarantee performance improvement in comparison to no roles
Design implications: revisit IR models and techniques
• Back to the axiomatic relevance hypothesis (Fang et al. 2011)
• Role as a novel variable in IR models?
• Learning to rank from user-system and user-user interactions within multi-session search tasks?
41 / 111
  • 73. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion OVERVIEW OF IR MODELS AND TECHNIQUES DESIGNING COLLABORATIVE IR MODELS: A YOUNG RESEARCH AREA 42 / 111
  • 75. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion OVERVIEW OF IR MODELS AND TECHNIQUES
Collaborative IR models are based on algorithmic mediation: systems reuse users' search activity data to mediate the search
• Data?
Click-through data, queries, viewed results, result rankings, ...
User-user communication
• Mediation?
Routing/suggesting/enhancing queries
Building personalized document rankings
Automatically setting up the division of labor
43 / 111
  • 77. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion OVERVIEW OF IR MODELS AND TECHNIQUES
Notations
Notation | Description
d | Document
q | Query
u_j | User j
g | Collaborative group
t_i | Term i
RSV(d, q) | Relevance Status Value given (d, q)
N | Document collection size
n_i | Number of documents in the collection in which term t_i occurs
R | Number of relevant documents in the collection
R^{u_j} | Number of relevant documents in the collection for user u_j
r_i^{u_j} | Number of relevant documents of user u_j in which term t_i occurs
44 / 111
  • 78. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER GROUP-BASED MEDIATION • Enhancing collaborative search with users’ context [Morris et al., 2008, Foley and Smeaton, 2009a, Han et al., 2016] Division of labor: dividing the work by non-overlapping browsing Sharing of knowledge: exploiting personal relevance judgments, user’s authority 45 / 111
  • 79. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: GROUPIZATION, SMART SPLITTING, GROUP-HIGHLIGHTING [MORRIS ET AL., 2008]
• Hypothesis setting: one or a few synchronous search queries
• 3 approaches
Smart splitting: splitting top-ranked web results using a round-robin technique, personalized splitting of remaining results (document ranking level)
Groupization: reusing individual personalization techniques towards groups (document ranking level)
Group hit-highlighting: highlighting users' keywords (document browsing level)
46 / 111
  • 80. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008]
Personalizing the document ranking: use the revisited BM25 weighting scheme [Teevan et al., 2005]
RSV(d, q, u_j) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u_j)   (1)
w_{BM25}(t_i, u_j) = \log \frac{(r_i^{u_j} + 0.5)(N' - n_i' - R^{u_j} + r_i^{u_j} + 0.5)}{(n_i' - r_i^{u_j} + 0.5)(R^{u_j} - r_i^{u_j} + 0.5)}   (2)
N' = N + R^{u_j}   (3)
n_i' = n_i + r_i^{u_j}   (4)
47 / 111
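As a concrete illustration, Equations (1)-(4) can be sketched in Python (a minimal sketch; function and variable names are ours, not from [Teevan et al., 2005]):

```python
import math

def personalized_bm25_weight(N, n_i, R_u, r_ui):
    """Eq. (2): relevance weight of term t_i for user u_j. The user's
    R_u relevance-judged documents (r_ui of which contain t_i) are
    folded into the collection statistics, per Eqs. (3)-(4)."""
    N_adj = N + R_u        # N' = N + R^{u_j}
    n_adj = n_i + r_ui     # n_i' = n_i + r_i^{u_j}
    num = (r_ui + 0.5) * (N_adj - n_adj - R_u + r_ui + 0.5)
    den = (n_adj - r_ui + 0.5) * (R_u - r_ui + 0.5)
    return math.log(num / den)

def rsv(doc_terms, query_terms, weights):
    """Eq. (1): RSV(d, q, u_j) sums the weights of terms shared by d and q."""
    return sum(weights[t] for t in set(doc_terms) & set(query_terms))
```

As expected, the weight grows with the amount of relevance evidence the user has provided for the term.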
  • 81. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: SMART-SPLITTING [MORRIS ET AL., 2008] Example Smart-splitting according to personalized scores. 48 / 111
  • 82. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B] • Hypothesis setting: multiple independent synchronous search queries • Collaborative relevance feedback: sharing collaborator’s explicit relevance judgments Aggregate the partial user relevance scores Compute the user’s authority weighting 49 / 111
  • 83. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]
• A: Combining inputs of the RF process
puwo(t_i) = \sum_{u=0}^{U-1} r_i^u \, w_{BM25}(t_i)   (5)
w_{BM25}(t_i) = \log \frac{\left(\sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R^u}\right)\left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R^u}\right)}{\left(\sum_{u=0}^{U-1} \alpha_u \frac{n_i - r_i^u}{N - R^u}\right)\left(1 - \sum_{u=0}^{U-1} \alpha_u \frac{r_i^u}{R^u}\right)}   (6)
\sum_{u=0}^{U-1} \alpha_u = 1   (7)
50 / 111
  • 84. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: COLLABORATIVE RELEVANCE FEEDBACK [FOLEY ET AL., 2008, FOLEY AND SMEATON, 2009B]
• B: Combining outputs of the RF process
crwo(t_i) = \sum_{u=0}^{U-1} \alpha_u w_{BM25}(t_i, u)   (8)
w_{BM25}(t_i, u) = \log \frac{\frac{r_i^u}{R^u}\left(1 - \frac{n_i - r_i^u}{N - R^u}\right)}{\frac{n_i - r_i^u}{N - R^u}\left(1 - \frac{r_i^u}{R^u}\right)}   (9)
• C: Combining outputs of the ranking process
RSV(d, q) = \sum_{u=0}^{U-1} \alpha_u RSV(d, q, u)   (10)
RSV(d, q, u) = \sum_{t_i \in d \cap q} w_{BM25}(t_i, u)   (11)
51 / 111
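Variant C (Eqs. 10-11) is a weighted linear fusion of per-user document scores. A minimal sketch (the per-user RSVs and authority weights α_u are assumed to be given; names are ours):

```python
def fuse_rankings(per_user_rsv, alphas):
    """Eq. (10): RSV(d, q) = sum_u alpha_u * RSV(d, q, u).
    per_user_rsv: one {doc_id: score} dict per collaborator.
    alphas: authority weights, assumed to sum to 1 (Eq. 7).
    Returns doc ids sorted by fused score, best first."""
    fused = {}
    for alpha, scores in zip(alphas, per_user_rsv):
        for doc, s in scores.items():
            fused[doc] = fused.get(doc, 0.0) + alpha * s
    return sorted(fused, key=fused.get, reverse=True)
```

Shifting authority weight between collaborators changes which user's evidence dominates the fused ranking.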
  • 85. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]
• Exploit a 3-dimensional context:
Individual search history HQU (queries, results, bookmarks, etc.)
Collaborative group HCL: collaborators' search history (queries, results, bookmarks, etc.)
Collaboration HCH: collaboration behavior chat (communication)
52 / 111
  • 86. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS USER/GROUP-BASED MEDIATION: CONTEXT-BASED COLLABORATIVE SEARCH [HAN ET AL., 2016]
1 Building a document ranking RSV(q, d) and generating Rank(d)
2 Building the document language model θ_d
3 Building the context language model θ_{H_x}
p(t_i | H_x) = \frac{1}{K} \sum_{k=1}^{K} p(t_i | X_k)   (12)
p(t_i | X_k) = \frac{n(t_i, X_k)}{|X_k|}   (13)
4 Computing the KL-divergence between θ_{H_x} and θ_d
D(\theta_d, \theta_{H_x}) = - \sum_{t_i} p(t_i | \theta_d) \log p(t_i | H_x)   (14)
5 Learning to rank using pairwise features (Rank(d), D(θ_d, θ_{H_x}))
53 / 111
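Steps 3-4 can be sketched as follows (a minimal sketch: we use a maximum-likelihood estimate for p(t_i|X_k) and a small floor to avoid log 0; [Han et al., 2016] may smooth differently):

```python
import math
from collections import Counter

def context_lm(history_items):
    """Eqs. (12)-(13): average the term distributions of the K history
    items X_k (each item is a list of terms)."""
    K = len(history_items)
    lm = Counter()
    for item in history_items:
        counts = Counter(item)
        for t, n in counts.items():
            lm[t] += (n / len(item)) / K   # (1/K) * sum_k p(t|X_k)
    return lm

def divergence(theta_d, theta_h, eps=1e-9):
    """Eq. (14): D = - sum_t p(t|theta_d) * log p(t|H_x)."""
    return -sum(p * math.log(theta_h.get(t, 0.0) + eps)
                for t, p in theta_d.items())
```

A document whose language model matches the search-history context receives a smaller divergence, i.e. a stronger context feature for the learning-to-rank step.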
  • 87. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION
Enhancing collaborative search with users' roles [Pickens et al., 2008, Shah et al., 2010, Soulier et al., 2014b]
• Division of labor: dividing the work based on users' role peculiarities
• Sharing of knowledge: splitting the search results
54 / 111
  • 88. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]
• Prospector/Miner as functional roles supported by algorithms:
Prospector: ”..opens new fields for exploration into a data collection..”. → Draws ideas from algorithmically suggested query terms
Miner: ”..ensures that rich veins of information are explored...”. → Refines the search by judging highly ranked (unseen) documents
• Collaborative system architecture:
Algorithmic layer: functions combining users' search activities to produce outcomes fitted to roles (queries, document rankings).
Regulator layer: captures inputs (search activities), calls the appropriate functions of the algorithmic layer, and routes the outputs of the algorithmic layer to the appropriate role (user).
55 / 111
  • 89. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: PROSPECTOR AND MINER [PICKENS ET AL., 2008]
• Prospector function: highly relevant terms are suggested based on:
Score(t_i) = \sum_{L_q \in L} w_r(L_q) w_f(L_q) rlf(t_i; L_q)   (15)
rlf(t_i; L_q): number of documents in L_q in which t_i occurs.
• Miner function: unseen documents are queued according to
RSV(q, d) = \sum_{L_q \in L} w_r(L_q) w_f(L_q) borda(d; L_q)   (16)
w_r(L_q) = \frac{|rel \in L_q|}{|seen \in L_q|}   (17)
w_f(L_q) = \frac{|unseen \in L_q|}{|L_q|}   (18)
56 / 111
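The Miner's fused ranking (Eq. 16) can be sketched as follows. This is our reading of the weights: w_r rewards query lists whose seen documents proved relevant, w_f rewards lists with many unexplored documents, and borda(d; L_q) rewards documents ranked high in L_q; the exact definitions in [Pickens et al., 2008] may differ:

```python
def borda(doc, ranked_list):
    """Positional score: the top of an n-document list gets n points."""
    return len(ranked_list) - ranked_list.index(doc) if doc in ranked_list else 0

def miner_rsv(doc, query_lists, seen, relevant):
    """Eq. (16): sum over past result lists L_q of w_r * w_f * borda."""
    score = 0.0
    for L in query_lists:
        seen_in = [d for d in L if d in seen]
        if not seen_in:
            continue  # no judged evidence for this list yet
        w_r = sum(1 for d in seen_in if d in relevant) / len(seen_in)
        w_f = sum(1 for d in L if d not in seen) / len(L)
        score += w_r * w_f * borda(doc, L)
    return score
```

Unseen documents ranked high in productive, still-fresh lists are queued first for the Miner.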
  • 90. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: GATHERER AND SURVEYOR [SHAH ET AL., 2010]
• Gatherer/Surveyor as functional roles supported by algorithms:
Gatherer: ”..scan results of joint search activity to discover most immediately relevant documents..”.
Surveyor: ”..browse a wider diversity of information to get a better understanding of the collection being searched...”.
• Main functions:
Merging: merging (e.g. CombSUM) the document rankings of collaborators
Splitting: routing the appropriate documents according to roles (e.g. k-means clustering). High precision for the Gatherer, high diversity for the Surveyor
57 / 111
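A minimal sketch of the merge step (CombSUM) and a precision/diversity split. The paper uses k-means clustering for the split; here a simple greedy max-min diversity selection stands in for it:

```python
def combsum(rankings):
    """CombSUM merge: sum each document's scores across collaborators."""
    merged = {}
    for scores in rankings:
        for doc, s in scores.items():
            merged[doc] = merged.get(doc, 0.0) + s
    return merged

def split(merged, doc_vectors, k, distance):
    """Gatherer: top-k by fused score (high precision).
    Surveyor: greedy max-min diversity over the remaining documents."""
    by_score = sorted(merged, key=merged.get, reverse=True)
    gatherer, pool = by_score[:k], by_score[k:]
    surveyor = [pool.pop(0)] if pool else []
    while pool and len(surveyor) < k:
        d = max(pool, key=lambda x: min(distance(doc_vectors[x], doc_vectors[s])
                                        for s in surveyor))
        surveyor.append(d)
        pool.remove(d)
    return gatherer, surveyor
```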
  • 91. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE Domain expert/Domain novice as knowledge-based roles supported by algorithms:
• Domain expert: ”..represent problems at deep structural levels and are generally interested in discovering new associations among different aspects of items, or in delineating the advances in a research focus surrounding the query topic..”.
• Domain novice: ”..represent problems in terms of surface or superficial aspects and are generally interested in enhancing their learning about the general query topic..”.
Soulier, L., Tamine, L., and Bahsoun, W. (2014b). On domain expertise-based roles in collaborative information retrieval. Information Processing & Management (IP&M), 50(5):752–774.
58 / 111
  • 92. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two-step algorithm:
1 Role-based document relevance scoring
P^k(d | u_j, q) \propto P^k(u_j | d) \cdot P^k(d | q)   (19)
P(q | \theta_d) \propto \prod_{(t_i, w_{iq}) \in q} [\lambda P(t_i | \theta_d) + (1 - \lambda) P(t_i | \theta_C)]^{w_{iq}}   (20)
P^k(u_j | d) \propto P(\pi(u_j)^k | \theta_d) \propto \prod_{(t_i, w_{ij}^k) \in \pi(u_j)^k} [\lambda_{dj}^k P(t_i | \theta_d) + (1 - \lambda_{dj}^k) P(t_i | \theta_C)]^{w_{ij}^k}   (21)
59 / 111
  • 93. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two-step algorithm:
1 Role-based document relevance scoring: parameter smoothing using evidence from novelty and specificity
\lambda_{dj}^k = \frac{Nov(d, D(u_j)^k) \cdot Spec(d)^{\beta}}{\max_{d' \in D} Nov(d', D(u_j)^k) \cdot Spec(d')^{\beta}}   (22)
with \beta = 1 if u_j is an expert, \beta = -1 if u_j is a novice
Novelty: Nov(d, D(u_j)^k) = \min_{d' \in D(u_j)^k} d(d, d')   (23)
Specificity: Spec(d) = avg_{t_i \in d} spec(t_i) = avg_{t_i \in d} (-\log(f_{t_i}/N))^{\alpha}, where f_{t_i} is the number of documents containing t_i   (24)
60 / 111
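Eqs. (22)-(24) can be sketched as follows (a minimal sketch: the inter-document distance function, the document-frequency table df, and the α, β parameters are supplied by the caller; the novelty of a document defaults to 1 when the user has seen nothing yet):

```python
import math

def specificity(doc_terms, df, N, alpha):
    """Eq. (24): average of (-log(df(t)/N))^alpha over the document's terms."""
    return sum((-math.log(df[t] / N)) ** alpha for t in doc_terms) / len(doc_terms)

def smoothing_lambda(d, seen_docs, candidates, dist, df, N, alpha, beta):
    """Eqs. (22)-(23): novelty (min distance to documents already seen
    by u_j) times specificity^beta, normalized by the max over the
    candidate set. beta = +1 for an expert, -1 for a novice."""
    def raw(doc):
        nov = min(dist(doc, s) for s in seen_docs) if seen_docs else 1.0
        return nov * specificity(doc["terms"], df, N, alpha) ** beta
    return raw(d) / max(raw(c) for c in candidates)
```

With β = +1 the smoothing favors specific (expert-oriented) documents; with β = −1 the preference is inverted for the novice.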
  • 94. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
A two-step algorithm:
2 Document allocation to collaborators
Classification based on the Expectation-Maximization (EM) algorithm
E-step: probability that document d belongs to the collaborator's relevance class
P(R_j = 1 | x_{dj}^k) = \frac{\alpha_j^k \cdot \phi_j^k(x_{dj}^k)}{\alpha_j^k \cdot \phi_j^k(x_{dj}^k) + (1 - \alpha_j^k) \cdot \psi_j^k(x_{dj}^k)}   (25)
M-step: parameter updating and likelihood estimation
Document allocation to collaborators by comparison of document ranks within collaborators' lists:
r_{jj'}^k(d, \delta_j^k, \delta_{j'}^k) = 1 if rank(d, \delta_j^k) < rank(d, \delta_{j'}^k), 0 otherwise   (26)
Division of labor: displaying distinct document lists between collaborators
61 / 111
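The E-step responsibility (Eq. 25) and the rank-based allocation (Eq. 26) can be sketched as follows (φ and ψ are the class-conditional densities, assumed given; names are ours):

```python
def e_step(alpha_j, phi, psi, x):
    """Eq. (25): posterior probability that the feature vector x of
    document d belongs to collaborator u_j's relevance class."""
    num = alpha_j * phi(x)
    return num / (num + (1 - alpha_j) * psi(x))

def allocate(doc, rank_u1, rank_u2):
    """Eq. (26): give the document to the collaborator in whose list it
    ranks higher, so the displayed lists stay distinct (division of labor)."""
    return "u1" if rank_u1[doc] < rank_u2[doc] else "u2"
```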
  • 95. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example: Applying the Expert/Novice CIR model
Let's consider:
• A collaborative search session with two users u1 (expert) and u2 (novice).
• A shared information need I modeled through a query q.
• A collection of 10 documents and their associated relevance scores with respect to the shared information need I.
Term counts:
     t1 t2 t3 t4
q    1  0  1  0
d1   2  3  1  1
d2   0  0  5  3
d3   2  1  7  6
d4   4  1  0  0
d5   2  0  0  0
d6   3  0  0  0
d7   7  1  1  1
d8   3  3  3  3
d9   1  4  5  0
d10  0  0  4  0
Weighting vectors of the documents and the query:
q = (0.5, 0, 0.5, 0)
d1 = (0.29, 0.43, 0.14, 0.14), d2 = (0, 0, 0.63, 0.37), d3 = (0.12, 0.06, 0.44, 0.28), d4 = (0.8, 0.2, 0, 0), d5 = (1, 0, 0, 0), d6 = (0.3, 0, 0, 0.7), d7 = (0.7, 0.1, 0.1, 0.1), d8 = (0.25, 0.25, 0.25, 0.25), d9 = (0.1, 0.4, 0.5, 0), d10 = (0, 0, 1, 0)
Users' profiles: π(u1)^0 = π(u2)^0 = q
62 / 111
  • 96. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example: Applying the Expert/Novice CIR model
     RSV(q, d)  rank(d)  Spec(d)
d1   0.24       2        0.19
d2   0.02       7        0.23
d3   0.17       3        0.19
d4   0.03       6        0.15
d5   0.01       9        0.1
d6   0.02       8        0.1
d7   0.10       4        0.19
d8   0.31       1        0.19
d9   0.09       5        0.16
d10  0.01       10       0.15
• The document specificity is estimated with α = 3 (if a term has a collection frequency equal to 1, −log(1/10) = 2.30):
Spec(d1) = [(−log(8/10))^3 + (−log(6/10))^3 + (−log(7/10))^3 + (−log(5/10))^3] / 4 = 0.19
Spec(d2) = 0.23, Spec(d3) = 0.19, Spec(d4) = 0.15, Spec(d5) = 0.1, Spec(d6) = 0.1, Spec(d7) = 0.19, Spec(d8) = 0.19, Spec(d9) = 0.16, Spec(d10) = 0.15
• Iteration 0: distributing the top 6 documents to the users: the 3 most specific to the expert and the 3 least specific to the novice.
Expert u1: l^0(u1, D^0_ns) = {d8, d1, d3}
Novice u2: l^0(u2, D^0_ns) = {d7, d9, d4}
63 / 111
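The iteration-0 allocation (the 3 most specific of the top-6 documents to the expert, the 3 least specific to the novice) can be sketched and checked against the table above; breaking specificity ties by retrieval rank is our assumption:

```python
def iteration0_split(rank, spec, k=6):
    """Distribute the top-k retrieved documents: the k/2 most specific
    go to the expert, the k/2 least specific to the novice (rank order
    preserved within each list)."""
    top = sorted(rank, key=rank.get)[:k]                      # top-k by RSV rank
    by_spec = sorted(top, key=lambda d: (-spec[d], rank[d]))  # most specific first
    expert = sorted(by_spec[:k // 2], key=rank.get)
    novice = sorted(by_spec[k // 2:], key=rank.get)
    return expert, novice
```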
  • 97. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion SYSTEM-MEDIATED CIR MODELS ROLE-BASED MEDIATION: DOMAIN EXPERT AND DOMAIN NOVICE [SOULIER ET AL., IP&M 2014B]
Example: Applying the Expert/Novice CIR model
• Iteration 1. Let's consider that user u2 selected document d4 (D(u1)1 = {d4, d5}).
Building the users' profiles:
π(u1)^1 = (0.5, 0, 0.5, 0)
π(u2)^1 = ((0.5 + 0.8)/2, 0.2/2, 0.5/2, 0) = (0.65, 0.1, 0.25, 0)
Estimating the document relevance with respect to collaborators. For user u1:
P^1(d1 | u1, q) = P^1(d1 | q) · P^1(u1 | d1) = 0.24 · 0.22 = 0.05
P^1(d1 | q) = 0.24, since the initial ranking with respect to q is unchanged.
P^1(u1 | d1) = (0.85 · 2/7 + 0.15 · 24/84)^{0.5} · (0.85 · 3/7 + 0.15 · 13/84)^{0} · (0.85 · 1/7 + 0.15 · 26/84)^{0.5} · (0.85 · 1/7 + 0.15 · 21/84)^{0} = 0.22
λ^1_{11} = (1 · 0.19)/0.23 = 0.85, where 0.19 expresses the specificity of document d1, 1 is the document novelty score, and 0.23 the normalization score.
The normalized document scores for each collaborator are the following:
      P^1(d|u1)  P^1(d|u2)
d1    0.23       0.28
d2    0          0.03
d3    0.16       0.11
d5    0.01       0.01
d6    0.03       0.02
d7    0.12       0.14
d8    0.34       0.34
d9    0.10       0.06
d10   0.01       0.01
64 / 111
  • 99. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH
Soulier, L., Shah, C., and Tamine, L. (2014a). User-driven System-mediated Collaborative Information Retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR '14, pages 485–494. ACM.
66 / 111
  • 100. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• Identifying users' search behavior differences: estimating the significance of differences using the Kolmogorov-Smirnov test
• Characterizing users' roles
67 / 111
  • 101. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• Categorizing users' roles
R_u = \arg\min_{R^{1,2}} ||F^{R^{1,2}} \ominus C^{(t_l)}_{u_1,u_2}||   (27)
subject to: \forall (f_j, f_k) \in K^{R^{1,2}}, F^{R^{1,2}}(f_j, f_k) - C^{(t_l)}_{u_1,u_2}(f_j, f_k) > -1
where \ominus is defined element-wise as:
F^{R^{1,2}}(f_j, f_k) \ominus C^{(t_l)}_{u_1,u_2}(f_j, f_k) = F^{R^{1,2}}(f_j, f_k) - C^{(t_l)}_{u_1,u_2}(f_j, f_k) if F^{R^{1,2}}(f_j, f_k) \in \{-1, 1\}, 0 otherwise
• Personalizing the search: [Pickens et al., 2008, Shah, 2011]...
68 / 111
  • 102. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
• Users' roles modeled through patterns
Intuition: the number of visited documents and the number of submitted queries are negatively correlated
Role pattern P^{R^{1,2}}
Search feature kernel K^{R^{1,2}}
Search feature-based correlation matrix F^{R^{1,2}}, with
F^{R^{1,2}}(f_j, f_k) = 1 if positively correlated, -1 if negatively correlated, 0 otherwise
69 / 111
  • 103. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example: Mining roles of collaborators
A collaborative search session implies two users u1 and u2 aiming at identifying information dealing with “global warming”. We present the search actions of the collaborators for the first 5 minutes of the session.
u | t | actions | additional information
u2 | 0 | submitted query “global warming” |
u1 | 1 | submitted query “global warming” |
u2 | 8 | document d1: visited | comment: “interesting”
u2 | 12 | document d2: visited |
u2 | 17 | document d3: visited | rated: 4/5
u2 | 19 | document d4: visited |
u1 | 30 | submitted query “greenhouse effect” |
u1 | 60 | submitted query “global warming definition” |
u1 | 63 | document d20: visited | rated: 3/5
u1 | 70 | submitted query “global warming protection” |
u1 | 75 | document d21: visited |
u2 | 100 | document d5: visited | rated: 5/5
u2 | 110 | document d6: visited | rated: 4/5
u2 | 120 | document d7: visited |
u1 | 130 | submitted query “gas emission” |
u1 | 132 | document d22: visited | rated: 4/5
u2 | 150 | document d8: visited |
u2 | 160 | document d9: visited |
u2 | 170 | document d10: visited |
u2 | 200 | document d11: visited | comment: “great”
u2 | 220 | document d12: visited |
u2 | 240 | document d13: visited |
u1 | 245 | submitted query “global warming world protection” |
u1 | 250 | submitted query “causes temperature changes” |
u1 | 298 | submitted query “global warming world politics” |
70 / 111
  • 104. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example: Mining roles of collaborators: matching with role patterns
• Role patterns
Roles of reader-querier:
F^{R_{read,querier}} = [1 -1; -1 1], K^{R_{read,querier}} = {(N_q, N_p)}
Role: (S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{read,querier}) → {(reader, querier), (querier, reader)}
→ (reader, querier) if S^{(t_l)}_{u_1}(t_l, N_p) > S^{(t_l)}_{u_2}(t_l, N_p), (querier, reader) otherwise
Roles of judge-querier:
F^{R_{judge,querier}} = [1 -1; -1 1], K^{R_{judge,querier}} = {(N_q, N_c)}
Role: (S^{(t_l)}_{u_1}, S^{(t_l)}_{u_2}, R_{judge,querier}) → {(judge, querier), (querier, judge)}
→ (judge, querier) if S^{(t_l)}_{u_1}(t_l, N_c) > S^{(t_l)}_{u_2}(t_l, N_c), (querier, judge) otherwise
71 / 111
  • 105. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example: Mining roles of collaborators
• Track users' behavior every 60 seconds
• F = {Nq, Nd, Nc, Nr}: respectively, number of queries, visited documents, comments, and ratings
• Users' search behavior (one row per 60-second snapshot):
S_{u_1}^{(300)} =
[3 0 0 0
 4 2 0 1
 5 3 0 2
 5 3 0 2
 8 3 0 2]
S_{u_2}^{(300)} =
[1 4 1 1
 1 7 1 3
 1 10 1 3
 1 13 2 3
 1 13 2 3]
• Collaborators' search differences (difference matrix and Kolmogorov-Smirnov test):
Δ_{u_1,u_2}^{(300)} =
[2 -4 -1 -1
 3 -5 -1 -2
 4 -7 -1 -1
 4 -10 -2 -1
 7 -10 -2 -1]
- Number of queries: p_{u_1,u_2}^{(t_l)}(N_q) = 0.01348
- Number of pages: p_{u_1,u_2}^{(t_l)}(N_d) = 0.01348
- Number of comments: p_{u_1,u_2}^{(t_l)}(N_c) = 0.01348
- Number of ratings: p_{u_1,u_2}^{(t_l)}(N_r) = 0.08152
72 / 111
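The significance of the behavioral differences is assessed with a two-sample Kolmogorov-Smirnov test; a pure-Python sketch of the KS statistic itself (the p-value computation is omitted; in practice a library routine such as scipy.stats.ks_2samp would be used):

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample KS statistic: maximum gap between empirical CDFs."""
    points = sorted(set(sample_a) | set(sample_b))
    def ecdf(sample, x):
        return sum(1 for v in sample if v <= x) / len(sample)
    return max(abs(ecdf(sample_a, x) - ecdf(sample_b, x)) for x in points)
```

Applied to the query-count snapshots above, the two users' samples do not overlap at all, so the statistic reaches its maximum of 1.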
  • 106. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion USER-DRIVEN SYSTEM-MEDIATED CIR MODELS MINE USERS' ROLES THEN PERSONALIZE THE SEARCH [SOULIER ET AL., SIGIR 2014A]
Example: Mining roles of collaborators: matching with role patterns
• Collaborators' search action complementarity: correlation matrix between search differences
C_{u_1,u_2}^{(300)} =
[ 1          -0.8186713  -0.731925  0
 -0.8186713   1           0.9211324 0
 -0.731925    0.9211324   1         0
  0           0           0         0]
• Role mining: comparing each role pattern with the sub-matrix of collaborators' behaviors
Role of reader-querier:
F^{R_{read,querier}} \ominus C_{u_1,u_2}^{(300)} = [0, -1 - (-0.8186713); -1 - (-0.8186713), 0] = [0, -0.1813287; -0.1813287, 0]
The Frobenius norm over the kernel pair is √(0.1813287²) = 0.1813287.
Role of judge-querier:
F^{R_{judge,querier}} \ominus C_{u_1,u_2}^{(300)} = [0, -1 - (-0.731925); -1 - (-0.731925), 0] = [0, -0.268075; -0.268075, 0]
The Frobenius norm over the kernel pair is √(0.268075²) = 0.268075.
→ The collaborators act as reader/querier, with u1 labeled as querier and u2 as reader (highest N_p).
73 / 111
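The pattern-matching step can be sketched as follows. Note that the full-matrix Frobenius norm counts both symmetric off-diagonal entries (a factor √2 relative to the single-entry value reported on the slide), which does not change the comparison between role patterns:

```python
import math

def circled_minus(F, C):
    """Element-wise "F minus C" restricted to constrained entries:
    F - C where F is +1/-1, 0 elsewhere."""
    return [[F[i][j] - C[i][j] if F[i][j] in (-1, 1) else 0.0
             for j in range(len(F[0]))] for i in range(len(F))]

def frobenius(M):
    return math.sqrt(sum(x * x for row in M for x in row))

F = [[1, -1], [-1, 1]]
# observed correlation sub-matrices for the two kernels (example values)
C_read = [[1, -0.8186713], [-0.8186713, 1]]
C_judge = [[1, -0.731925], [-0.731925, 1]]
dist_read = frobenius(circled_minus(F, C_read))
dist_judge = frobenius(circled_minus(F, C_judge))
```

The reader-querier pattern yields the smaller distance, so it is the role pair assigned to the collaborators.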
  • 107. 1. Collaboration and Information Retrieval Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion OVERVIEW OF IR MODELS AND TECHNIQUES
[Comparison table of CIR models: [Foley and Smeaton, 2009a], [Morris et al., 2008] “smart-splitting”, [Morris et al., 2008] “groupization”, [Pickens et al., 2008], [Shah et al., 2010], [Soulier et al., IP&M 2014b], and [Soulier et al., SIGIR 2014a], characterized along three dimensions: Relevance (collective vs. individual), Evidence source (feedback, interest, expertise, behavior, role), and Paradigm (division of labor vs. sharing of knowledge).]
74 / 111
  • 108. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PLAN
1. Collaboration and Information Retrieval
2. Collaborative IR techniques and models
3. Evaluation
Evaluation challenges
Protocols
Metrics and ground truth
Baselines
Tools and datasets
4. Challenges ahead
5. Discussion
75 / 111
  • 109. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion EVALUATION CHALLENGES
Evaluating the collective relevance:
• Learning from user and user-user past interactions
• Adaptation to multi-faceted and multi-user contexts: skills, expertise, roles, etc.
• Aggregating relevant information nuggets
Measuring the collaborative effectiveness:
• Supporting synchronous vs. asynchronous coordination
• Modeling collaboration paradigms: division of labor, sharing of knowledge
• Optimizing search cost: balance between individual work (search) and group benefit (task outcome)
76 / 111
  • 111. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS CATEGORIES OF PROTOCOLS • Standard evaluation frameworks Without humans: batch-based evaluation (TREC, INEX, CLEF, ...) With humans in the process (recommended) c [Dumais, 2014] 78 / 111
  • 112. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS CATEGORIES OF PROTOCOLS • Standard evaluation frameworks Without humans: batch-based evaluation (TREC, INEX, CLEF, ...) With humans in the process (recommended) • CIR-adapted evaluation frameworks 79 / 111
  • 114. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS BATCH: COLLABORATION SIMULATION [MORRIS ET AL., 2008, SHAH ET AL., 2010] • Real users formulating queries w.r.t. the shared information need 15 individual users asked to list queries they would associate to 10 TREC topics. Then, pairs of collaborators are randomly built [Shah et al., 2010] 10 groups of 3 participants asked to list collaboratively 6 queries related to the information need [Morris et al., 2008] • Simulating the collaborative rankings on the participants’ queries Advantages: • Larger number of experimental tests (parameter tuning, more baselines, ...) • Less costly and less time consuming than user studies Limitations: • Small manifestation of the collaborative aspects • No span of the collaborative search session • Difficult to evaluate the generalization of findings 80 / 111
  • 119. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS LOG-STUDY: COLLABORATION SIMULATION [FOLEY AND SMEATON, 2009A, SOULIER ET AL., 2014B]
• Individual search logs (from user studies or official benchmarks)
• Chronological synchronization of individual search actions
• Simulating the collaborative rankings on the users' queries
Advantages:
• Modeling of a collaborative session
• Larger number of experimental tests (parameter tuning, more baselines, ...)
• Less costly and less time-consuming than user studies
Limitations:
• No real manifestation of the collaborative aspects
• Difficult to evaluate the generalization of findings
82 / 111
  • 121. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS LOG-STUDIES: COLLABORATIVE SEARCH LOGS [SOULIER ET AL., 2014A]
• Real logs of collaborative search sessions
• CIR ranking model launched on the participants' queries
Advantages:
• A step forward toward realistic collaborative scenarios
• Queries resulting from a collaborative search process
Limitations:
• Costly and time-consuming, unless data are already available
• Only implicit feedback on the retrieved document lists
83 / 111
  • 123. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion PROTOCOLS USER-STUDIES [PICKENS ET AL., 2008] • Real users performing the collaborative task • CIR models launched in real time in response to users’ actions Advantages: • One of the most realistic scenarios (real users instead of panels) Limitations: • Costly and time-consuming • Controlled tasks in laboratory settings 84 / 111
  • 124. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS CATEGORIES OF METRICS Evaluation Objectives in collaborative search • Measuring the retrieval effectiveness of the ranking models • Measuring the search effectiveness of the collaborative groups • Measuring collaborators’ satisfaction and cognitive effort • Analyzing collaborators’ behavior • User-driven metrics/indicators aiming at evaluating: The collaborators’ awareness and satisfaction [Aneiros and Morris, 2003, Smyth et al., 2005] The cognitive effort The search outcomes • System-oriented metrics/indicators aiming at evaluating: The retrieval effectiveness of the ranking models The compliance of the ranking models with the collaborative paradigms (e.g., division of labor) The collaborative relevance of documents (→ ground truth) 85 / 111
  • 126. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS USER-DRIVEN METRICS • Search log analysis Behavioral analysis: collaborators’ actions [Tamine and Soulier, 2015]
Feature | Description
npq | Average number of visited pages by query
dt | Average time spent between two visited pages
nf | Average number of relevance feedback information (snippets, annotations & bookmarks)
qn | Average number of submitted queries
ql | Average number of query tokens
qo | Average ratio of shared tokens among successive queries
nbm | Average number of exchanged messages within the search groups
Behavioral analysis: communication channels [González-Ibáñez et al., 2013, Strijbos et al., 2004] 86 / 111
  • 127. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS USER-DRIVEN METRICS • Search log analysis Behavioral analysis: collaborators’ actions and communication channels Search outcomes [Shah, 2014] (evidence sources: visited documents, relevant documents, dwell-time, number of visits)
Metric | Description
(Unique) Coverage | (unique) visited webpages
Likelihood of discovery | number-of-visits-based IDF metric
(Unique) Useful pages | (unique) number of useful pages (visited more than 30 seconds)
Precision | number of distinct relevant and visited pages over the number of distinct visited pages
Recall | number of distinct relevant and visited pages over the number of distinct relevant pages
F-measure | Combination of precision and recall
87 / 111
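Several of these indicators can be computed directly from a visit log. The sketch below is a simplified illustration: the (user, page, dwell-time) log format is hypothetical, and "likelihood of discovery" is read here as one plausible IDF-style score over the number of group members who visited each page.

```python
import math

# Hypothetical visit log: (user, page, dwell_time_in_seconds)
visits = [("u1", "d1", 12), ("u1", "d2", 45), ("u2", "d2", 60),
          ("u2", "d3", 8), ("u1", "d3", 35)]

# (Unique) coverage: distinct visited pages
unique_coverage = {page for _, page, _ in visits}

# Useful pages: pages with at least one visit longer than 30 seconds
useful_pages = {page for _, page, dwell in visits if dwell > 30}

# IDF-style "likelihood of discovery": higher for pages seen by fewer members
group = {user for user, _, _ in visits}
visitors = {}
for user, page, _ in visits:
    visitors.setdefault(page, set()).add(user)
discovery = {page: math.log(len(group) / len(users))
             for page, users in visitors.items()}
```

Under this reading, a page visited by every group member gets a discovery score of 0, while a page found by a single member gets the highest score.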
  • 129. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS USER-DRIVEN METRICS Exercise Estimating the search outcome effectiveness of a collaborative search session (Coverage, Relevant Coverage, Precision, Recall, F-measure). • Let’s consider: a collaborative search session involving two users u1 and u2 aiming at solving an information need I. During the session, u1 selected the following documents: {d1, d2, d6, d9, d17, d20} During the session, u2 selected the following documents: {d3, d4, d5, d6, d7} a collection of 20 documents D = {d_i ; i = 1, ..., 20}, a ground truth for the information need I: GT_I = {d2, d6, d15} • Evaluation metrics: UniqueCoverage(g) = {d1, d2, d3, d4, d5, d6, d7, d9, d17, d20}. RelevantCoverage(g) = {d2, d6}. Precision(g) = 2/10 = 0.2 Recall(g) = 2/3 ≈ 0.67 F-measure(g) = (2 · 0.2 · 0.67) / (0.2 + 0.67) ≈ 0.31 88 / 111
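The exercise can be checked with a few set operations, a minimal sketch of the group-level outcome metrics (document identifiers are plain strings here):

```python
# Documents selected by each collaborator, and the ground truth for need I
selected_u1 = {"d1", "d2", "d6", "d9", "d17", "d20"}
selected_u2 = {"d3", "d4", "d5", "d6", "d7"}
ground_truth = {"d2", "d6", "d15"}

unique_coverage = selected_u1 | selected_u2          # 10 distinct documents
relevant_coverage = unique_coverage & ground_truth   # {d2, d6}

precision = len(relevant_coverage) / len(unique_coverage)  # 2/10 = 0.2
recall = len(relevant_coverage) / len(ground_truth)        # 2/3 ≈ 0.67
f_measure = 2 * precision * recall / (precision + recall)  # ≈ 0.31
```

Note that d6, selected by both users, counts only once in the unique coverage: the union naturally deduplicates the group's findings.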
  • 131. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS USER-DRIVEN METRICS • Questionnaires and interviews The “TLX instrument form”: measuring the cognitive effort Satisfaction interviews [Shah and González-Ibáñez, 2011a, Tamine and Soulier, 2015]
Question | Answer type
Have you already participated in such a user study? If yes, please describe it. | Free-answer
What do you think about this collaborative manner of seeking information? | Free-answer
What was the level of difficulty of the task? | a) Easy (Not difficult) b) Moderately difficult c) Difficult
What was task difficulty related to? | Free-answer
Could you say that the collaborative system supports your search? | a) Yes b) Not totally c) Not at all
How could we improve this system? | Free-answer
89 / 111
  • 132. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS SYSTEM-ORIENTED METRICS [SOULIER ET AL., 2014A]
• The precision Prec@R(g) at rank R of a collaborative group g:
Prec@R(g) = (1/|T(g)|) Σ_{t=1}^{|T(g)|} Prec@R(g)(t) = (1/|T(g)|) Σ_{t=1}^{|T(g)|} RelCov@R(g)(t) / Cov@R(g)(t)   (28)
• The recall Recall@R(g) at rank R of group g:
Recall@R(g) = (1/|T(g)|) Σ_{t=1}^{|T(g)|} Recall@R(g)(t) = (1/|T(g)|) Σ_{t=1}^{|T(g)|} RelCov@R(g)(t) / |RelDoc|   (29)
• The F-measure F@R(g) at rank R of a collaborative group g:
F@R(g) = (1/|T(g)|) Σ_{t=1}^{|T(g)|} [2 · Prec@R(g)(t) · Recall@R(g)(t) / (Prec@R(g)(t) + Recall@R(g)(t))]   (30)
90 / 111
  • 133. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS SYSTEM-ORIENTED METRICS AND GROUND TRUTH Example Estimating the retrieval effectiveness of the rankings of CIR models (Coverage, Relevant Coverage, Precision, Recall, F-measure). Ground truth GT_I = {d2, d6, d15}
Query | Document ranking
q1 | d1, d2, d3
q2 | d2, d8, d14
q3 | d17, d3, d8
q4 | d9, d15, d2
q5 | d1, d5, d3
q6 | d20, d3, d1
q7 | d5, d2, d4
91 / 111
  • 134. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS SYSTEM-ORIENTED METRICS AND GROUND TRUTH Example Estimating the retrieval effectiveness of the rankings of CIR models. Evaluation metrics:
Query pairs | Coverage | Relevant Coverage | Precision | Recall | F-measure
q1-q2 | d1, d2, d3, d8, d14 | d2 | 1/5 | 1/3 | 0.25
q2-q3 | d2, d8, d14, d17, d3 | d2 | 1/5 | 1/3 | 0.25
q3-q4 | d17, d3, d8, d9, d15, d2 | d15, d2 | 2/6 | 2/3 | 0.44
q3-q7 | d17, d3, d8, d5, d2, d4 | d2 | 1/6 | 1/3 | 0.22
q5-q7 | d1, d3, d5, d2, d4 | d2 | 1/5 | 1/3 | 0.25
q6-q7 | d20, d3, d1, d5, d2, d4 | d2 | 1/6 | 1/3 | 0.22
Average | | | 0.21 | 0.39 | 0.27
92 / 111
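Equations (28)-(30) can be replayed on this example by taking, for each query pair, the union of the two top-3 rankings as the coverage. This is a minimal sketch; the rankings, ground truth and query pairing are those of the example, while the helper names are ours.

```python
rankings = {
    "q1": ["d1", "d2", "d3"],  "q2": ["d2", "d8", "d14"],
    "q3": ["d17", "d3", "d8"], "q4": ["d9", "d15", "d2"],
    "q5": ["d1", "d5", "d3"],  "q6": ["d20", "d3", "d1"],
    "q7": ["d5", "d2", "d4"],
}
ground_truth = {"d2", "d6", "d15"}
pairs = [("q1", "q2"), ("q2", "q3"), ("q3", "q4"),
         ("q3", "q7"), ("q5", "q7"), ("q6", "q7")]

def pair_metrics(qa, qb):
    """Coverage-based precision, recall and F-measure for one query pair."""
    coverage = set(rankings[qa]) | set(rankings[qb])
    relevant = coverage & ground_truth
    p = len(relevant) / len(coverage)
    r = len(relevant) / len(ground_truth)
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

scores = [pair_metrics(qa, qb) for qa, qb in pairs]
# Averages over the six timestamps, as in Eqs. (28)-(30)
avg_p, avg_r, avg_f = (sum(col) / len(scores) for col in zip(*scores))
```

Computing the metrics from the raw rankings rather than from a hand-built table makes it easy to spot transcription slips, e.g. that d2 (retrieved by q4 and q7) is relevant and must be counted in every pair that contains those queries.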
  • 135. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion METRICS GROUND TRUTH • Evidence sources: From relevance assessments [Morris et al., 2008] From individual search logs [Foley and Smeaton, 2009b, Soulier et al., 2014b] From collaborative search logs [Shah and González-Ibáñez, 2011b, Soulier et al., 2014a] • Importance of considering an agreement level of at least two users (belonging to different groups?) [Shah and González-Ibáñez, 2011b, Soulier et al., 2014a] 93 / 111
  • 136. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion BASELINES • Benefit of collaboration Individual models: BM25, LM, ... Logs of individual search sessions • Collaboration optimization through algorithmic mediation User-driven approaches with collaborative interfaces • Benefit of roles Role-based vs. no-role CIR models [Foley and Smeaton, 2009b, Morris et al., 2008] Dynamic vs. predefined-role CIR models [Pickens et al., 2008, Shah et al., 2010] • ... 94 / 111
  • 137. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models Evaluation 4. Challenges ahead 5. Discussion TOOLS AND DATASETS • Simulation-based evaluation TREC Interactive dataset [Over, 2001] Other available search logs (TREC, CLEF, proprietary, ...) • Log-studies Collaborative dataset [Tamine and Soulier, 2015] • User-studies Open-source Coagmento plugin [Shah and González-Ibáñez, 2011a]: http://www.coagmento.org/collaboraty.php 95 / 111
  • 138. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation Challenges ahead 5. Discussion PLAN 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Theoretical foundations of CIR Empirical evaluation of CIR Open ideas 5. Discussion 96 / 111
  • 139. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation Challenges ahead 5. Discussion THEORETICAL FOUNDATIONS OF CIR • Towards a novel probabilistic framework of relevance for CIR What is a ”good ranking” with regard to the expected synergic effect of collaboration? • Dynamic IR models for CIR How to optimize long-term gains over multiple users, user-user interactions, user-system interactions and multi-search sessions? How to formalize the division of labor through the evolution of users’ information needs over time? • Towards an axiomatic approach of relevance for CIR Are IR heuristics similar to CIR heuristics? Can relevance towards a group be modeled by a set of formally defined constraints on a retrieval function? 97 / 111
  • 140. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation Challenges ahead 5. Discussion EVALUATION OF CIR • Multiple facets of system performance Should we measure performance in terms of gain per unit of time, effort gain per user, effectiveness of outcomes, or all of these as a whole? How do we delineate the performance of the system from the performance and interactions of the users? • Robust experiments for CIR Should the experimental evaluation protocol be task-dependent? Are the simulated work tasks used in IIR reasonable scenarios for evaluating CIR? How to build data collections allowing reproducible experiments and robust statistical tests? 98 / 111
  • 141. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation Challenges ahead 5. Discussion OPEN IDEAS • Multi-level CIR [Htun et al., 2015] Non-uniform information access within the group Application domains: legacy, military, ... • Collaborative group building Task-based group building (information search, synthesis, sense-making, question-answering, ...) Leveraging users’ knowledge, collaboration abilities, and information need perception • Socio-collaborative IR [Morris, 2013] Web search vs. social networking [Oeldorf-Hirsch et al., 2014] Leveraging the crowd to solve a user’s information need 99 / 111
  • 142. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion PLAN 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead 5. Discussion 100 / 111
  • 143. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion DISCUSSION 101 / 111
  • 144. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES I Amer-Yahia, S., Benedikt, M., and Bohannon, P. (2007). Challenges in Searching Online Communities. IEEE Data Engineering Bulletin, 30(2):23–31. Amershi, S. and Morris, M. R. (2008). CoSearch: a system for co-located collaborative web search. In Proceedings of the Conference on Human Factors in Computing Systems, CHI ’08, pages 1647–1656. ACM. Aneiros, M. and Morris, M. R. (2003). Foundation of unconstrained collaborative web browsing with awareness. In Proceedings of the International Conference on Web Intelligence, WI ’02, pages 8–25. ACM/IEEE. Brin, S. and Page, L. (1998). The Anatomy of a Large-scale Hypertextual Web Search Engine. Computer Networks and ISDN Systems, 30(1-7):107–117. Capra, R. (2013). Information Seeking and Sharing in Design Teams. In Proceedings of the ASIS&T Annual Meeting, ASIS&T ’13, pages 239–247. American Society for Information Science. Diriye, A. and Golovchinsky, G. (2012). Querium: A session-based collaborative search system. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’12, pages 583–584. Springer. Dumais, S. T. (2014). Putting searchers into search. In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 1–2. 102 / 111
  • 145. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES II Erickson, T. (2010). A Social Proxy for Collective Search. In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’10. ACM. Evans, B. M. and Chi, E. H. (2010). An elaborated model of social search. Information Processing & Management (IP&M), 46(6):656–678. Foley, C. and Smeaton, A. F. (2009a). Evaluation of Coordination Techniques in Synchronous Collaborative Information Retrieval. CoRR, abs/0908.0. Foley, C. and Smeaton, A. F. (2009b). Synchronous Collaborative Information Retrieval: Techniques and Evaluation. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’09, pages 42–53. Springer. Foley, C. and Smeaton, A. F. (2010). Division of Labour and Sharing of Knowledge for Synchronous Collaborative Information Retrieval. Information Processing & Management (IP&M), 46(6):762–772. Foley, C., Smeaton, A. F., and Jones., G. (2008). Collaborative and Social Information Retrieval and Access: Techniques for Improved User Modeling, chapter Combining. IGI Global. Foster, J. (2006). Collaborative information seeking and retrieval. Annual Review of Information Science & Technology (ARIST), 40(1):329–356. 103 / 111
  • 146. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES III Fuhr, N. (2008). A probability ranking principle for interactive information retrieval. Information Retrieval, 11(3):251–265. Gauch, S., Chaffee, J., and Pretschner, A. (2003). Ontology-based Personalized Search and Browsing. Web Intelligence and Agent Systems (WIAS), 1(3-4):219–234. Golovchinsky, G., Adcock, J., Pickens, J., Qvarfordt, P., and Back, M. (2008). Cerchiamo: a collaborative exploratory search tool. Proceedings of the Demo in Computer Supported Cooperative Work. Golovchinsky, G., Diriye, A., and Pickens, J. (2011). Designing for Collaboration in Information Seeking. Proceedings of the ASIS&T Annual Meeting. Golovchinsky, G., Pickens, J., and Back, M. (2009). A Taxonomy of Collaboration in Online Information Seeking. In Proceedings of the International Workshop on Collaborative Information Retrieval, CIR ’09. González-Ibáñez, R., Haseki, M., and Shah, C. (2013). Let’s search together, but not too close! An analysis of communication and performance in collaborative information seeking. Information Processing & Management (IP&M), 49(5):1165–1179. Gray, B. (1989). Collaborating: finding common ground for multiparty problems. Jossey Bass Business and Management Series. Jossey-Bass. 104 / 111
  • 147. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES IV Han, S., He, D., Yue, Z., and Jiang, J. (2016). Contextual support for collaborative information retrieval. In Proceedings of the International ACM SIGIR Conference on Human Information Interaction and Retrieval. Hansen, P. and Järvelin, K. (2005). Collaborative information retrieval in an information-intensive domain. Information Processing & Management (IP&M), 41(5):1101–1119. Hansen, P., Shah, C., and Klas, C.-P. (2015). Collaborative Information Seeking: Best Practices, New Domains and New Thoughts. Htun, N. N., Halvey, M., and Baillie, L. (2015). Towards quantifying the impact of non-uniform information access in collaborative information retrieval. In Proceedings of the International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 843–846. Imazu, M., Nakayama, S.-i., and Joho, H. (2011). Effect of Explicit Roles on Collaborative Search in Travel Planning Task. In Proceedings of the Asia Information Retrieval Societies Conference, AIRS ’11, pages 205–214. Springer. Jin, X., Sloan, M., and Wang, J. (2013). Interactive Exploratory Search for Multi Page Search Results. In Proceedings of the International Conference on World Wide Web, WWW ’13, pages 655–666. ACM. Joho, H., Hannah, D., and Jose, J. (2009). Revisiting IR Techniques for Collaborative Search Strategies. In Proceedings of the European Conference on Advances in Information Retrieval, ECIR ’09, pages 66–77. Springer. Karunakaran, A., Reddy, M. C., and Spence, P. R. (2013). Toward a model of collaborative information behavior in organizations. Journal of the Association for Information Science and Technology (JASIST), 64(12):2437–2451. 105 / 111
  • 148. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES V Kelly, R. and Payne, S. J. (2013). Division of labour in collaborative information seeking: Current approaches and future directions. In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’13. ACM. Kraft, R., Maghoul, F., and Chang, C. C. (2005). Y!Q: Contextual Search at the Point of Inspiration. In Proceedings of the Conference on Information and Knowledge Management, CIKM ’05, pages 816–823. ACM. Lavrenko, V. and Croft, W. B. (2001). Relevance based language models. In Proceedings of the Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’01, pages 120–127. ACM. Liu, F., Yu, C., and Meng, W. (2004). Personalized Web Search For Improving Retrieval Effectiveness. IEEE Transactions on Knowledge and Data Engineering (TKDE), 16(1):28–40. Morris, M. R. (2008). A survey of collaborative web search practices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’08, pages 1657–1660. ACM. Morris, M. R. (2013). Collaborative Search Revisited. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’13, pages 1181–1192. ACM. Morris, M. R., Lombardo, J., and Wigdor, D. (2010). WeSearch: supporting collaborative search and sensemaking on a tabletop display. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’10, pages 401–410. ACM. 106 / 111
  • 149. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES VI Morris, M. R., Paepcke, A., and Winograd, T. (2006). TeamSearch: Comparing Techniques for Co-Present Collaborative Search of Digital Media. In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop ’06, pages 97–104. IEEE Computer Society. Morris, M. R. and Teevan, J. (2009). Collaborative Web Search: Who, What, Where, When, and Why. Synthesis Lectures on Information Concepts, Retrieval, and Services. Morgan & Claypool Publishers. Morris, M. R., Teevan, J., and Bush, S. (2008). Collaborative Web Search with Personalization: Groupization, Smart Splitting, and Group Hit-highlighting. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’08, pages 481–484. ACM. Oeldorf-Hirsch, A., Hecht, B., Morris, M. R., Teevan, J., and Gergle, D. (2014). To Search or to Ask: The Routing of Information Needs Between Traditional Search Engines and Social Networks. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’14, pages 16–27. ACM. Over, P. (2001). The TREC interactive track: an annotated bibliography. Information Processing & Management (IP&M), 37(3):369–381. Pal, A. and Counts, S. (2011). Identifying topical authorities in microblogs. In Proceedings of the Conference on Web Search and Data Mining, WSDM ’11, pages 45–54. ACM. Pickens, J., Golovchinsky, G., Shah, C., Qvarfordt, P., and Back, M. (2008). Algorithmic Mediation for Collaborative Exploratory Search. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’08, pages 315–322. ACM. 107 / 111
  • 150. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES VII Ponte, J. M. and Croft, W. B. (1998). A language modeling approach to information retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’98, pages 275–281. ACM. Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., and Riedl, J. (1994). GroupLens: An Open Architecture for Collaborative Filtering of Netnews. In Proceedings of the Conference on Computer Supported Cooperative Work, CSCW ’94, pages 175–186. ACM. Robertson, S. E. and Walker, S. (1994). Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’94, pages 232–241. ACM. Robertson, S. E., Walker, S., Jones, S., Hancock-Beaulieu, M., and Gatford, M. (1995). Okapi at TREC-3. In Proceedings of the Text retrieval conference-3 (TREC-3), TREC ’95, pages 109–126. Rocchio, J. J., editor (1971). Relevance Feedback in Information Retrieval. Prentice Hall. Rodriguez Perez, J. A., Whiting, S., and Jose, J. M. (2011). CoFox: A visual collaborative browser. In Proceedings of the International Workshop on Collaborative Information Retrieval, CIKM ’11. ACM. Salton, G. (1971). A comparison between manual and automatic indexing method. Journal of American Documentation, 2(1):61–71. 108 / 111
  • 151. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES VIII Shah, C. (2010). Working in Collaboration - What, Why, and How? In Proceedings of the International Workshop on Collaborative Information Seeking, CSCW ’10. ACM. Shah, C. (2011). A framework for supporting user-centric collaborative information seeking. Number 2 in SIGIR ’11, page 88. ACM. Shah, C. (2012). Collaborative Information Seeking - The Art and Science of Making the Whole Greater than the Sum of All. pages I–XXI, 1–185. Shah, C. (2014). Evaluating collaborative information seeking - synthesis, suggestions, and structure. Journal of Information Science (JIS), 40(4):460–475. Shah, C. and González-Ibáñez, R. (2010). Exploring Information Seeking Processes in Collaborative Search Tasks. In Proceedings of the ASIS&T Annual Meeting, ASIS&T ’10, pages 60:1–60:10. American Society for Information Science. Shah, C. and González-Ibáñez, R. (2011a). Coagmento - A System for Supporting Collaborative Information Seeking. In Demo in Proceedings of Association for Information Science and Technology Annual Meeting, ASIST ’12, pages 9–12. Shah, C. and González-Ibáñez, R. (2011b). Evaluating the Synergic Effect of Collaboration in Information Seeking. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’11, pages 913–922. ACM. 109 / 111
  • 152. 1. Collaboration and Information Retrieval 2. Collaborative IR techniques and models 3. Evaluation 4. Challenges ahead Discussion REFERENCES IX Shah, C. and Marchionini, G. (2010). Awareness in collaborative information seeking. Journal of the Association for Information Science and Technology (JASIST), 61(10):1970–1986. Shah, C., Pickens, J., and Golovchinsky, G. (2010). Role-based results redistribution for collaborative information retrieval. Information Processing & Management (IP&M), 46(6):773–781. Smeaton, A. F., Foley, C., Gurrin, C., Lee, H., and McGivney, S. (2006). Collaborative Searching for Video Using the Fischlar System and a DiamondTouch Table. In Proceedings of the International Workshop on Horizontal Interactive Human-Computer Systems, Tabletop ’06, pages 151–159. IEEE Computer Society. Smyth, B., Balfe, E., Boydell, O., Bradley, K., Briggs, P., Coyle, M., and Freyne, J. (2005). A live-user evaluation of collaborative web search. In Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI ’05, pages 1419–1424. Soulier, L., Shah, C., and Tamine, L. (2014a). User-driven System-mediated Collaborative Information Retrieval. In Proceedings of the Annual International SIGIR Conference on Research and Development in Information Retrieval, SIGIR ’14, pages 485–494. ACM. Soulier, L., Tamine, L., and Bahsoun, W. (2014b). On domain expertise-based roles in collaborative information retrieval. Information Processing & Management (IP&M), 50(5):752–774. Strijbos, J.-W., Martens, R. O. B. L., Jochems, W. M. G., and Broers, N. J. (2004). The Effect of Functional Roles on Group Efficiency. Using Multilevel Modeling and Content Analysis to Investigate Computer-Supported Collaboration in Small Groups. Journal of Information Science (JIS), 35(2):195–229. 110 / 111