1. Search and Hyperlinking Task at MediaEval 2012
Maria Eskevich1 , Gareth J.F. Jones1 , Shu Chen1
Robin Aly2 , Roeland Ordelman2
Martha Larson3
1 Dublin City University, Dublin, Ireland
2 University of Twente, The Netherlands
3 Delft University of Technology, Delft, The Netherlands
6. Brave New Task: Search and Hyperlinking
Previous work:
MediaEval 2011: Rich Speech Retrieval Task
VideoCLEF 2009: Linking Task
What is Brave New about Search and Hyperlinking?
Unified scenario for 2 tasks:
Search output results serve as input for Linking
Use of crowdsourcing for results assessment:
workers judge the relevance of the video segments;
the HIT can be re-run for each new submission to
enrich the overall results
Search: ME10WWW dataset → blip10000
7. Sub-tasks
Search sub-task:
Known-item search:
Textual queries only
Multimodal queries (text + video clues)
Required runs:
1 submission for the 1-best output of each ASR transcript
Linking sub-task:
Ad-hoc search for videos that can potentially be linked to
the anchor videos (ground truth of the Search sub-task)
Required runs:
use of ASR 1-best transcripts as representation
*Additional runs:
use team output of the Search sub-task as video anchors for
the Linking sub-task.
8. Data: blip10000
ASR transcripts:
LIMSI/Vocapia:
Confusion networks
LIUM:
1-best
Lattices
Confusion networks
Video clues:
Shot boundaries (TU Berlin)
Concept-based descriptors based on a list of 589 concepts
(University of Oxford)
Face detection results (INRIA)
14. Evaluation: Search sub-task
Mean Reciprocal Rank (MRR):
RR = 1 / RANK
Mean Generalized Average Precision (mGAP):
GAP = (1 / RANK) · PENALTY
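The two measures can be sketched in a few lines. This is a minimal illustration, not the official evaluation script: the linear time-based penalty used for mGAP below is an assumption (the slide does not show the exact penalty function), as is the `max_distance` cutoff.

```python
def mrr(ranks):
    """Mean Reciprocal Rank: ranks[i] is the rank at which the known
    item was retrieved for query i, or None if it was not found.
    Queries with no hit contribute 0 to the mean."""
    return sum(1.0 / r for r in ranks if r) / len(ranks)

def mgap(results, max_distance=60.0):
    """Mean Generalized Average Precision (sketch).
    Each result is (rank, distance): the rank of the known item and
    the distance in seconds between the returned jump-in point and
    the ideal one. The linear penalty is an illustrative assumption."""
    def penalty(d):
        # full credit at distance 0, no credit beyond max_distance
        return max(0.0, 1.0 - d / max_distance)
    return sum(penalty(d) / r for r, d in results) / len(results)
```

mGAP thus behaves like MRR when every returned segment starts exactly at the ideal point, and degrades as the jump-in point drifts.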
18. Evaluation: Search sub-task
Mean Average Segment Precision (MASP):
Ranking + length of (ir)relevant content
Segment Precision (SP[r]) at rank r
Average Segment Precision:
ASP = (1/n) · Σ_{r=1}^{N} SP[r] · rel(s_r)
rel(s_r) = 1 if relevant content is present,
otherwise rel(s_r) = 0
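The ASP sum above can be sketched as follows. The length-based definition of SP[r] (fraction of retrieved time up to rank r that is relevant) is an assumption inferred from the "length of (ir)relevant content" description on the slide, and `n` is taken to be the number of relevant segments retrieved.

```python
def segment_precision(segments, r):
    """SP[r]: share of playback time in the top-r results that is
    relevant. segments is a rank-ordered list of
    (length_seconds, is_relevant) pairs; this length-based
    definition is an illustrative assumption."""
    top = segments[:r]
    total = sum(length for length, _ in top)
    relevant = sum(length for length, is_rel in top if is_rel)
    return relevant / total if total else 0.0

def average_segment_precision(segments):
    """ASP = (1/n) * sum_{r=1}^{N} SP[r] * rel(s_r), where
    rel(s_r) = 1 if segment s_r contains relevant content."""
    n = sum(1 for _, is_rel in segments if is_rel)
    if n == 0:
        return 0.0
    return sum(segment_precision(segments, r)
               for r, (_, is_rel) in enumerate(segments, 1)
               if is_rel) / n
```

MASP is then the mean of ASP over all queries, so rankings that front-load long relevant segments score higher than those that interleave irrelevant material.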
19. Relevance Evaluation via Crowdsourcing:
Amazon MTurk HIT
25. Relevance Evaluation via Crowdsourcing: Details
Assessment level:
Videos
Provided segments:
Length varies across runs and participants
Workers' explanations of relevance judgements:
Relevant:
"Same" videos
Videos on the same topic
Same program, different topics
Irrelevant:
Same program, different topics
Different topics
Different video styles: interview vs. presentation
26. Participants
Group Search Linking
Charles University in Prague (CUNI) 5 –
Dublin City University (DCU) 4 6
Ghent University (MMLab) 4 4+4
INRIA/IRISA – 5
University of Twente (UTwente) 3 1
31. Thank you for your attention!
Happy birthday, Robin!
Participants' presentations:
Search sub-task:
Maria Eskevich, Dublin City University
Petra Galuscakova, Charles University in Prague
Search and Linking sub-tasks:
Danish Nadeem, University of Twente
Linking sub-task:
Camille Guinaudeau, INRIA/IRISA
Shu Chen, Dublin City University
Search and Linking sub-tasks:
Tom de Nies, Ghent University