SEALS @ WWW2012
1. Semantic Evaluation At Large Scale (SEALS)
Stuart N. Wrigley, Raúl García-Castro, Lyndon Nixon
+ The SEALS Consortium
Speaker: Raúl García-Castro <rgarcia@fi.upm.es>
WWW 2012, Lyon, France, 18th April 2012
2. The SEALS Project (RI-238975)
http://www.seals-project.eu/
Project Coordinator: Asunción Gómez-Pérez <asun@fi.upm.es>
EC contribution: 3,500,000 €
Duration: June 2009 - May 2012
Partners:
• Universidad Politécnica de Madrid, Spain (Coordinator)
• University of Sheffield, UK
• University of Mannheim, Germany
• Forschungszentrum Informatik, Germany
• University of Zurich, Switzerland
• University of Innsbruck, Austria
• STI International, Austria
• Institut National de Recherche en Informatique et en Automatique, France
• Open University, UK
• Oxford University, UK
Raúl García Castro WWW 2012. 18th April 2012 2
3. Semantic Technology Evaluation @ SEALS
• SEALS Platform
• SEALS Evaluation Services
• SEALS Evaluation Campaigns
• SEALS Community
4. The SEALS Entities
• Tools
• Test data
• Evaluations
• Results: raw results and interpretations
Evaluation areas: ontology engineering, storage and reasoning, ontology matching, semantic search, semantic web services.
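The relationships between these entities can be sketched as a minimal data model. This is an illustration only: the class and field names below are ours, not the platform's actual API.

```python
from dataclasses import dataclass

# Illustrative sketch of the SEALS entities: an evaluation combines a
# tool with test data, produces raw results, which are then interpreted.
# All names here are assumptions for illustration, not SEALS identifiers.

@dataclass
class Tool:
    name: str
    version: str

@dataclass
class TestData:
    name: str

@dataclass
class RawResult:
    tool: Tool
    test_data: TestData
    values: dict

@dataclass
class Interpretation:
    raw: RawResult
    summary: str

def run_evaluation(tool: Tool, data: TestData) -> RawResult:
    # A real evaluation would execute the tool on the test data inside
    # the platform; here we only record the pairing.
    return RawResult(tool, data, values={"executed": True})

def interpret(raw: RawResult) -> Interpretation:
    # Interpretations turn raw results into human-readable findings.
    return Interpretation(raw, summary=f"{raw.tool.name} on {raw.test_data.name}")
```

For example, `interpret(run_evaluation(Tool("HermiT", "1.x"), TestData("LUBM")))` yields an interpretation linked back to its raw result.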
5. Structure of the SEALS Entities
Each entity combines data with metadata:
• Data: Java binaries, BPEL processes, shell scripts, ontologies, bundles
• Metadata: supports discovery and exploitation; validated against the SEALS ontologies
SEALS ontologies: http://www.seals-project.eu/ontologies/
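To illustrate the metadata/data split and the validation step, here is a minimal sketch. The required field names and the validation rule are assumptions for illustration; the real vocabulary is defined by the SEALS ontologies at the URL above.

```python
# Minimal sketch of entity validation (assumed field names, not the
# actual SEALS ontology vocabulary).

REQUIRED_METADATA = {"identifier", "version", "description"}

def validate_entity(entity: dict) -> list:
    """Return a list of problems; an empty list means the bundle is valid."""
    problems = []
    metadata = entity.get("metadata", {})
    missing = REQUIRED_METADATA - metadata.keys()
    if missing:
        problems.append(f"missing metadata: {sorted(missing)}")
    if "data" not in entity:
        problems.append("missing data payload (e.g. Java binary, ontology)")
    return problems
```

A bundle with complete metadata and a data payload passes; one lacking, say, `version` and `description` is reported with both problems.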
6. SEALS Logical Architecture
A service-oriented architecture (SOA) involving evaluation organisers, technology providers and technology adopters. Components:
• SEALS Portal
• Runtime Evaluation Service
• SEALS Service Manager
• Software agents
• SEALS Repositories: Test Data Repository Service, Tools Repository Service, Results Repository Service, Evaluation Descriptions Repository Service
7. Challenges
• Tool heterogeneity: differing hardware and software requirements
• Reproducibility: ensuring the execution environment offers the same initial status
Virtualization acts as a technology enabler for both. Virtualization solutions:
• VMware Server 2.0.2
• VMware vSphere 4
• Amazon EC2 (in progress)
In the runtime architecture, a Processing Node (RES Core) runs on a physical computing resource and manages Guest Nodes: virtual machines hosting a RES Worker and the tool under evaluation.
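The reproducibility pattern can be sketched as follows: every run starts from a pristine virtual machine image, so no run observes state left behind by a previous one. The VM operations here are simulated stubs; a real RES Core would drive the VMware or EC2 APIs instead.

```python
# Sketch of per-run VM isolation (stubs only; a real RES Core would call
# VMware Server/vSphere or Amazon EC2 to provision and destroy guests).

class GuestNode:
    """A simulated virtual machine provisioned for one evaluation run."""

    def __init__(self, image: str):
        self.image = image
        self.state = {}          # starts empty: the pristine image

    def deploy_tool(self, tool: str):
        self.state["tool"] = tool

    def run(self, test_data: str) -> dict:
        return {"tool": self.state["tool"], "input": test_data, "status": "ok"}


def evaluate(tool: str, test_data: str, image: str = "seals-base") -> dict:
    node = GuestNode(image)      # fresh VM per run -> same initial status
    try:
        node.deploy_tool(tool)
        return node.run(test_data)
    finally:
        del node                 # a real worker would destroy the VM here
```

Because each call to `evaluate` provisions its own `GuestNode`, two successive runs cannot influence each other.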
8. SEALS Evaluation Services

Ontology engineering
• Evaluations: conformance, interoperability, scalability
• Test data (conformance & interoperability): RDF(S); OWL Lite, DL and Full; OWL 2 Expressive x3; OWL 2 Full
• Test data (scalability): real-world ontologies, LUBM, real-world +, LUBM +

Ontology reasoning
• Evaluations (DL reasoning): classification, class satisfiability, ontology satisfiability, entailment, non-entailment, instance retrieval
• Evaluations (RDF reasoning): conformance
• Test data (DL reasoning): Gardiner test suite, Wang et al. repository, versions of GALEN, ontologies from EU projects, instance retrieval test data
• Test data (RDF reasoning): OWL 2 Full

Ontology matching
• Evaluations: matching accuracy, matching accuracy (multilingual), scalability (ontology size, # CPUs)
• Test data: Benchmark, Anatomy, Conference, MultiFarm, Large Biomed (supported by SEALS)

Semantic search
• Evaluations: search accuracy and efficiency (automated); usability and satisfaction (user-in-the-loop)
• Test data (automated): EvoOnt, MusicBrainz (from QALD-1)
• Test data (user-in-the-loop): Mooney, Mooney +

Semantic web services
• Evaluations: SWS discovery
• Test data: OWLS-TC 4.0, SAWSDL-TC 3.0, WSMO-LITE-TC
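The matching-accuracy evaluations above rest on the standard alignment measures: precision, recall and F-measure of a produced alignment against a reference alignment, both viewed as sets of correspondences. A minimal computation:

```python
# Standard alignment measures: precision, recall and F-measure of a
# produced alignment against a reference alignment (sets of pairs).

def alignment_scores(produced: set, reference: set) -> tuple:
    """Return (precision, recall, f_measure) of produced vs. reference."""
    correct = len(produced & reference)
    precision = correct / len(produced) if produced else 0.0
    recall = correct / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, f_measure
```

For instance, an alignment that finds 2 of 3 reference correspondences and adds 1 wrong one scores precision 2/3 and recall 2/3.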
9. Evaluation Campaign Methodology
• SEALS-independent
• Includes: actors, process, recommendations, alternatives, terms of participation, use rights
• Process phases: initiation, involvement, preparation & execution, dissemination, finalization
Published as "SEALS Methodology for Evaluation Campaigns", Raúl García-Castro and Stuart N. Wrigley, September 2011.
10. 1st Evaluation Campaign
29 tools from 8 countries:

Campaign              Tool                Provider                          Country
Ontology engineering  Jena                HP Labs                           UK
                      Sesame              Aduna                             Netherlands
                      Protégé 4           Stanford University               USA
                      Protégé OWL         Stanford University               USA
                      NeOn Toolkit        NeOn Foundation                   Europe
                      OWL API             University of Manchester          UK
Reasoning             HermiT              University of Oxford              UK
                      jcel                Technische Universität Dresden    Germany
                      FaCT++              University of Manchester          UK
Matching              AROMA               INRIA                             France
                      ASMOV               INFOTECH Soft                     USA
                      Aroma               Nantes University                 France
                      Falcon-AO           Southeast University              China
                      Lily                Southeast University              China
                      RiMOM               Tsinghua University               China
                      MapPSO              FZI                               Germany
                      CODI                University of Mannheim            Germany
                      AgreementMaker      Advances in Computing Lab         USA
                      GeRoMe*             RWTH Aachen                       Germany
                      Ef2Match            Nanyang Technological University  China
Semantic search       K-Search            K-Now Ltd                         UK
                      Ginseng             University of Zurich              Switzerland
                      NLP-Reduce          University of Zurich              Switzerland
                      PowerAqua           KMi, Open University              UK
                      Jena ARQ            HP Labs, Talis                    UK
Semantic web service  4 OWLS-MX variants  DFKI                              Germany

Results published as "The state of semantic technology today – Overview of the First SEALS Evaluation Campaigns", Raúl García-Castro, Mikalai Yatskevich, Cássia Trojahn dos Santos, Stuart N. Wrigley, Liliana Cabral, Lyndon Nixon and Ondřej Šváb-Zamazal, April 2011.
11. 2nd Evaluation Campaign
• June 2011: announcement at ESWC 2011; provide requirements
• September 2011: definition of evaluations and test data; comment on evaluations and test data
• January 2012: launch of the 2nd campaign; run your own evaluations
• May 2012: results of the 2nd campaign
• June 2012: results presented at ESWC 2012
The campaign is running now: join the Evaluation Campaign!
12. Community Services
http://www.seals-project.eu/
• Dissemination services: portal, blog, Twitter, mailing lists
• Collaboration services: public wiki, forums, private wiki, private mailing lists
• Community materials: whitepaper, deliverables, papers, tutorial
13. Evaluation Services
• Execute evaluations and exploit results: use existing evaluations, tools and test data, or plug in your own tool to obtain your own results
• Update them: reuse existing evaluations and tools with your own test data
• Or define your own: your evaluation, your tool, your test data, your results
14. Conclusions
• The SEALS Platform will be maintained for at least 3 years after the
end of the project (until at least June 2015)
• The SEALS Platform facilitates:
- Comparing tools under common settings
- Reproducibility of evaluations
- Reusing evaluation resources, completely or partially
- Or defining new ones
- Managing evaluation resources using platform services
- Computational resources for demanding evaluations
• Don’t start your evaluation from scratch!
15. Contribute to the SEALS Community!
Open source development: http://www.development.seals-project.eu/seals/
User community: http://www.seals-project.eu/join-the-community
Speaker: Raúl García-Castro <rgarcia@fi.upm.es>
WWW 2012, Lyon, France, 18th April 2012