The document describes OOPS!, a web-based tool for validating ontologies by detecting common pitfalls. OOPS! automatically detects 21 of the 29 pitfalls in its catalogue, which are classified along four dimensions: human understanding, modelling issues, logical consistency, and real-world representation. The tool devotes one Java class to each pitfall and detects pitfalls either by checking general characteristics of the ontology or by looking for structural patterns. For each detected pitfall, OOPS! reports a brief description, the affected elements, and suggestions. A user-based evaluation covered research and educational projects and collected feedback on improvements such as showing which pitfalls were not detected and taking reasoning into account. The conclusions note that OOPS! advances ontology evaluation by automating more steps than comparable tools.
Validating ontologies with OOPS! - EKAW2012
1. VALIDATING ONTOLOGIES
WITH OOPS!
(ONTOLOGY PITFALL SCANNER!)
María Poveda, Mari Carmen Suárez-Figueroa,
Asunción Gómez-Pérez
Ontology Engineering Group. Departamento de Inteligencia Artificial.
Facultad de Informática, Universidad Politécnica de Madrid.
Campus de Montegancedo s/n.
28660 Boadilla del Monte. Madrid. Spain
{mpoveda, mcsuarez, asun}@fi.upm.es
Date: 19/10/2012
2. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• User-based Evaluation
• Conclusions and Future Work
3. Introduction
Methodologies that support ontology development have transformed the art of building ontologies into an engineering activity.
However, developers must tackle a wide range of difficulties and handicaps when modelling ontologies.
These difficulties can lead to anomalies or bad practices appearing in ontologies.
Ontology evaluation (checking the technical quality of an ontology against a frame of reference) is therefore a crucial activity in ontology engineering projects.
4. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• User-based Evaluation
• Conclusions and Future Work
5. State of the Art (i)
• Large amount of research on ontology evaluation:
• Generic evaluation frameworks: [Duque-Ramos, et al., 2010; Gangemi, et al., 2006; Gómez-Pérez, 2004; Strasunskas and Tomassen, 2004]
• Final (re)use-based evaluation: [Suárez-Figueroa, 2010]
• Quality models (features, criteria, and metrics): [Flemming, 2011; Burton-Jones, et al., 2005]
• Pattern-based evaluation: [Djedidi and Aufaure, 2010; Presutti, et al., 2008]
• Ontology evaluation is still largely neglected by developers:
• Minimal evaluation within an ontology editor: a syntax check or a reasoning test.
• Developers can feel overwhelmed looking for the information required by the methods.
• Evaluating the quality of an ontology is time-consuming and tedious.
→ Need for tools that automate as many evaluation steps as possible.
Burton-Jones, A., Storey, V.C., Sugumaran, V., Ahluwalia, P. A Semiotic Metrics Suite for Assessing the Quality of Ontologies. Data and Knowledge Engineering (55:1), 2005, pp. 84-102.
Djedidi, R., Aufaure, M.A. Onto-Evoal, an Ontology Evolution Approach Guided by Pattern Modelling and Quality Evaluation. Proceedings of the Sixth International Symposium on Foundations of Information and Knowledge Systems (FoIKS 2010), February 15-19, 2010, Sofia, Bulgaria.
Duque-Ramos, A., López, U., Fernández-Breis, J.T., Stevens, R. A SQUaRE-based Quality Evaluation Framework for Ontologies. OntoQual 2010 - Workshop on Ontology Quality at the 17th International Conference on Knowledge Engineering and Knowledge Management (EKAW 2010). CEUR Workshop Proceedings, ISSN 1613-0073, pp. 13-24. 15 October 2010, Lisbon, Portugal.
Flemming, A. Assessing the Quality of a Linked Data Source. Proposal. http://www2.informatik.hu-berlin.de/~flemming/Proposal.pdf. 2011.
Gangemi, A., Catenacci, C., Ciaramita, M., Lehmann, J. Modelling Ontology Evaluation and Validation. Proceedings of the 3rd European Semantic Web Conference (ESWC 2006), number 4011 in LNCS, Budva. 2006.
Gómez-Pérez, A. Ontology Evaluation. In: Staab, S., Studer, R. (eds.) Handbook on Ontologies. Springer, International Handbooks on Information Systems, pp. 251-274. 2004.
Presutti, V., Gangemi, A., David, S., Aguado, G., Suárez-Figueroa, M.C., Montiel-Ponsoda, E., Poveda, M. NeOn D2.5.1: A Library of Ontology Design Patterns: Reusable Solutions for Collaborative Design of Networked Ontologies. NeOn project (FP6-27595). 2008.
Strasunskas, D., Tomassen, S.L. The Role of Ontology in Enhancing Semantic Searches: the EvOQS Framework and its Initial Validation. Int. J. Knowledge and Learning, Vol. 4, No. 4, pp. 398-414. 2009.
Suárez-Figueroa, M.C. NeOn Methodology for Building Ontology Networks: Specification, Scheduling and Reuse. PhD Thesis. Universidad Politécnica de Madrid, Spain. June 2010.
6. State of the Art (ii)
• ODEClean [Fernández-López and Gómez-Pérez, 2002] supports the OntoClean method. Plug-in for WebODE. Does not support OWL ontologies.
• ODEval [Corcho et al., 2004] evaluates RDF(S) and DAML+OIL concept taxonomies. Does not support OWL ontologies.
• XDTools: plug-in for the NeOn Toolkit. Checklist.
• OntoCheck: plug-in for Protégé. Checklist.
• MoKi [Pammer, 2010] is a wiki-based ontology editor. Checklist.
• Eyeball: command-line tool, also available as a Java API; experimental GUI provided. Checklist.
→ All of these are dependent on an ontology development environment or require an installation process.
Fernández-López, M., Gómez-Pérez, A. The Integration of OntoClean in WebODE. In: Proceedings of the Evaluation of Ontology-based Tools Workshop (EON2002) at the 13th International Conference on Knowledge Engineering and Knowledge Management (EKAW02). Spain, 2002.
Corcho, O., Gómez-Pérez, A., González-Cabero, R., Suárez-Figueroa, M.C. ODEval: a Tool for Evaluating RDF(S), DAML+OIL, and OWL Concept Taxonomies. In: 1st IFIP Conference on Artificial Intelligence Applications and Innovations (AIAI 2004), August 22-27, 2004, Toulouse, France.
Pammer, V. Automatic Support for Ontology Evaluation: Review of Entailed Statements and Assertional Effects for OWL Ontologies. PhD Thesis. Engineering Sciences, Graz University of Technology.
7. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• User-based Evaluation
• Conclusions and Future Work
8. Pitfall Catalogue so far
• 29 pitfalls included in the catalogue
• 5 new pitfalls (P25-P29) added since the first version [Poveda-Villalón, et al., 2010]
• Maintenance and evolution:
• Discovered while manually analyzing ontologies
• Proposed by users (http://www.oeg-upm.net/oops/submissions.jsp)
Human understanding:
P1. Creating polysemous elements
P2. Creating synonyms as classes
P7. Merging different concepts in the same class
P8. Missing annotations
P11. Missing domain or range in properties
P12. Missing equivalent properties
P13. Missing inverse relationships
P19. Swapping intersection and union
P20. Misusing ontology annotations
P22. Using different naming criteria in the ontology

Logical consistency:
P5. Defining wrong inverse relationships
P6. Including cycles in the hierarchy
P14. Misusing ''owl:allValuesFrom''
P15. Misusing “not some” and “some not”
P18. Specifying too much the domain or the range
P19. Swapping intersection and union
P27. Defining wrong equivalent relationships
P28. Defining wrong symmetric relationships
P29. Defining wrong transitive relationships

Real world representation:
P9. Missing basic information
P10. Missing disjointness

Modelling issues:
P2. Creating synonyms as classes
P3. Creating the relationship “is” instead of using ''rdfs:subClassOf'', ''rdf:type'' or ''owl:sameAs''
P4. Creating unconnected ontology elements
P5. Defining wrong inverse relationships
P6. Including cycles in the hierarchy
P7. Merging different concepts in the same class
P10. Missing disjointness
P17. Specializing too much a hierarchy
P11. Missing domain or range in properties
P12. Missing equivalent properties
P13. Missing inverse relationships
P14. Misusing ''owl:allValuesFrom''
P15. Misusing “not some” and “some not”
P18. Specifying too much the domain or the range
P19. Swapping intersection and union
P21. Using a miscellaneous class
P23. Using incorrectly ontology elements
P24. Using recursive definition
P25. Defining a relationship inverse to itself
P26. Defining inverse relationships for a symmetric one
P27. Defining wrong equivalent relationships
P28. Defining wrong symmetric relationships
P29. Defining wrong transitive relationships
Poveda-Villalón, M., Suárez-Figueroa, M.C., Gómez-Pérez, A. A Double Classification of Common Pitfalls in Ontologies. Proceedings of the Workshop on Ontology Quality (OntoQual 2010) at EKAW 2010.
9. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• How it is internally organized
• How it works
• User-based Evaluation
• Conclusions and Future Work
10. OOPS! - How it is internally organized (i)
• Web-based tool
• Available at http://www.oeg-upm.net/oops
• Ontology development environment independent
• No installation process required
[Architecture diagram: the OOPS! Web User Interface (HTML, CSS, jQuery, JSP on Java EE) passes the ontology to an RDF Parser built on the Jena API; the Scanner module (Pitfall Scanner for P1, P2, …, P29, plus Warning Scanner and Suggestion Scanner), backed by the Pitfall Catalogue, produces the evaluation results.]
Jena API: http://jena.sourceforge.net/ · jQuery: http://jquery.com/
Java EE: http://www.oracle.com/technetwork/java/javaee/overview/index.html · JSP: http://www.oracle.com/technetwork/java/javaee/jsp/index.html
HTML: http://www.w3.org/html/wg/ · CSS: http://www.w3.org/Style/CSS/
11. OOPS! - How it is internally organized (ii)
• 21 pitfalls implemented out of the 29 included in the catalogue
• 1 Java class per pitfall implementation
• Detection automated in 2 ways (see the sketch below):
• Checking general characteristics of the ontology (P3, P7, P12, P20, P21, and P22). E.g., P22: using more than one naming convention.
• Looking for patterns (P2, P4, P5, P6, P8, P10, P11, P13, P19, P24, P25, P26, P27, P28, and P29). E.g., P5: defining wrong inverse relationships.
[Diagrams: detection patterns for P5 (wrong inverse relationships between propertyS and propertyT), P25 (a property declared owl:inverseOf itself), P6 (cycles of rdfs:subClassOf in the hierarchy), and P28 (an owl:SymmetricProperty declared with distinct rdfs:domain and rdfs:range).]
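To make the two detection modes concrete, here is a minimal sketch using the Jena API, the RDF library named on the architecture slide. It is an illustration, not the actual OOPS! source: the class and method names are invented here, the naming-convention heuristic is deliberately crude, and the ontology URL is the example mentioned later in this deck.

```java
import java.util.HashSet;
import java.util.Set;

import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.RDFNode;
import org.apache.jena.rdf.model.Statement;
import org.apache.jena.rdf.model.StmtIterator;
import org.apache.jena.vocabulary.OWL;

// Minimal sketch of the two detection modes; names are invented for
// illustration and do not come from the OOPS! code base.
public class PitfallScannerSketch {

    // Mode 1 (general characteristic), in the spirit of P22: does the
    // ontology mix several naming conventions among its class names?
    static void checkNamingConventions(OntModel model) {
        Set<String> conventions = new HashSet<>();
        model.listNamedClasses().forEachRemaining(c -> {
            String name = c.getLocalName();
            if (name.contains("_")) conventions.add("underscore_delimited");
            else if (name.contains("-")) conventions.add("hyphen-delimited");
            else conventions.add("CamelCase");
        });
        if (conventions.size() > 1) {
            System.out.println("P22 candidate: mixed naming conventions " + conventions);
        }
    }

    // Mode 2 (pattern search), in the spirit of P25: a property declared
    // as the inverse of itself (propertyS owl:inverseOf propertyS).
    static void findSelfInverse(Model model) {
        StmtIterator it = model.listStatements(null, OWL.inverseOf, (RDFNode) null);
        while (it.hasNext()) {
            Statement st = it.next();
            if (st.getSubject().equals(st.getObject())) {
                System.out.println("P25 candidate: " + st.getSubject());
            }
        }
    }

    public static void main(String[] args) {
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);
        // Example ontology mentioned later in this deck.
        model.read("http://data.semanticweb.org/ns/swc/swc_2009-05-09.rdf");
        checkNamingConventions(model);
        findSelfInverse(model);
    }
}
```

A real implementation would, as the slide notes, devote one Java class to each pitfall; the two methods above merely show the shape of a "general characteristic" check versus a "pattern" check.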
12. OOPS! - How it is internally organized (iii)
• Warning Scanner: identifies cases where a class or property is not defined as such by means of the corresponding OWL primitive. These cases are spotted during the execution of the "Pitfall Scanner" module, and only the classes and relationships involved in the detection of the other pitfalls are flagged up.
• Suggestion Scanner: looks for properties with equal domain and range axioms and proposes them as potential symmetric or transitive properties (see the sketch below).
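A hedged Jena sketch of the Suggestion Scanner behaviour described above (the class name and the exact filtering logic are assumptions, not the OOPS! source):

```java
import org.apache.jena.ontology.ObjectProperty;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.ontology.OntResource;
import org.apache.jena.rdf.model.ModelFactory;

// Sketch: propose properties whose declared domain and range coincide
// as candidates for owl:SymmetricProperty or owl:TransitiveProperty.
public class SuggestionScannerSketch {
    public static void main(String[] args) {
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);
        model.read("http://data.semanticweb.org/ns/swc/swc_2009-05-09.rdf");

        for (ObjectProperty p : model.listObjectProperties().toList()) {
            OntResource domain = p.getDomain();  // looks at a single declared
            OntResource range = p.getRange();    // domain/range pair only
            if (domain != null && domain.equals(range)
                    && !p.isSymmetricProperty() && !p.isTransitiveProperty()) {
                System.out.println("Suggestion: " + p.getURI()
                        + " could be symmetric or transitive (domain = range = "
                        + domain + ")");
            }
        }
    }
}
```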
13. OOPS! - How it works (i)
[Screenshot of the OOPS! home page: ontology input area, suggestions & feedback form, brief description of the tool, documentation, and related papers.]
14. OOPS! - How it works (ii)
[Screenshot of the evaluation results: for each detected pitfall, its name, frequency, description, and the affected ontology elements.]
Example generated using the ontology http://data.semanticweb.org/ns/swc/swc_2009-05-09.rdf
15. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• User-based Evaluation
• Conclusions and Future Work
16. User-based Evaluation (i)
Use case and settings:
• Research projects: mIO! and Buscamedia projects; 7 developers; Context ontology and Multimedia ontology; September 2011.
• Educational projects: ATHENS course at UPM; 12 master students; introductory lesson; 2 already developed ontologies; November 2011.
Feedback provided (research projects):
a) OOPS! usefulness.
b) The advantages of being IDE independent.
c) The coverage of pitfalls detected in comparison with other tools.
d) To provide a graphical user interface.
e) To provide more information about what the pitfalls consist of.
f) To consider the imported ontologies during the analysis.
g) To allow the evaluation of subsets of pitfalls.
h) To provide some recommendations to repair the pitfalls found.
Feedback provided (educational projects):
a) To include guidelines about how to solve pitfalls.
b) To associate colours to the output indicating how critical the pitfalls are.
c) To provide a way to see the lines of the file the considered error originates from.
17. User-based Evaluation (ii)
Use case: users not related to any project or experiment.
Settings:
• Feedback from the on-line form
• Feedback received by e-mail
Feedback provided:
a) "Easy to use", "no installation required", "quick results", and "good to detect low-level problems in ontologies".
b) To show which pitfalls do not appear in the ontology.
c) To include reasoning processes so that OOPS! does not complain when a property inherits domain/range from its superproperty.
d) To allow the evaluation of subsets of pitfalls.
e) To discard pitfalls involving terms properly marked as DEPRECATED.
f) To take into account different namespaces.
g) To look for URI misuse, such as using the same URI as two types of elements (sketched below).
h) To look for non-standard characters in natural language annotations.
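Suggestion (g) can be illustrated with a small Jena sketch. This check was only proposed by users at the time, so the code below is a hypothetical illustration of "the same URI used as two types of elements" (here: both class and property), not a confirmed OOPS! feature:

```java
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.ResIterator;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.OWL;
import org.apache.jena.vocabulary.RDF;

// Hypothetical check: flag URIs typed both as owl:Class and as an
// owl:ObjectProperty or owl:DatatypeProperty.
public class UriMisuseCheck {
    public static void main(String[] args) {
        Model m = ModelFactory.createDefaultModel();
        m.read("http://data.semanticweb.org/ns/swc/swc_2009-05-09.rdf");

        ResIterator classes = m.listSubjectsWithProperty(RDF.type, OWL.Class);
        while (classes.hasNext()) {
            Resource r = classes.next();
            if (m.contains(r, RDF.type, OWL.ObjectProperty)
                    || m.contains(r, RDF.type, OWL.DatatypeProperty)) {
                System.out.println("URI used as both class and property: " + r);
            }
        }
    }
}
```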
18. Table of contents
• Introduction
• State of the Art
• Pitfall Catalogue so far
• OOPS!
• User-based Evaluation
• Conclusions and Future Work
19. Conclusions and Future Work
CONCLUSIONS
• OOPS! (OntOlogy Pitfall Scanner!) represents a step forward within ontology evaluation tools, as it:
o enlarges the list of errors detected by the most recent available tools.
o is fully independent of any ontology development environment.
o works with the main web browsers (Firefox, Chrome, Safari and IE).
o does not involve an installation process.
• It is freely available to users on the Web: http://www.oeg-upm.net/oops
• Everyone can test it, provide feedback, and suggest new pitfalls to be included in the catalogue and implemented in the tool.
• It is easy to use and broadly used: 585 executions on 207 different ontologies from 14 November 2011 to 10 October 2012.
20. Conclusions and Future Work
FUTURE WORK
Actions and expected benefits:
• Extend the pitfall catalogue with new errors and pitfalls suggested by users → increases the list of pitfalls detected by OOPS!; maintains and evolves the catalogue.
• Group pitfalls by categories and allow users to choose (groups of) pitfalls to check → makes OOPS! more flexible and adaptable to specific user needs.
• Include guidelines about how to solve each pitfall → facilitates the task of repairing the ontology after the diagnosis.
• Associate priority levels to each pitfall according to the different types of consequences they can convey → helps prioritize actions to be taken during the repair task.
• Make REST services available → allows other developments to use and integrate the pitfall scanner functionalities within their applications.
• Allow pitfall definitions in a formal language, according to users' particular quality criteria → allows users to execute OOPS! in a customized way.
• Include detection of pitfalls related to Linked Data characteristics → broadens the ontology evaluation dimensions addressed by OOPS! and increases the list of pitfalls detected.
• Improve pitfall detection algorithms → makes OOPS! results more accurate.
21. Questions
Any questions?
Or just drop by the demo session today from 5pm on and try it out:
http://www.oeg-upm.net/oops
Thanks!