1. ESWC 2010 Tutorial on
Evaluation of Semantic Web Technologies
Evaluating the
conformance and interoperability
of semantic technologies
Raúl García Castro
<rgarcia@fi.upm.es>
Ontology Engineering Group
Laboratorio de Inteligencia Artificial
Facultad de Informática
Universidad Politécnica de Madrid
30th May 2010
1 © Raúl García-Castro
2. Table of contents
CONFORMANCE INTEROPERABILITY TEST DATA RUNNING CONCLUSIONS
• Evaluating conformance
• Evaluating interoperability
• Test data
– RDF(S) Import Test Suite
– OWL Lite Import Test Suite
– OWL DL Import Test Suite
• Running the evaluations
– IBSE
– SEALS Platform
• Conclusions
Evaluating conformance and interoperability. May 30th 2010 2 © Raúl García-Castro
3. Conformance in the Semantic Web
• Conformance is the ability of semantic technologies to adhere to
existing specifications
– In terms of ontology representation languages (RDF(S), OWL, etc.)
• Different types of conformance, regarding the ontology language:
– Knowledge model
– Serialization
– Semantics
• Conformance is a primary requirement for semantic technologies:
– Tool validation
– Feature analysis
4. Conformance evaluation
• Goal: to evaluate the conformance of semantic
technologies with regard to ontology representation
languages
Tool X imports the ontology O1 (obtaining O1') and exports it again (obtaining O1''):
Step 1: Import + Export
O1 = O1'' + α - α'
• Applicability:
– Only requirement: that the tool is able to import and
export ontologies in the ontology language
5. Metrics
• Execution reports whether the test executed correctly:
– OK. No execution problem
– FAIL. Some execution problem
– Platform Error (P.E.). Platform exception
• Information added or lost in terms of triples.
Oi = Oi’ + α - α’
• Conformance informs whether the ontology has
been processed correctly, with no addition or loss of
information (Oi = Oi' ?):
– SAME if Execution is OK and Information added and
Information lost are void
– DIFFERENT if Execution is OK but Information added
or Information lost are not void
– NO if Execution is FAIL or P.E.
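These metrics can be sketched in Python with ontologies modelled as sets of triples; the function below is an illustrative sketch of the verdict computation, not the actual evaluation code.

```python
# Sketch of the conformance metrics, assuming an ontology is a set of
# (subject, predicate, object) triples. Function names are illustrative.
def conformance(original, reexported, execution="OK"):
    """Verdict for one import+export round trip: O = O'' + alpha - alpha'."""
    if execution != "OK":           # FAIL or Platform Error (P.E.)
        return "NO", set(), set()
    added = reexported - original   # alpha': triples the tool introduced
    lost = original - reexported    # alpha: triples the tool dropped
    verdict = "SAME" if not added and not lost else "DIFFERENT"
    return verdict, added, lost

# A tool that drops an rdfs:label triple during the round trip:
O = {("c1", "rdf:type", "rdfs:Class"), ("c1", "rdfs:label", "Class 1")}
O2 = {("c1", "rdf:type", "rdfs:Class")}
verdict, added, lost = conformance(O, O2)
print(verdict)  # DIFFERENT
```

The SAME/DIFFERENT/NO verdicts fall out directly from set differences, which is why the only requirement on the tool is that it can import and export the language.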
7. Interoperability in the Semantic Web
• Interoperability is the ability of Semantic Web technologies to
interchange ontologies and use them
– At the information level; not at the system level
– In terms of knowledge reuse; not information integration
• In the real world it is not feasible to use a single system or a single
formalism
• Different behaviours in interchanges between different formalisms:
[Figure: a class hierarchy with a disjointness axiom is interchanged; within the same formalism the interchange is LOSSLESS, while across formalisms the disjointness axiom may be rewritten as an ad-hoc myDisjoint property (LOSSLESS) or dropped altogether (LOSS).]
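Both behaviours can be reproduced with triples; the export/import functions and the ex:myDisjoint helper property below are hypothetical, mirroring the figure.

```python
# Sketch of the LOSSLESS vs. LOSS behaviours: a target formalism without
# disjointness can either drop owl:disjointWith (LOSS) or rewrite it as an
# ad-hoc helper property and restore it on the way back (LOSSLESS).
# "ex:myDisjoint" is a hypothetical property, not a standard term.
DISJOINT, MY_DISJOINT = "owl:disjointWith", "ex:myDisjoint"

def export_dropping(triples):          # lossy: discard unsupported axioms
    return {t for t in triples if t[1] != DISJOINT}

def export_rewriting(triples):         # lossless: encode as helper property
    return {(s, MY_DISJOINT if p == DISJOINT else p, o) for s, p, o in triples}

def import_rewriting(triples):         # restore the original axiom
    return {(s, DISJOINT if p == MY_DISJOINT else p, o) for s, p, o in triples}

O = {("A", DISJOINT, "B"), ("C", "rdfs:subClassOf", "A")}
print(import_rewriting(export_rewriting(O)) == O)  # True  (LOSSLESS)
print(export_dropping(O) == O)                     # False (LOSS)
```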
8. Interoperability evaluation
• Goal: to evaluate the interoperability of semantic technologies in terms
of the ability that such technologies have to interchange ontologies and
use them
Tool X imports O1 and exports it as O1''; Tool Y then imports O1'' and exports it as O1'''':
Step 1: Import + Export (Tool X): O1 = O1'' + α - α'
Step 2: Import + Export (Tool Y): O1'' = O1'''' + β - β'
Interchange: O1 = O1'''' + α - α' + β - β'
• Applicability:
– Only requirement: that the tool is able to import and export ontologies
in the ontology language
9. Metrics
• Execution reports whether the test executed correctly:
– OK. No execution problem
– FAIL. Some execution problem
– Platform Error (P.E.). Platform exception
– Not Executed (N.E.). Second step not executed
• Information added or lost in terms of triples.
Oi = Oi’ + α - α’
• Interchange informs whether the ontology has been
interchanged correctly, with no addition or loss of
information (Oi = Oi' ?):
– SAME if Execution is OK and Information added and
Information lost are void
– DIFFERENT if Execution is OK but Information added
or Information lost are not void
– NO if Execution is FAIL, N.E., or P.E.
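A two-step interchange composes the per-step differences; a minimal sketch under the same set-of-triples assumption (illustrative, not the actual evaluation code):

```python
# Sketch of the interchange verdict: Tool X round-trips O1 into O1'',
# Tool Y round-trips O1'' into O1'''', and the net differences against
# the original decide the verdict (O1 = O1'''' + alpha - alpha' + beta - beta').
def interchange(o1, o1pp, o1pppp, exec1="OK", exec2="OK"):
    if exec1 != "OK" or exec2 != "OK":   # FAIL, P.E., or second step N.E.
        return "NO"
    added = o1pppp - o1                  # net triples introduced across both steps
    lost = o1 - o1pppp                   # net triples lost across both steps
    return "SAME" if not added and not lost else "DIFFERENT"

O1 = {("A", "owl:disjointWith", "B")}
print(interchange(O1, O1, O1))      # SAME
print(interchange(O1, O1, set()))   # DIFFERENT: Tool Y lost the axiom
```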
11. General principles
• Only simple ontologies
• Only correct ontologies
• Use the RDF/XML syntax
• Small number of tests
13. RDF(S) Import Test Suite
Goal: To define tests for “all” the possible relations between the components of
the RDF(S) knowledge model.
[Figure: the RDF(S) knowledge model, with rdfs:Resource at the top and rdf:List, rdfs:Container, rdf:Statement, rdf:Property, rdfs:Class and rdfs:Literal below it, connected by properties such as rdf:type, rdfs:subClassOf, rdfs:subPropertyOf, rdfs:domain, rdfs:range, rdf:first, rdf:rest, rdf:subject, rdf:predicate, rdf:object, rdf:value, rdfs:label, rdfs:comment, rdfs:seeAlso, rdfs:isDefinedBy and rdfs:member; rdf:Bag, rdf:Seq, rdf:Alt and rdfs:ContainerMembershipProperty relate to rdfs:Container, and rdfs:Datatype and rdf:XMLLiteral relate to rdfs:Literal.]
14. What is a relation?
component1 --relation1--> component2
Instances of component1 can be related to instances
of component2 using the property relation1.
Example: rdfs:label
• rdfs:domain: rdfs:Resource
• rdfs:range: rdfs:Literal
so rdfs:Resource --rdfs:label--> rdfs:Literal
But also, through the subclass hierarchy:
• rdfs:Class --rdfs:label--> rdfs:Literal (rdfs:Class is a subclass of rdfs:Resource)
• rdfs:Resource --rdfs:label--> rdf:XMLLiteral (rdf:XMLLiteral is a subclass of rdfs:Literal)
15. Design principles
Define tests from the RDF(S) knowledge model
Only consider components commonly used in tools:
• Classes
• Instances
• Properties
• Literals
• Class hierarchies
• Property hierarchies
Beware of cardinalities!
rdf:Property * --rdfs:domain--> * rdfs:Class
Cover cardinalities of 0, 1 and 2.
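Covering the cardinalities could be sketched as a tiny generator (illustrative only, not the actual test-suite code; identifiers such as p1 and c1 are hypothetical):

```python
# Sketch: generate one test ontology per rdfs:domain cardinality (0, 1, 2),
# each declaring a single property with that many domain classes.
def domain_cardinality_tests(cardinalities=(0, 1, 2)):
    tests = {}
    for n in cardinalities:
        triples = {("p1", "rdf:type", "rdf:Property")}
        triples |= {(f"c{i}", "rdf:type", "rdfs:Class") for i in range(1, n + 1)}
        triples |= {("p1", "rdfs:domain", f"c{i}") for i in range(1, n + 1)}
        tests[f"property-domain-{n}"] = triples
    return tests

tests = domain_cardinality_tests()
print(sorted(tests))  # ['property-domain-0', 'property-domain-1', 'property-domain-2']
```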
16. Types of tests
a) Import single components
b) Import all the possible combinations of two components with a property
c) Import combinations of more than two components that usually appear together in RDF(S) graphs
d) RDF(S) graphs with the different variants of the RDF/XML syntax:
<rdf:Description rdf:about="#class1">
<rdf:type rdf:resource="&rdfs;Class" />
</rdf:Description>
=
<rdfs:Class rdf:about="#class1">
</rdfs:Class>
[Figure: test types a)-c) illustrated as highlighted fragments of the RDF(S) knowledge model diagram.]
17. RDF(S) Import Test Suite
RDF(S) component combinations and RDF/XML syntax variants:
<rdf:Description rdf:about="#class1">
<rdf:type rdf:resource="&rdfs;Class"/>
</rdf:Description>
=
<rdfs:Class rdf:about="#class1">
</rdfs:Class>
Group                           No.  Components
Class                            2   rdfs:Class
Metaclass                        5   rdfs:Class, rdf:type
Subclass                         5   rdfs:Class, rdfs:subClassOf
Class and property               6   rdfs:Class, rdf:Property, rdfs:Literal
Property                         2   rdf:Property
Subproperty                      5   rdf:Property, rdfs:subPropertyOf
Property with domain and range  24   rdfs:Class, rdf:Property, rdfs:Literal, rdfs:domain, rdfs:range
Instance                         4   rdfs:Class, rdf:type
Instance and property           14   rdfs:Class, rdf:type, rdf:Property, rdfs:Literal
Syntax and abbreviation         15   rdfs:Class, rdf:type, rdf:Property, rdfs:Literal
TOTAL                           82
19. Design principles
Define tests from the OWL (Lite) Abstract Syntax
Example:
Class descriptions
axiom ::= 'Class(' classID ['Deprecated'] modality { annotation } { super } ')'
modality ::= 'complete' | 'partial'
super ::= classID | restriction
axiom ::= 'EquivalentClasses(' classID classID { classID } ')'
axiom ::= 'Datatype(' datatypeID ['Deprecated'] { annotation } ')'
Cover all the productions and symbols
super ::= classID | restriction →
• super ::= classID
• super ::= restriction
Limit the number of tests
We cover cardinalities of 0, 1 and 2.
axiom ::= 'EquivalentClasses(' classID classID { classID } ')' →
axiom ::= 'EquivalentClasses(' classID classID ')'
20. OWL Lite Import Test Suite
Component combinations and RDF/XML syntax variants:
<rdf:Description rdf:about="#class1">
<rdf:type rdf:resource="&rdfs;Class"/>
</rdf:Description>
=
<rdfs:Class rdf:about="#class1">
</rdfs:Class>
Group                                                         No.
Class hierarchies                                              17
Class equivalences                                             12
Classes defined with set operators                              2
Property hierarchies                                            4
Properties with domain and range                               10
Relations between properties                                    3
Global cardinality constraints and logical property
characteristics                                                 5
Single individuals                                              3
Named individuals and properties                                5
Anonymous individuals and properties                            3
Individual identity                                             3
Syntax and abbreviation                                        15
TOTAL                                                          82
[Figure: example ontologies for subclass of class/restriction, value constraints, set operators, and cardinality restrictions on object and datatype properties.]
22. Design principles
Define tests from the OWL (DL) Abstract Syntax
Cover all the productions and symbols
Limit the number of tests
Increase exhaustiveness
To maximize the coverage of the knowledge model.
Put the user in the loop
Defining tests should be:
• Simple
• Extensible
• Parameterized
23. Keyword-based test generator
[Figure: the keyword-based test generator. Macro and test definitions (a CSV file) are read by an interpreter; a keyword executor then produces the test suite metadata and the test ontologies (ontology01.owl, ontology02.owl, ontology03.owl, …).]
24. Parameterize generation
[Figure: a test generator produces the macro and test definitions (CSV file) that the interpreter consumes.]
• Examples:
– “…for every type of class description”
– “…using all the built-in annotation properties”
– “…starting from a depth of 500 and to a depth of 5.000”
– …
25. Extracting keywords
Example:
Class descriptions
description ::= classID
| restriction
| 'unionOf(' { description } ')'
| 'intersectionOf(' { description } ')'
| 'complementOf(' description ')'
| 'oneOf(' { individualID } ')'
restriction ::= 'restriction(' datavaluedPropertyID dataRestrictionComponent { dataRestrictionComponent } ')'
| 'restriction(' individualvaluedPropertyID individualRestrictionComponent { individualRestrictionComponent } ')'
Keyword Parameter1 Parameter2 Parameter3 Parameter4
createNamedClass resultId className
createClassEnumerated resultId origClassId individualId1 individualId2
createClassAllValuesFromRestriction resultId origClassId propertyId classId
createClassSomeValuesFromRestriction resultId origClassId propertyId classId
createClassHasValueRestriction resultId origClassId propertyId value
createClassCardinalityRestriction resultId origClassId propertyId cardinality
createClassMinCardinalityRestriction resultId origClassId propertyId cardinality
createClassMaxCardinalityRestriction resultId origClassId propertyId cardinality
createClassIntersection resultId origClassId classId1 classId2
createClassUnion resultId origClassId classId1 classId2
createClassComplement resultId origClassId classId
26. Defining macros
MACRO:
createObjectPropertyDomainAndRange descriptionId propertyId classId1 classId2
Definition:
createObjectProperty descriptionId propertyId
addPropertyDomain descriptionId classId1
addPropertyRange descriptionId classId2
Benefits:
• Easily build new tests
• Define complex patterns
createNamedClassWithLabel descriptionId classId
createNamedClass descriptionId classId
addAnnotationLiteral descriptionId rdfs:label classId@en
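Such macro expansion can be sketched as follows; the macro and keyword names come from the slide, while the interpreter itself is a hypothetical sketch:

```python
# Sketch of keyword macro expansion: a macro maps positional parameters
# onto a body of primitive keyword calls, expanded recursively.
MACROS = {
    "createObjectPropertyDomainAndRange": (
        ("descriptionId", "propertyId", "classId1", "classId2"),
        [("createObjectProperty", "descriptionId", "propertyId"),
         ("addPropertyDomain", "descriptionId", "classId1"),
         ("addPropertyRange", "descriptionId", "classId2")],
    ),
}

def expand(call):
    name, *args = call
    if name not in MACROS:
        return [tuple(call)]             # already a primitive keyword
    params, body = MACROS[name]
    binding = dict(zip(params, args))
    expanded = []
    for line in body:
        expanded.extend(expand([binding.get(tok, tok) for tok in line]))
    return expanded

print(expand(("createObjectPropertyDomainAndRange", "d1", "p1", "c1", "c2")))
```

Because expansion is recursive, a macro body may itself call other macros, which is what makes complex patterns cheap to define.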
27. OWL DL Import Test Suite
• Three types of tests:
– Simple combinations of components
• Class | property | individual descriptions
• Class | property | individual axioms
• Property characteristics
• Data ranges
• Annotation properties
– Combinations of components that usually appear together
• Properties with domain and range
• Individuals and properties
– Restrictions in the use of components
• Cardinalities greater than 1
• Class descriptions as object
• Class descriptions as subject
In total: 561 test cases!
29. Test and result representation
• Test Suite ontology
– Conformance Test Suite ontology
– Interoperability Test Suite ontology
• Test Output ontology
– Conformance Test Output ontology
– Interoperability Test Output ontology
[Figure: in the Test Suite ontology, a TestSuite has an author (hasAuthor) and a version (hasVersion); a Test belongs to a TestSuite (belongsTo), has an identifier (hasId, an xsd:string) and uses an OntologyDocument, which is located at a URL (isLocatedAtURL) and has an ontology name, namespace and representation language. In the Test Output ontology, ConformanceTestSuite and ConformanceTest are subclasses of TestSuite and Test; a ConformanceTestSuite covers an ontology language (coversOntologyLanguage, an xsd:string), while a ConformanceTest belongs to a ConformanceTestSuite, covers an ontology language feature and uses an OntologyDocument.]
30. The IBSE tool
[Figure: the IBSE workflow. Step 1 (Describe): the tests of the OWL Lite Import Test Suite are described as ontologies (benchmarkOntology). Step 2 (Execute): the tests are executed over the tools, and the execution results are stored as ontologies (resultOntology). Step 3 (Generate): reports (HTML, SVG) are generated from the results.]
• Automatically executes tests between all the tools
• Allows configuring different execution parameters
• Uses ontologies to represent tests and results
• Depends on external ontology comparers (Jena + Pellet and RDF-utils)
http://knowledgeweb.semanticweb.org/benchmarking_interoperability/ibse/
31. The SEALS Platform
[Figure: the SEALS Platform architecture. The SEALS Portal issues entity management requests to the SEALS Service Manager and evaluation requests to the Runtime Evaluation Service; both rely on the SEALS Repositories: the Test Data Repository Service, the Tools Repository Service, the Results Repository Service and the Evaluation Descriptions Repository Service.]
33. Are there any results?
• RDF(S) Interoperability Benchmarking (IRIBA)
http://knowledgeweb.semanticweb.org/iriba/
• OWL (Lite) Interoperability Benchmarking
http://knowledgeweb.semanticweb.org/benchmarking_interoperability/owl/2008-07-06_Results/
[Figure: result pages; participating tools include SemTalk (Frames/OWL).]
• Results:
– Per tool
– Global
– Evolution over time
• Summary:
http://fusion.cs.uni-jena.de/professur/research/activities/docs/ESWC09%20Tutorial%20-%2002%20Interoperability.pdf
34. Conclusions
Methods for evaluating conformance and interoperability
• Common to different semantic technologies
• Problem-focused instead of tool-focused
• Provide data about other characteristics (e.g., robustness)
Resources for evaluating conformance and interoperability
• All the test suites, software and results are publicly available
• Independent of:
– The interchange language
– The input ontologies
Keyword-based test definition + Automatic test execution
• Affordable for evaluators (end users, developers, etc.)
• Test definition at large scale
• Effective tests are needed, which requires effort
• Result analysis is still hard
35. SEALS Yardsticks for Ontology Management
http://www.seals-project.eu/seals-evaluation-campaigns/ontology-engineering-tools
3 evaluation scenarios:
• OET Conformance 2010
• OET Interoperability 2010
• OET Scalability 2010
5 evaluation datasets:
• RDF(S) Import Test Suite
• OWL Lite Import Test Suite
• OWL DL Import Test Suite
• OWL Full Import Test Suite
• Scalability Test Suite
Join the evaluation campaign!
Timeline:
• May 2010: Registration opens
• May-June 2010: Evaluation materials and documentation are provided to participants
• July 2010: Participants upload their tools
• August 2010: Evaluation scenarios are executed
• September 2010: Evaluation results are analysed
• November 2010: Evaluation results are discussed in a workshop