The Speaking Objects Vision: Argumentation for Coordinating Distributed Systems (part 2)
December 12, 2017
Stefano Mariani, fixed-term research assistant (post-doc), Università degli Studi di Modena e Reggio Emilia
The Speaking Objects Vision
Argumentation for Coordinating Distributed Systems
Part II: On Argumentation
Prof. Stefano Mariani
(stefano.mariani@unimore.it)
Outline
• What is argumentation
• Definition
• Frameworks
• Dialogue types (recap)
• Argumentation use cases
• Analysis
• Synthesis
• Applications
• Argumentation for Speaking Objects
• Benefits
• Challenges
What is an argument?
• Many admissible definitions depending on the goal and
the application domain
• Intuitively
• it is a relationship between propositions
• it is invoked by linguistic action (in either monologue or dialogue)
• it results from an appropriate speaker/writer intention
• it is an instance of one of many different types
What does it mean to argue?
• Arguing is something people do (rather than something
that propositions do)
• Arguing typically involves two or more people (not
monologue, but dialogue or polylogue)
• Arguing typically involves two or more points of view (not
monolectical, but dialectical)
• Arguing concerns advancing arguments
• The process of arguing is governed by rules
Supporting and attacking
• One consequence of the dialectical nature of arguing is
that arguments attack one another
• Attacks can be of two types [Pollock, 1987]:
• rebutting = claim negates the rebutted claim
• undercutting = claim negates a claim supporting the undercut
claim
Argumentation
• Abstract argumentation
• encapsulates arguments as nodes in a network
• connects them through a relationship of attack
• defines a “calculus of opposition” for determining what is
acceptable
• Structured argumentation
• opens up the encapsulation
• arguments as claims supported by premises
• support relation between arguments
• Semantics formalize different intuitions about how to solve conflicts
and how to pick acceptable arguments
• Nonmonotonic: new argument may throw out what was accepted
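As a tiny illustration of this nonmonotonicity (a sketch of our own, not from the slides), consider a grounded-style reading where every unattacked argument is accepted: a newly arriving attacker retracts a previously accepted conclusion.

```python
# Tiny nonmonotonicity demo: under a grounded-style reading, every
# unattacked argument is accepted; adding a new attacking argument
# retracts a previously accepted conclusion. Names are invented.

def accepted(arguments, attacks):
    """Accept exactly the arguments that nobody attacks."""
    return {a for a in arguments if not any(t == a for (_, t) in attacks)}

args, atts = {"a"}, set()
print(accepted(args, atts))          # {'a'}: a is accepted

args |= {"b"}
atts |= {("b", "a")}                 # a new argument b attacks a
print(accepted(args, atts))          # {'b'}: a is no longer accepted
```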
A few definitions (1)
• An argumentation framework (AF) is a pair (A, R)
where
• A is a set of arguments
• R ⊆ A × A is a relation representing “attacks” (or, “defeats”)
A few definitions (2)
• Given an AF F = (A,R)
• a set S ⊆ A is conflict-free in F if, for all a, b ∈ S, (a, b) ∉ R
A few definitions (3)
• Given an AF F = (A,R)
• a set S ⊆ A is admissible in F if S is conflict-free in F and
each a ∈ S is defended by S in F
• a ∈ A is defended by S in F if, for each b ∈ A with (b, a) ∈ R,
there exists c ∈ S such that (c, b) ∈ R
• Semantics: no undefended attacked arguments
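These definitions translate almost literally into code. Below is a minimal sketch (function and argument names are our own, for illustration only):

```python
# A direct transcription of the definitions above for an AF F = (A, R):
# conflict-freeness and admissibility. Argument names are illustrative.

def conflict_free(S, R):
    """S is conflict-free iff no attack holds between two members of S."""
    return not any((a, b) in R for a in S for b in S)

def defends(S, a, R):
    """S defends a iff every attacker of a is counter-attacked from S."""
    return all(any((c, b) in R for c in S) for (b, t) in R if t == a)

def admissible(S, R):
    """S is admissible iff it is conflict-free and defends its members."""
    return conflict_free(S, R) and all(defends(S, a, R) for a in S)

# b attacks a, c attacks b: {a, c} is admissible (c defends a against b)
R = {("b", "a"), ("c", "b")}
print(admissible({"a", "c"}, R))  # True
print(admissible({"a"}, R))       # False: a's attacker b is unanswered
```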
A few definitions (4)
• Given an AF F = (A,R)
• a set S ⊆ A is complete in F, if S is admissible in F
• each a ∈ A defended by S in F is contained in S
• Semantics: all defended arguments
On semantics
• Semantics map AFs to a collection of sets of arguments
• grounded: (1) accept unattacked arguments, (2) delete arguments attacked
by accepted ones, (3) go to (1); stop when a fixpoint is reached
• preferred: ⊆-maximal admissible sets (conflict-free sets attacking all their attackers)
• stable: conflict-free sets attacking all arguments outside the set
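The grounded procedure sketched above can be written down directly as a fixpoint computation; the following is an illustrative sketch, with invented argument names:

```python
# The grounded semantics as the fixpoint procedure from the slide:
# (1) accept unattacked arguments, (2) delete accepted arguments and
# everything they attack, (3) repeat until nothing changes.

def grounded_extension(arguments, attacks):
    """arguments: set of names; attacks: set of (attacker, target) pairs."""
    args, rel = set(arguments), set(attacks)
    accepted = set()
    while True:
        # (1) accept arguments with no remaining attacker
        unattacked = {a for a in args if not any(t == a for (_, t) in rel)}
        if not unattacked:
            break  # fixpoint reached
        accepted |= unattacked
        # (2) delete accepted arguments and everything they attack
        defeated = {t for (s, t) in rel if s in unattacked}
        args -= unattacked | defeated
        rel = {(s, t) for (s, t) in rel if s in args and t in args}
    return accepted

# b attacks a, c attacks b: c is unattacked, b falls, then a is accepted
print(sorted(grounded_extension({"a", "b", "c"}, {("b", "a"), ("c", "b")})))
# ['a', 'c']
```

Note that on a mutual attack (a attacks b, b attacks a) the procedure stops immediately and returns the empty set, as the grounded semantics prescribes.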
A look at the literature
• Either abstract or structured, many different variations (regarding semantics,
mostly)
• logic-based = arguments as logic formulae, acceptability as satisfiability
• value-based = values attached to relations
• assumption-based = arguments become rules, the argumentation graph becomes a
tree
• Among the most notable ones
• [Toulmin, 1969] (seminal structured framework)
• [Simari and Loui, 1992]
• [Dung, 1995] (seminal abstract framework)
• [Walton and Krabbe, 1995] (simple yet effective)
• [Krause et al., 1995]
• [Prakken and Sartor, 1996]
• [Kraus et al., 1998]
• [Amgoud and Cayrol, 2002]
• [Gordon et al., 2007]
• [Prakken, 2010]
Dialogue types (1)
• Walton and Krabbe [Walton and Krabbe, 1995] propose a
classification of conversations, following a pragmatic approach
based on
• the situation motivating the conversation session
• the goal of the initiating participant
• the goal of the dialogue itself
• 7 dialogue types:
• Information seeking [do you know this?]
• Inquiry [does somebody know that?]
• Discovery [what do we know?]
• Persuasion [you should do…]
• Negotiation [we should do…]
• Deliberation [let’s do…]
• Eristic [f#@k you!]
Dialogue types (2)
• Information Seeking
• initiators need some information and believe they know who
among the other participants can provide it
• Inquiry
• initiators and responders collaborate in collectively answering
questions, so as to acquire information that no single participant
is believed to know individually
• Discovery
• similar to inquiry, except that in inquiry the information sought is
agreed upon at the beginning of the dialogue, whereas in
discovery it emerges during the course of the dialogue itself
Dialogue types (3)
• Persuasion
• the initiator wants to convince a participant to believe an opinion,
adopt a standpoint, or accept a fact (s)he does not currently hold
• the responder has a conflicting belief, and may not share the
initiator’s goal of persuading the other participant
• Negotiation
• participants bargain over an issue, e.g. the division of some scarce
resource, each pursuing a goal which is not necessarily aligned with
the goal of the collective, nor of the dialogue session
• a negotiation session converges when a resolution of the issue
acceptable to all participants is found, and fails otherwise
• Deliberation
• a decision should be collectively taken about what course of action
to adopt in a given situation
Assisted living scenario
• Imagine a smart home equipped with an array of speaking
and hearing objects (sensors and actuators) meant to
support inhabitants in their everyday life activities, to ensure
safety and promote healthy ageing
• cameras, smoke detectors, the A/C system, automatic windows and
curtains, self-driving wheelchairs, the inhabitants’ smartphones,
smart TVs, RFIDs, and wearable medical devices
• Consider that one of the inhabitants, Walter, suffers from
Chronic Obstructive Pulmonary Disease (COPD), thus he
may undergo unexpected respiratory/asthma crises of
different degrees of severity, up to requiring immediate
hospitalization
• Assume now that one of those crises happens while the
elderly person is temporarily home alone
Step 1: situation assessment
• To disambiguate the situation, an information seeking
dialogue is required, which involves, for instance, the
smartphone S, the wristband W monitoring heartbeat,
and the indoor surveillance cameras Ci, with i ∈ 1, . . . , N
(denoted as a collective):
• S: Everybody listen, I’m detecting a clear coughing sound,
should we worry?
• W: Probably yes, since the heartbeat is accelerating steadily!
• Ci: Wait a minute, Walter was just eating peanuts a moment ago,
could it be food choking instead?
• Cj: I can confirm, he is now punching his own breast and trying to
puke.
Step 2: action planning
• Whatever the cause of the coughing episode just detected,
an appropriate action has to be taken to help Walter
• This naturally leads towards a deliberation dialogue, in which the
smartphone, the cameras (also external ones, collectively
denoted as Cek, k ∈ 1, . . . , M), Walter’s smart watch Wa, and
his self-driving wheel-chair Ch may participate:
• S: Status report, please.
• Cj: He is not puking yet.
• S: I’ll text Linda [Walter’s daughter] to get here.
• Wa: You may try, but she’s at the dentist now thus she may not come in
time!
• Ch: Go for it, I’ll take Walter out in the meanwhile.
• Cek: Agreed, there’s a neighbour mowing the lawn just across the road.
• S: I’ll also call the ambulance.
From dialogues to argument graphs (1)
[Figure: argument graph derived from the dialogue. Observations (“Walter is coughing”, “heartbeat accelerating”) together with background knowledge (“Walter suffers of COPD”, “cough is symptom of COPD”, “fast heartbeat is symptom of COPD”) support the claim “Walter has respiratory crisis”; observations (“Walter just ate peanuts”, “Walter is punching his breast”, “Walter is trying to puke”) together with background knowledge (“peanuts frequent cause of food choking”, “punching breast may avoid food choking”, “puking may avoid food choking”) support the claim “Walter is food choking”.]
• Despite the apparent simplicity of the dialogue, which has just 4
utterances, it conveys 14 argument components and 11 relations (all
of them support relations)
From dialogues to argument graphs (2)
• Claims are supported by two distinct kinds of premises:
• observations of the current situation, such as the fact that Walter
is coughing, or that his heartbeat is accelerating
• background knowledge, such as the evidence that cough and
fast heartbeat are symptoms of COPD
• Weights or preference values could be attached to
relations so as to unambiguously and automatically
decide who is winning the debate
• for instance, in the next picture we naively decorate all support
relations with the same weight (equal to 1), and…
• …attach a weight to attacks, simply computed as the sum of the
weights of the supporting premises
From dialogues to argument graphs (3)
[Figure: weighted argument graph. Each support relation weighs 1; the attacks a and b between the claims “Walter is food choking” (supported by “Walter just ate peanuts”, “Walter is punching his breast”, “Walter is trying to puke”: total weight 3) and “Walter has respiratory crisis” (supported by “Walter is coughing”, “heartbeat accelerating”: total weight 2) are weighted by the sum of the supporting premises; a further claim “Walter is healthy”, supported by “Walter has been running”, launches attack c with weight 1.]
• Weights (or preference values, or any other quantitative label)
may be adjusted dynamically, as time flows or novel
information is acquired
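The naive weighting scheme described above (each support relation weighs 1, an attack is weighted by the sum of the premises supporting the attacker) can be sketched as follows; claim and premise names follow the assisted-living example, and the helper function is our own:

```python
# Naive weighting scheme from the slides: each support relation weighs 1,
# and an attack is weighted by the sum of the weights of the premises
# supporting the attacking claim.

supports = {  # claim -> supporting premises, each support weighing 1
    "food choking": ["ate peanuts", "punching breast", "trying to puke"],
    "respiratory crisis": ["coughing", "heartbeat accelerating"],
}

def attack_weight(claim):
    """Weight of the attack launched by `claim` against a rival claim."""
    return sum(1 for _ in supports[claim])

winner = max(supports, key=attack_weight)
print(winner, attack_weight(winner))  # food choking 3
```

Adjusting the weights dynamically, as the slide suggests, would amount to recomputing `attack_weight` whenever the supports dictionary changes.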
Argument mining: definition
• Analyzing discourse on the pragmatic level, and applying a
certain argumentation theory to model and automatically
analyze the data at hand
• Discourse = goes beyond the single sentence
• Pragmatics = considers the function of the language
• Argumentation theory = provides the theoretical foundation
• among which, the argumentation framework
Argument mining: component types
• Conclusion and premise [Palau and Moens, 2009]
• Major claim, claim and premise [Stab and Gurevych, 2014]
• Claim, premise, backing, rebuttal and refutation [Habernal
and Gurevych, 2016]
• Different types of claims: support, oppose, propose
[Kwon et al., 2007]
• Different types of evidence: study, expert and anecdotal
[Rinott et al., 2015]
Natural Language Generation (NLG) (1)
• Consider the unmanned aerial vehicle (UAV) domain
• a human operator monitors a UAV, which functions at various
different levels of autonomy
• the UAV operates under implicit consent, and the operators intervene
only when they disagree with the UAV’s decisions
• operators require a detailed understanding of why a plan should
be executed, and of what its effects will be, in order for a mission to
be successfully executed
• it is also important to allow operators to explore what alternatives
were considered, and to be able to update possibly obsolete
information
Many potential application domains
• Education – enhance persuasive essay writing
• Legal – enhance preparation for court
• Finance – swiftly understand pros/cons of an examined
investment
• Healthcare – understand pros/cons of a suggested
prognosis
• Journalism – quickly dive into a new topic, by reviewing
pros and cons
• Politics – enhance interaction of citizens with
government representatives
Example: beyond search engines (1)
• Problem statement: given a search topic, swiftly generate
quality content enabling users to take more informed decisions
• Approach: model the content-generation process
through argumentation and provide the user with the most
relevant content at the right time
• For instance: “shall smoking be banned?”
Interpretability
• A network of speaking and hearing objects acts as a sort of gray-box
• low-level data is processed by machine learning, computer
vision, and signal processing tools, often in the form of black-box
models (as happens with deep networks)
• the high-level granules of information that are generated are
amenable to interpretation by both humans and other agents
• Argumentation, in fact, encourages interacting agents to disclose
information and the strategy behind their “negotiation moves”, so
as to persuade the other parties and reach a deal
• for instance, auction-based negotiation is just about exchanging the subjective
values attributed by agents to the object of the negotiation
• with argumentation, smart devices will be capable of explaining their
behaviour and motivating their choices and decisions [Caminada et al.,
2014]
Dealing with uncertainty
• Knowledge about the state of the world usually comes in two
forms
• background knowledge = e.g. smoking causes cancer
• contextual knowledge = e.g. Walter is smoking
• Background knowledge could be continuously enriched and
refined thanks to the learnt experience reported by speaking and
hearing objects
• weights attached to edges could be obtained with a dynamic
function that changes over time and that is tuned according
to the knowledge of previous situations
• The opportunity here is to advance beyond the state of the art in
situation recognition, especially in those scenarios where
uncertainty of information is the norm [Krause et al., 1995]—such
as the IoT, indeed
Adaptiveness
• Coordination protocols and rules may arise by emergence
from the argumentative interactions of the participants
• This enables the system as a whole as well as individual
agents to adapt to the ever-changing goals, constraints,
and unexpected contingencies arising during operation, by
seamlessly adjusting the coordination protocols upon need
[Rahwan et al., 2003], [Parsons and McBurney, 2003]
• Furthermore, this adaptation capability is embedded in the
coordination process enacted by agents through
argumentation
• thus it does not need additional mechanisms to monitor the
system behaviour, plan adaptation actions, and then execute them
Robustness
• Argumentation may enhance the robustness of the system
w.r.t. errors and malicious behaviour, for instance
• regarding the former, conflicts in sensors’ perceptions can be
resolved by arguing towards consensus
• in the latter case, false or harmful claims made by attacker
agents may be effectively challenged by “good” ones, which
fact-check suspected claims so as to detect anomalies and act
accordingly
Trust
• When taking humans-in-the-loop into account, argumentation
has the great advantage of promoting the trustability of a system
• if users can get justifications about the decision making going on
“behind the scenes”, about why a system is pursuing a given course
of actions, and about how it came up with a precise conclusion about the
state of the world, they are likely to increase their confidence in
relying on the autonomous capabilities of the system
• We hereby remark that striving to provide trustability and
interpretability is necessary to guarantee the accountability of
systems and decision making
• an increasingly hot topic in many fields of AI – from big data [Nature,
2016] to algorithms in general [Medsker, 2017]
• as witnessed by the recent transparency initiative endorsed by
many organisations worldwide (http://www.transparency-initiative.org/)
Consensus on Argumentation Framework
• Argumentation requires that all participants in a debate to be resolved
automatically either agree on the argumentation framework
deciding who wins the dispute, or abide by a well-respected authority
acting as dispute arbiter; for instance
• in the traffic management scenario (from part 1), implementing the latter solution
is straightforward, since the traffic light(s) may naturally play the role of the
arbiter deciding over arguing cars
• whereas in the assisted living one it appears less obvious who could be the
arbiter, thus a more decentralised solution, where sensors, actuators, and
personal devices share consensus upon the argumentation rules, may be
preferred
• both solutions have drawbacks, such as being vulnerable to a single point of
failure in the former case, or requiring additional communication to dynamically
reach consensus in the latter
• Either the acknowledged arbiter, or the agreed-upon argumentation
rules themselves, should also act as the “guardian” authority enforcing
mission-critical safety constraints, as well as ethical compliance
Feasibility & performance
• Recent hardware advancements make it possible to embed a fair amount of
computational power in sensor and actuator devices, as well as in single
chips [D. Lane et al., 2017]
• In parallel, machine learning techniques are also steadily improving, so
that incorporating “intelligence” within smart objects is already a feasible
reality [Bourzac, 2017], [Radu et al., 2016]
• Then, a certain minimum performance is also required to ensure that
short dialogues are decided nearly in real time, thus argument
exchange as well as internal argumentation-based reasoning must be
extremely fast
• a challenging task, which partly requires low-latency message-oriented
middleware (MOM) and partly extremely efficient reasoning engines
• Akka framework = actor-based distributed programming (http://akka.io/)
• Apache ActiveMQ Real Time (http://activemq.apache.org/activemq-real-
time.html) = fork of Apache’s flagship IoT-focussed MOM, ActiveMQ
• Mini-ME [Scioscia et al., 2014] = extremely lightweight reasoning engine
conceived especially for Semantic Web of Things (SWoT) applications
Complexity & tractability
• State explosion problem induced by expanding utterances and
dialogues into arguments and relations, which are possibly further
expanded into claims and premises
• as shown in our “assisted living” example, a simple dialogue made up of 4
utterances can easily be decomposed into 20+ arguments and
relations!
• However, not all participants in the debate need to argue at the
same level of abstraction
• the smartphone is not at all interested in the premises supporting claims, but
only in knowing which argument between “Walter is food choking” and
“Walter has respiratory crisis” is stronger
• similarly, the cameras may be totally unaware of which claim their arguments
are supporting, and still actively cooperate in the dispute resolution process
• There is no need to represent and reason about the whole
argumentation graph in participant devices, as they are usually only
concerned with a portion of it
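The partial-view idea above can be sketched as a simple graph restriction: a device only needs the arguments from which its argument of interest is reachable through the attack relation (helper and argument names are our own, for illustration):

```python
# Sketch of the "partial view" idea: a device only tracks the portion of
# the argumentation graph that can influence the argument it cares
# about, i.e. the arguments from which it is reachable via attacks.

def relevant_subgraph(target, attacks):
    """Return the arguments that can reach `target` through attacks,
    and the attack edges among them."""
    relevant, frontier = {target}, [target]
    while frontier:
        node = frontier.pop()
        for (s, t) in attacks:
            if t == node and s not in relevant:
                relevant.add(s)
                frontier.append(s)
    return relevant, {(s, t) for (s, t) in attacks if t in relevant}

attacks = {("choking", "crisis"), ("healthy", "choking"), ("x", "y")}
nodes, edges = relevant_subgraph("crisis", attacks)
print(sorted(nodes))  # ['choking', 'crisis', 'healthy']  (x, y pruned)
```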
Automatic argumentation and natural language generation
• When interacting with human users, the automation of the
process that produces argumentation graphs, starting from
natural language dialogues, is clearly a highly challenging
task
• Many technologies have to be combined and exploited to
reach such an ambitious goal
• argumentation mining techniques [Lippi and Torroni, 2016] have
to be employed for the automatic detection of argument
components within dialogue utterances, as well as for the synthesis
and generation of new arguments
• machine learning tools are already widely employed in
argumentation mining, both for the detection of argument
components in text, and for the recognition of attack/support
relationships between arguments, or parts thereof [Lippi and Torroni,
2016]
Knowledge representation
• Commonsense reasoning has to be distilled within
speaking and hearing objects, so as to exploit
background knowledge stored in the form of ontologies,
or logic facts and rules
• Ontology alignment [Keskes and Rahmoun, 2016] is an
issue which demands effective and efficient solutions,
such as the one based on similarity-based clustering
proposed in [Garruzzo and Rosaci, 2007]
• there, the problem of heterogeneous agents adopting different
ontologies in an open MAS is dealt with by enabling agents to
learn new terms by letting other agents explain them, and by
negotiating about their intended semantics
References
• [Pollock, 1987] Pollock, J. L. (1987). Defeasible reasoning. Cognitive Science, 11(4):481–518.
• [Walton and Krabbe, 1995] Walton, D. and Krabbe, E. (1995). Commitment in Dialogue: Basic Concepts of Interpersonal Reasoning. State University of New York Press, Albany, NY.
• [Palau and Moens, 2009] Palau, R. M. and Moens, M.-F. (2009). Argumentation mining: The detection, classification and structure of arguments in
text. In Proceedings of the 12th International Conference on Artificial Intelligence and Law, ICAIL ’09, pages 98–107, New York, NY, USA. ACM.
• [Stab and Gurevych, 2014] Stab, C. and Gurevych, I. (2014). Identifying argumentative discourse structures in persuasive essays. In EMNLP, pages
46–56.
• [Habernal and Gurevych, 2016] Habernal, I. and Gurevych, I. (2016). Argumentation mining in user-generated web discourse. CoRR, abs/1601.02403.
• [Kwon et al., 2007] Kwon, N., Zhou, L., Hovy, E., and Shulman, S. W. (2007). Identifying and classifying subjective claims. In Proceedings of the 8th
Annual International Conference on Digital Government Research: Bridging Disciplines & Domains, dg.o ’07, pages 76–81. Digital Government
Society of North America.
• [Rinott et al., 2015] Rinott, R., Dankin, L., Perez, C. A., Khapra, M. M., Aharoni, E., and Slonim, N. (2015). Show me your evidence - an automatic method for context dependent evidence detection. In Màrquez, L., Callison-Burch, C., Su, J., Pighin, D., and Marton, Y., editors, Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, EMNLP 2015, Lisbon, Portugal, September 17-21, 2015, pages 440–450. The Association for Computational Linguistics.
• [Caminada et al., 2014] Caminada, M. W., Kutlak, R., Oren, N., and Vasconcelos, W. W. (2014). Scrutable plan enactment via argumentation and
natural language generation. In Proceedings of the 2014 International Conference on Autonomous Agents and Multi-agent Systems, AAMAS ’14,
pages 1625–1626, Richland, SC. International Foundation for Autonomous Agents and Multiagent Systems.
• [Krause et al., 1995] Krause, P., Ambler, S., Elvang-Goransson, M., and Fox, J. (1995). A logic of argumentation for reasoning under uncertainty. Computational Intelligence, 11(1):113–131.
• [Rahwan et al., 2003] Rahwan, I., Ramchurn, S. D., Jennings, N. R., Mcburney, P., Parsons, S., and Sonenberg, L. (2003). Argumentation-based
negotiation. Knowl. Eng. Rev., 18(4):343–375.
• [Parsons and McBurney, 2003] Parsons, S. and McBurney, P. (2003). Argumentation-based dialogues for agent co-ordination. Group Decision and
Negotiation, 12(5):415–439.
• [Nature, 2016] (2016). More accountability for big-data algorithms. Nature, 537(7621):449.
• [Medsker, 2017] Medsker, L. (2017). Algorithmic Transparency and Accountability – AI Matters.
• [D. Lane et al., 2017] D. Lane, N., Bhattacharya, S., Mathur, A., Georgiev, P., Forlivesi, C., and Kawsar, F. (2017). Squeezing deep learning into
mobile and embedded devices. IEEE Pervasive Computing, 16(3):82–88.
• [Bourzac, 2017] Bourzac, K. (2017). Millimeter-scale computers: Now with deep-learning neural networks on board.
References
• [Radu et al., 2016] Radu, V., Lane, N. D., Bhattacharya, S., Mascolo, C., Marina, M. K., and Kawsar, F. (2016). Towards multimodal deep
learning for activity recognition on mobile devices. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and
Ubiquitous Computing: Adjunct.
• [Scioscia et al., 2014] Scioscia, F., Ruta, M., Loseto, G., Gramegna, F., Ieva, S., Pinto, A., and Di Sciascio, E. (2014). A mobile
matchmaker for the ubiquitous semantic web. Int. J. Semant. Web Inf. Syst., 10(4):77–100.
• [Lippi and Torroni, 2016] Lippi, M. and Torroni, P. (2016). Argumentation mining: State of the art and emerging trends. ACM Trans. Internet
Technol., 16(2):10:1–10:25.
• [Keskes and Rahmoun, 2016] Keskes, N. and Rahmoun, A. (2016). Meaning negotiation based on merged individual context ontology
and part of semantic web ontology. International Journal on Human Machine Interaction, 2(1):1–12.
• [Garruzzo and Rosaci, 2007] Garruzzo, S. and Rosaci, D. (2007). Ontology enrichment in multi agent systems through semantic negotiation. In Proceedings of the International Conference on Cooperative Information Systems (COOPIS 2007), LNCS 4803, Springer Verlag, Vilamoura, Portugal.
• [Dung, 1995] Dung, P. M. (1995). On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming and n-person games. Artificial Intelligence, 77(2):321–357.
• [Prakken, 2010] Prakken, H. (2010). An abstract framework for argumentation with structured arguments. Argument & Computation, 1(2):93–124.
• [Amgoud and Cayrol, 2002] Amgoud, L. and Cayrol, C. (2002). A reasoning model based on the production of acceptable arguments.
Annals of Mathematics and Artificial Intelligence, 34(1):197–215.
• [Kraus et al., 1998] Kraus, S., Sycara, K., and Evenchik, A. (1998). Reaching agreements through argumentation: a logical model and implementation. Artificial Intelligence, 104(1):1–69.
• [Gordon et al., 2007] Gordon, T. F., Prakken, H., and Walton, D. (2007). The Carneades model of argument and burden of proof. Artificial Intelligence, 171(10):875–896. Argumentation in Artificial Intelligence.
• [Toulmin, 1969] Toulmin, S. (1969). The Uses of Argument. Cambridge, England: Cambridge University Press
• [Simari and Loui, 1992] Simari, G. R. and Loui, R. P. (1992). A mathematical treatment of defeasible reasoning and its implementation.
Artificial Intelligence, 53(2):125 – 157.
• [Prakken and Sartor, 1996] Prakken, H. and Sartor, G. (1996). A dialectical model of assessing conflicting arguments in legal reasoning.
Artificial Intelligence and Law, 4(3):331–368.
Copyright notice
• Material for these slides not originally produced has been
dutifully cited
• e.g., the “Source” caption below pictures
• Where not explicitly cited, material not originally produced
has been taken from lectures and tutorials publicly
available, for which consent to re-use has been granted
by the authors (private emails available upon request)
• here http://acl2016tutorial.arg.tech/index.php/tutorial-materials/
• and here http://www.informatik.uni-leipzig.de/~brewka/KRlecture/
• All the credit for this goes to the authors
• All the blame for the mistakes is on me :)