HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
This paper introduces a multi-agent dynamic epistemic logic for abstract argumentation. Its main motivation is to build a general framework for modelling the dynamics of a debate, which entails reasoning about goals, beliefs, as well as policies of communication and information update by the participants. After locating our proposal and introducing the relevant tools from abstract argumentation, we proceed to build a three-tiered logical approach. At the first level, we use the language of propositional logic to encode states of a multi-agent debate. This language allows us to specify which arguments any agent is aware of, as well as their subjective justification status. We then extend our language and semantics to that of epistemic logic, in order to model individuals’ beliefs about the state of the debate, which includes uncertainty about the information available to others. As a third step, we introduce a framework of dynamic epistemic logic and its semantics, which is essentially based on so-called event models with factual change. We provide completeness results for a number of systems and show how existing formalisms for argumentation dynamics and unquantified uncertainty can be reduced to their semantics. The resulting framework allows reasoning about subtle epistemic and argumentative updates, such as the effects of different levels of trust in a source, and, more generally, about the epistemic dimensions of strategic communication.
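As a concrete anchor for the argumentation background these abstracts rely on, here is a minimal sketch (not the paper's formalism) of Dung-style acceptability: the grounded extension computed as the least fixed point of the characteristic function. Representing arguments as strings and attacks as pairs is an illustrative assumption.

```python
def grounded_extension(arguments, attacks):
    """Least fixed point of Dung's characteristic function.

    `arguments` is a set of argument names; `attacks` a set of
    (attacker, target) pairs.
    """
    attackers = {a: {b for (b, c) in attacks if c == a} for a in arguments}

    def defended(candidate, in_set):
        # Every attacker of `candidate` must be counter-attacked by `in_set`.
        return all(
            any((d, b) in attacks for d in in_set)
            for b in attackers[candidate]
        )

    extension = set()
    while True:
        new = {a for a in arguments if defended(a, extension)}
        if new == extension:
            return extension
        extension = new
```

For example, with `a` attacking `b` and `b` attacking `c`, the grounded extension is `{a, c}`: `a` is unattacked, and `c` is defended against `b` by `a`.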
Dynamics and uncertainty are essential features of real-life argumentation, and many recent studies have focused on integrating both aspects into Dung’s well-known abstract argumentation frameworks (AFs). This paper proposes a combination of the two lines of research through a well-behaved logical tool: dynamic logic of propositional assignments (DL-PA). Our results show that the main reasoning tasks of virtually all existing formalisms qualitatively representing uncertainty about AFs are encodable in DL-PA. Moreover, the same tool is also useful for capturing dynamic structures, such as control AFs, as well as for developing more refined forms of argumentative communication under uncertainty.
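The propositional flavour of encodings like the DL-PA one can be illustrated informally: treat "argument x is in the extension" as a Boolean variable and check the extension conditions over all valuations. The brute-force sketch below enumerates stable extensions this way; it illustrates the encoding idea only, not DL-PA itself.

```python
from itertools import chain, combinations

def stable_extensions(arguments, attacks):
    """Enumerate stable extensions by checking every 'in/out' valuation.

    A set S is stable iff it is conflict-free and attacks every
    argument outside S.
    """
    args = sorted(arguments)
    all_subsets = chain.from_iterable(
        combinations(args, r) for r in range(len(args) + 1)
    )
    result = []
    for subset in all_subsets:
        s = set(subset)
        conflict_free = not any((a, b) in attacks for a in s for b in s)
        attacks_rest = all(
            any((a, b) in attacks for a in s) for b in set(args) - s
        )
        if conflict_free and attacks_rest:
            result.append(frozenset(s))
    return result
```

On a mutual-attack cycle between `a` and `b`, this yields the two stable extensions `{a}` and `{b}`, matching the usual multiplicity of stable semantics.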
This paper studies the relation between persuasive argumentation and the speaker's epistemic attitude. Dung-style abstract argumentation and dynamic epistemic logic provide the necessary tools to characterize the notion of persuasion. Within abstract argumentation, persuasive argumentation has been previously studied from a game-theoretic perspective. These approaches are blind to the fact that, in real-life situations, the epistemic attitude of the speaker determines which set of arguments will be disclosed by her in the context of a persuasive dialogue. This work is a first step toward filling this gap. For this purpose we extend one of the logics of Schwarzentruber et al. with dynamic operators, designed to capture communicative phenomena. A complete axiomatization for the new logic via reduction axioms is provided. Within the new framework, a distinction between actual persuasion and persuasion from the speaker's perspective is made. Finally, we explore the relationship between the two notions.
We study the relation between two existing formalisms: incomplete argumentation frameworks (IAFs) and epistemic logic of visibility (ELV). We show that the set of completions of a given IAF naturally corresponds to a specific equivalence class of possible worlds within the model of visibility. This connection is further strengthened in two directions. First, we show how to reduce argument acceptance problems of IAFs to ELV model-checking problems. Second, we highlight the epistemic assumptions that underlie IAFs by providing a minimal epistemic logic for IAFs.
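The completions-as-possible-worlds idea can be sketched as follows: each completion of an incomplete AF resolves the uncertain arguments one way, and possible/necessary acceptance quantify existentially/universally over completions. The use of grounded semantics and all names here are illustrative assumptions, not the paper's definitions.

```python
from itertools import chain, combinations

def completions(certain_args, uncertain_args, attacks):
    """Yield every completion: certain arguments plus any subset of
    the uncertain ones, with attacks restricted to present arguments."""
    for r in range(len(uncertain_args) + 1):
        for extra in combinations(sorted(uncertain_args), r):
            args = set(certain_args) | set(extra)
            atts = {(a, b) for (a, b) in attacks if a in args and b in args}
            yield args, atts

def grounded(args, atts):
    """Grounded extension via fixed-point iteration (as before)."""
    ext = set()
    while True:
        new = {a for a in args
               if all(any((d, b) in atts for d in ext)
                      for b in args if (b, a) in atts)}
        if new == ext:
            return ext
        ext = new

def possibly_accepted(x, certain, uncertain, attacks):
    # True in at least one completion (one possible world).
    return any(x in grounded(a, t)
               for a, t in completions(certain, uncertain, attacks))

def necessarily_accepted(x, certain, uncertain, attacks):
    # True in every completion (every possible world).
    return all(x in grounded(a, t)
               for a, t in completions(certain, uncertain, attacks))
```

With certain arguments `{a, c}`, uncertain argument `b`, and attacks `a → b → c`, the argument `c` is necessarily accepted: either `b` is absent, or `b` is present and defeated by `a`.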
Argument evaluation, one of the central problems in argumentation theory, consists in studying what makes an argument a good one. This paper proposes a formal approach to argument evaluation from the perspective of justification logic. We adopt a multi-agent setting, accepting the intuitive idea that arguments are always evaluated by someone. Two general restrictions are imposed on our analysis: non-deductive arguments are left out and the goal of argument evaluation is fixed: supporting a given proposition. Methodologically, our approach uses several existing tools borrowed from justification logic, awareness logic, doxastic logic and logics for belief dependence. We start by introducing a basic logic for argument evaluation, where a list of argumentative and doxastic notions can be expressed. Later on, we discuss how to capture the mentioned form of argument evaluation by defining a preference operator in the object language. The intuitive picture behind this definition is that, when assessing a pair of arguments, the agent puts them to a test consisting of several criteria (filters). As a result of this process, a preference relation among the evaluated arguments is established by the agent. After showing that this operator suffers from a special form of logical omniscience, called preferential omniscience, we discuss how to define an explicit version of it, more suitable for dealing with non-ideal agents. The present work exploits the formal notion of awareness in order to model several informal phenomena: awareness of sentences, availability of arguments and communication between agents and external sources (advisers). We discuss several extensions of the basic logic and offer completeness and decidability results for all of them.
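The filter-based picture of evaluation can be sketched as follows, under the assumption (ours, not the paper's) that criteria are Boolean tests applied in a fixed priority order, with the first criterion that separates the two arguments settling the agent's preference.

```python
# Hypothetical sketch of the 'filters' idea; all names are illustrative.

def prefers(agent_filters, arg1, arg2):
    """Return the preferred argument, or None if the filters do not
    discriminate between them."""
    for criterion in agent_filters:
        p1, p2 = criterion(arg1), criterion(arg2)
        if p1 and not p2:
            return arg1
        if p2 and not p1:
            return arg2
    return None

# Example criteria: all premises are believed; the source is trusted.
believed = {"p", "q"}
trusted_sources = {"expert"}
filters = [
    lambda arg: set(arg["premises"]) <= believed,
    lambda arg: arg["source"] in trusted_sources,
]
```

For instance, an argument whose premises are all believed is preferred to one resting on an unbelieved premise, regardless of the later source criterion; two arguments passing the same filters yield no preference, mirroring the fact that the established relation is partial.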
The notion of argumentation and the one of belief stand in a problematic relation to one another. On the one hand, argumentation is crucial for belief formation: as the outcome of a process of arguing, an agent might come to (justifiably) believe that something is the case. On the other hand, beliefs are an input for argument evaluation: arguments with believed premisses are to be considered by the agent as strictly stronger than arguments whose premisses are not believed. An awareness epistemic logic that captures qualified versions of both principles was recently proposed in the literature. This paper extends that logic in three different directions. First, we try to improve its conceptual grounds, by spelling out its philosophical foundations, critically discussing some of its design choices and exploring further possibilities. Second, we provide a (heretofore missing) completeness theorem for the basic fragment of the logic. Third, we study, using techniques from dynamic epistemic logic, how different forms of information change can be captured in the framework.
We introduce a multi-agent, dynamic extension of abstract argumentation frameworks (AFs), strongly inspired by epistemic logic, where agents have only partial information about the conflicts between arguments. These frameworks can be used to model a variety of situations. For instance, those in which agents have bounded logical resources and therefore fail to spot some of the actual attacks, or those where some arguments are not explicitly and fully stated (enthymematic argumentation). Moreover, we include second-order knowledge and common knowledge of the attack relation in our structures (where the latter accounts for the state of the debate), so as to reason about different kinds of persuasion and about strategic features. This version of multi-agent AFs, as well as their updates with public announcements of attacks (more concretely, the effects of these updates on the acceptability of an argument), can be described using S5-PAL, a well-known dynamic-epistemic logic. We also discuss how to extend our proposal to capture arbitrary higher-order attitudes and uncertainty.
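A toy version of the partial-view idea and its announcement update can be sketched as follows, assuming each agent's view is simply a subset of the real attack relation (the S5-PAL machinery described in the abstract is richer; all names below are illustrative).

```python
def defended_in(view_attacks, arguments, x):
    """Is x in the grounded extension of the AF this agent currently sees?"""
    ext = set()
    while True:
        new = {a for a in arguments
               if all(any((d, b) in view_attacks for d in ext)
                      for b in arguments if (b, a) in view_attacks)}
        if new == ext:
            break
        ext = new
    return x in ext

def announce(views, attack):
    """Public announcement of an attack: every agent adds it to her view."""
    return {agent: atts | {attack} for agent, atts in views.items()}
```

For example, if Ann is unaware of the attack from `a` on `b`, she accepts `b`; after the attack is publicly announced, `b` is no longer acceptable from any agent's perspective, which is the kind of effect on acceptability the abstract describes.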