Agents with a human touch: modeling of human rationality in agent systems

Nawwab, Fahd Saud (2010) Agents with a human touch: modeling of human rationality in agent systems. Doctoral thesis, University of Liverpool.

Full text not available from this repository.

Abstract

Will it be possible to create a self-aware, reasoning entity with a capacity for decision making similar to that which we ascribe to human beings? Modern agent systems, although used today in many applications that call for intelligence, are not yet ready for critical or sensitive situations in which human rationality is usually the only acceptable basis for important decisions. This thesis is a contribution to this area: it introduces a decision-making methodology that addresses the characteristics an agent should have in order to be better trusted with such critical decisions.

The work begins with a study of the philosophical literature (Chapter 2), which reveals that trust is based on emotions and on faith in performance. The study concludes that a trustworthy decision has five main elements: it considers the available options and their likely effects; it predicts how the environment and other agents will react to decisions; it accounts for short- and long-term goals through planning; it accounts for uncertainty and incomplete information; and, finally, it considers emotional factors and their effects. The first four elements treat decision making as a product of "beliefs"; the last treats it as a product of "emotions". A complete discussion of these elements is provided in Section 2.1. Accordingly, the thesis is divided into two main parts: the first treats trust as a product of beliefs and the second treats trust as a product of emotions.

The first part builds the argumentation-based decision-making methodology as a five-step approach. First, the problem situation, representing the actions available to the agent and their likely consequences, is formulated. Next, arguments for performing these actions are constructed by instantiating an argumentation scheme designed to justify actions in terms of the values and goals they promote. These arguments are then subjected to a series of critical questions to identify possible counterarguments, so that all the options and their weaknesses are exposed. Preferences are accommodated by organising the resulting arguments into an argumentation framework; we use Value-Based Argumentation Frameworks (VAFs) for this purpose. The arguments acceptable to an agent are then identified through that agent's ranking of its values, which may differ from agent to agent.

In the second part (Chapters 5 and 6), this methodology is extended to account for emotions. Emotions are generated according to whether other agents relevant to the situation support or frustrate the agent's goals and values; the emotional attitude toward those agents then influences the ranking of the agent's values and hence the decision. In Chapters 4 and 6, the methodology is illustrated through an example study, which has been implemented and tested in a software prototype; the experimental data and screenshots are given in the appendix.
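As a rough illustration of the belief-based machinery described above, the following is a minimal sketch (not the thesis's own implementation) of evaluating a Value-Based Argumentation Framework under an audience's value ranking, together with a toy stand-in for the emotion step, in which attitudes toward other agents adjust that ranking. The arguments, values, and the `emotional_rerank` adjustment scheme are hypothetical.

```python
class VAF:
    """Minimal Value-Based Argumentation Framework: arguments attack one
    another, and each argument promotes a value."""

    def __init__(self, arguments, attacks, value_of):
        self.arguments = set(arguments)   # argument names
        self.attacks = set(attacks)       # (attacker, target) pairs
        self.value_of = value_of          # argument -> value it promotes

    def defeats(self, ranking):
        """An attack succeeds unless the target's value is strictly preferred
        to the attacker's value (smaller index in `ranking` = preferred)."""
        rank = {v: i for i, v in enumerate(ranking)}
        return {(a, b) for (a, b) in self.attacks
                if rank[self.value_of[b]] >= rank[self.value_of[a]]}

    def grounded_extension(self, ranking):
        """Iterate the characteristic function to its least fixed point:
        accept exactly those arguments all of whose defeaters are defeated."""
        defeats = self.defeats(ranking)
        accepted = set()
        while True:
            defeated = {b for (a, b) in defeats if a in accepted}
            new = {x for x in self.arguments
                   if all(a in defeated for (a, b) in defeats if b == x)}
            if new == accepted:
                return accepted
            accepted = new


def emotional_rerank(ranking, boost):
    """Toy stand-in for the emotion step: `boost` maps a value to a signed
    adjustment derived from attitudes toward other agents; values are
    re-sorted by base rank minus boost, so boosted values move up."""
    base = {v: i for i, v in enumerate(ranking)}
    return sorted(ranking, key=lambda v: base[v] - boost.get(v, 0))


# Hypothetical two-argument dilemma: A ("act immediately") promotes safety,
# B ("wait for consent") promotes autonomy, and each attacks the other.
vaf = VAF(arguments={"A", "B"},
          attacks={("A", "B"), ("B", "A")},
          value_of={"A": "safety", "B": "autonomy"})

print(vaf.grounded_extension(["safety", "autonomy"]))   # {'A'}
# A positive emotional attitude toward the agent whose consent is at
# stake boosts the autonomy value, and the decision flips:
print(vaf.grounded_extension(
    emotional_rerank(["safety", "autonomy"], {"autonomy": 2})))  # {'B'}
```

Under the safety-first ranking the mutual attack resolves in favour of A; boosting autonomy makes the same framework yield B instead. This mirrors the point above that the value ordering, and therefore the decision, may differ from agent to agent and shift with emotional state.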

Item Type: Thesis (Doctoral)
Uncontrolled Keywords: emotions; decision making; argumentation; trust; multi-agent systems
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
B Philosophy. Psychology. Religion > BF Psychology
Departments, Research Centres and Related Units: Academic Faculties, Institutes and Research Centres > Faculty of Science > Department of Computer Science
Refereed: Yes
Status: Unpublished
ID Code: 1363
Deposited On: 13 Jan 2011 10:10
Last Modified: 19 May 2011 13:21

