Moral Cognition and Computational Theory

MORAL PSYCHOLOGY: THE NEUROSCIENCE OF MORALITY, Walter Sinnott-Armstrong, ed., Vol. 3, MIT Press

Georgetown Public Law Research Paper No. 1029511

17 Pages · Posted: 13 Nov 2007


John Mikhail

Georgetown University Law Center

Abstract

In his path-breaking work on the foundations of visual perception, the MIT neuroscientist David Marr distinguished three levels at which any information-processing task can be understood and emphasized the first of these: "although algorithms and mechanisms are empirically more accessible, it is the top level, the level of computational theory, which is critically important from an information-processing point of view. The reason for this is that the nature of the computations that underlie perception depends more upon the nature of the computational problems that have to be solved than upon the particular hardware in which their solutions are implemented."

In this comment on Joshua Greene's essay, "The Secret Joke of Kant's Soul," I argue that a notable weakness of Greene's approach to moral psychology is its neglect of computational theory. A central problem that moral cognition must solve is to recognize (i.e., compute representations of) the deontic status of human acts and omissions. How do people actually do this? What is the theory that explains their practice?

Greene claims that emotional response predicts deontological judgment, but his own explanation of a subset of the simplest and most extensively studied of these judgments, trolley problem intuitions, in terms of a personal/impersonal distinction is neither complete nor descriptively adequate. In a series of influential papers, Greene argues that people rely on three features to distinguish the well-known Bystander and Footbridge problems: whether the action in question (a) could reasonably be expected to lead to serious bodily harm, (b) to a particular person or a member or members of a particular group of people, (c) where this harm is not the result of deflecting an existing threat onto a different party. Greene claims to predict trolley intuitions and patterns of brain activity on this basis. However, this explanation is incomplete, because we are not told how people manage to interpret the stimulus in terms of these features; surprisingly, Greene leaves this crucial first step in the perceptual process unanalyzed. Additionally, Greene's account is descriptively inadequate, because it cannot explain even simple counterexamples, let alone the countless real-life examples that can be found in any casebook of torts or criminal law. Hence Greene has not shown that emotional response predicts these moral intuitions in any significant sense. Rather, his studies suggest that some perceived deontological violations are associated with strong emotional responses, something few would doubt or deny. Moreover, a better explanation of these intuitions is available, one that grows out of the computational approach to cognitive science that Marr helped to pioneer.
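To make the structure of the three-feature criterion concrete, here is a minimal sketch in Python. It is an illustrative reconstruction of the personal/impersonal test as summarized above, not code from Greene's studies or from this paper; the class, field names, and example encodings are assumptions introduced only for exposition.

from dataclasses import dataclass

# Illustrative sketch of the three-feature "personal/impersonal" test
# described above. All names and encodings are hypothetical.

@dataclass
class Action:
    name: str
    serious_bodily_harm: bool       # (a) reasonably expected to cause serious bodily harm
    to_particular_victim: bool      # (b) harm falls on a particular person or group
    deflects_existing_threat: bool  # (c) harm results from deflecting an existing threat

def is_personal(a: Action) -> bool:
    # "Personal" iff (a) and (b) hold and the harm is not
    # merely a deflected pre-existing threat (c).
    return (a.serious_bodily_harm
            and a.to_particular_victim
            and not a.deflects_existing_threat)

# One natural encoding of the two standard cases:
footbridge = Action("Footbridge", True, True, False)  # pushing the man off the bridge
bystander = Action("Bystander", True, True, True)     # diverting the trolley

for case in (footbridge, bystander):
    print(case.name, "->", "personal" if is_personal(case) else "impersonal")

On this encoding the Footbridge action comes out "personal" and the Bystander action "impersonal," the pattern the criterion is meant to predict. The counterexamples discussed in the comment are cases where this classification and people's observed intuitions come apart, and the unanalyzed step is how people compute such feature assignments from the verbal stimulus in the first place.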

Keywords: Marr, Greene, Haidt, Chomsky, Bentham, moral cognition, moral grammar, computational theory, trolley problem, deontic judgment, deontic logic, act tree, battery

JEL Classification: D63, D64, K00, K13, K14

Suggested Citation

Mikhail, John, Moral Cognition and Computational Theory. MORAL PSYCHOLOGY: THE NEUROSCIENCE OF MORALITY, Walter Sinnott-Armstrong, ed., Vol. 3, MIT Press; Georgetown Public Law Research Paper No. 1029511. Available at SSRN: https://ssrn.com/abstract=1029511

John Mikhail (Contact Author)

Georgetown University Law Center

600 New Jersey Avenue, NW
Washington, DC 20001
United States
202-662-9392 (Phone)
202-662-9409 (Fax)


Paper statistics

Downloads: 1,139
Abstract Views: 8,492
Rank: 34,717