James Haller Moor[1] (1942 – September 11, 2024)[2] was an American ethicist and moral philosopher, especially known for his pioneering work in computer ethics. He spent most of his career at Dartmouth College, where he was the Daniel P. Stone Professor of Intellectual and Moral Philosophy.
Education and career
Moor studied mathematics at the Ohio State University, where he obtained a bachelor's degree in 1965. He went on to study philosophy at the University of Chicago, where he obtained a master's degree. He worked as a teaching fellow at Findlay College before embarking on further studies at Indiana University Bloomington, where he earned his Ph.D. in the philosophy of science in 1972. His thesis, titled Computer Consciousness, was supervised by Wesley C. Salmon. Moor joined Dartmouth College in the same year as an assistant professor in philosophy. He became an associate professor in 1978 and a professor in 1985.[3] Moor's 1985 paper "What is Computer Ethics?" established him as one of the pioneering theoreticians in the field of computer ethics.[4][5] From 2009, Moor was the Daniel P. Stone Professor of Intellectual and Moral Philosophy at Dartmouth, a title he held until his death.[6]
Moor was the editor-in-chief of Minds and Machines (2001-2010), a peer-reviewed academic journal covering artificial intelligence, philosophy, and cognitive science.[7]
Computer ethics
In his work on machine ethics, Moor distinguished four kinds of ethical agents:
Ethical impact agents: machine systems that have an ethical impact, whether intended or not. Moor gives the example of a watch causing a worker to be at work on time. Alongside ethical impact agents there are unethical impact agents, and an agent can be an unethical impact agent at certain times and an ethical impact agent at others. He gives the example of what he calls a 'Goodman agent', named after the philosopher Nelson Goodman. The Goodman agent compares dates; "this was generated by programming yearly dates using only the last two digits of the year, which resulted in dates beyond 2000 being misleadingly treated as earlier than those in the late twentieth century. Thus the Goodman agent was an ethical impact agent before 2000, and an unethical impact agent thereafter." (A brief sketch of this date error follows this list.)
Implicit ethical agents: machines constrained to avoid unethical outcomes.
Explicit ethical agents: machines that have algorithms enabling them to act ethically.
Full ethical agents: machines that are ethical in the same way humans are (i.e. possessing free will, consciousness, and intentionality).
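The date error behind the Goodman agent is, in effect, the two-digit-year comparison problem familiar from the Y2K bug. The following minimal Python sketch (with hypothetical function and variable names, not drawn from Moor's own writing) illustrates how comparing only the last two digits of a year misorders dates after 2000:

    # Minimal sketch of a Goodman-agent-style comparison, assuming years
    # are stored using only their last two digits.
    def is_earlier(two_digit_year_a: int, two_digit_year_b: int) -> bool:
        # Compares years by their two-digit representations only.
        return two_digit_year_a < two_digit_year_b

    # 1999 is stored as 99; 2001 is stored as 1.
    print(is_earlier(1, 99))   # True: 2001 is misleadingly treated as earlier than 1999
    print(is_earlier(99, 1))   # False: 1999 is treated as not earlier than 2001

Before 2000 such a comparison gives correct results, which is why Moor classes the same agent as an ethical impact agent before 2000 and an unethical impact agent thereafter.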
Moor criticised Asimov's Three Laws of Robotics, arguing that if applied thoroughly they would produce unexpected results. He gave the example of a robot roaming the world trying to prevent harm to all humans.
Selected publications
Bynum, Terrell Ward; Moor, James, eds. (2000). The digital phoenix: how computers are changing philosophy (Revised ed.). Oxford: Blackwell. ISBN 978-0-631-20352-0.
Moor, James; Bynum, Terrell Ward, eds. (2002). Cyberphilosophy: the intersection of philosophy and computing. Metaphilosophy series in philosophy. Oxford: Blackwell. ISBN 978-1-4051-0073-1.