Porter, Zoe ORCID: https://orcid.org/0000-0002-4467-3288 (2021) Moral responsibility for unforeseen harms caused by autonomous systems. PhD thesis, University of York.
Abstract
Autonomous systems are machines which embody Artificial Intelligence and Machine Learning and which take actions in the world, independently of direct human control. Their deployment raises a pressing question, which I call the 'locus of moral responsibility' question: who, if anyone, is morally responsible for a harm caused directly by an autonomous system? My specific focus is moral responsibility for unforeseen harms.
First, I set up the 'locus of moral responsibility' problem. Unforeseen harms from autonomous systems create a problem for what I call the Standard View, rooted in common sense, that human agents are morally responsible. Unforeseen harms give credence to the main claim of ‘responsibility gap’ arguments – that humans do not meet the control and knowledge conditions of responsibility sufficiently to warrant such an ascription.
Second, I argue that a delegation framework offers a powerful route for answering the 'locus of moral responsibility' question. I argue that responsibility as attributability traces to the human principals who made the decision to delegate to the system, notwithstanding a later suspension of control and knowledge. These principals would also be blameworthy if their decision to delegate did not serve a purpose that morally justified the subsequent risk-imposition in the first place. Because I argue that different human principals share moral responsibility, I defend a pluralist Standard View.
Third, I argue that, while today’s autonomous systems do not meet the agential condition for moral responsibility, it is neither conceptually incoherent nor physically impossible that they might. Because I take it to be a contingent and not a necessary truth that human principals exclusively bear moral responsibility, I defend a soft, pluralist Standard View.
Finally, I develop and sharpen my account in response to possible objections, and I explore its wider implications.
Metadata
Supervisors: Holland, Stephen and Noordhof, Paul
Keywords: Moral responsibility; autonomous systems
Awarding institution: University of York
Academic Units: The University of York > Philosophy (York)
Identification Number/EthosID: uk.bl.ethos.850002
Depositing User: Zoe Porter
Date Deposited: 08 Mar 2022 17:01
Last Modified: 21 Apr 2022 09:53
Open Archives Initiative ID (OAI ID): oai:etheses.whiterose.ac.uk:30233
Download
Examined Thesis (PDF)
Filename: Porter_203056268_CorrectedThesisClean.pdf
Licence: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.