Abstract

An error in the operation of an autonomous weapon system (AWS) results in civilians or civilian objects being attacked. In such situations, have civilians or civilian objects been ‘made the object of attack’, such that there is a breach of the rule prohibiting attacks against civilians or civilian objects? This question, which is important because of the high probability of such errors, forms the subject of this article. The article argues that the rule prohibiting attacks against civilians or civilian objects requires due diligence, that is, contextually reasonable efforts across the targeting process, to ensure that civilians or civilian objects are not attacked. On this view, AWS errors breach the rule if they are unreasonable, that is, if they originate in a failure of due diligence at any point in the development or deployment of the AWS. Moreover, because due diligence obligations are risk-sensitive, a higher degree of risk in the development and use of an AWS entails a corresponding increase in what counts as contextually reasonable efforts to ensure that civilians or civilian objects are not attacked.
