Mines Action Canada’s Opening Statement at Convention on Conventional Weapons

Mines Action Canada delivered an opening statement at the Convention on Conventional Weapons this afternoon. The text of the statement is available online here.

CCW opening statement

Opening Statement – Convention on Conventional Weapons, 13 April 2015

Thank you, Mr. Chair. I appreciate the opportunity to speak on behalf of Mines Action Canada. Mines Action Canada is a Canadian disarmament organization that has been working to reduce the impact of indiscriminate weapons for over twenty years. For years we have worked with partners around the world, including here at the CCW, to respond to the global crisis caused by landmines and cluster munitions. We have seen that the international community can come together to respond to a humanitarian catastrophe and can create international humanitarian law to protect civilians, though often after the fact, due to the changing nature of conflict and technological advances. However, we are here today in an attempt to look forward. We are looking at future weapons that will require new legal instruments to prevent future catastrophes. Throughout this week, I hope we will keep my grandmother's advice in mind: an ounce of prevention is worth a pound of cure.

As a co-founder of the Campaign to Stop Killer Robots, Mines Action Canada is very conscious of public opinion concerning autonomous weapons systems.  Since last year’s discussions here at the CCW, opposition to autonomous weapons systems has grown in Canada.  In addition to our member organizations, academics, parliamentarians, industry, faith communities and members of the general public have expressed concern about the potential humanitarian impacts of autonomous weapons systems.  The widespread opposition to this technology indicates that there may be negative consequences for robotics more generally should autonomous weapons systems be used in armed conflict or in other circumstances.  The erosion of public trust in robotic systems and autonomy as a result of the use of autonomous weapons systems could severely limit our ability to harness the good that robotics could do for humanity.

In addition to these concerns about the impact on public trust in robotics, we have numerous legal, moral, ethical, technical, military, political and humanitarian concerns about autonomous weapons systems, which have led us to the conclusion that new international humanitarian law is needed to ensure meaningful human control over these and other weapons. There is a moral imperative to consider the long-term effects of the development and deployment of autonomous weapons systems on human society. Proponents of these technologies cite possible battlefield benefits, yet a discussion dealing only with short-term or battlefield effects is not enough. We must ask the difficult questions: is it acceptable to cede decisions over life and death in conflict to machines? Who would be accountable for autonomous weapons systems? How can IHL adapt when new technology blurs the line between combatant and weapon?

IHL has demonstrated an ability to adapt and evolve to prevent the development and deployment of new and unnecessarily harmful technology. CCW Protocol IV, which bans blinding laser weapons, is a good example: it demonstrates not only that there is a need to add to IHL to address new technology, but also that we can prevent the development and use of weapons before their unacceptable humanitarian consequences create a catastrophe. We have published a memo to delegates which further explores the lessons learned from Protocol IV.

Autonomous weapons systems are not your average new weapon; they have the potential to fundamentally alter the nature of conflict. As a "game changer," autonomous weapons systems deserve a serious and in-depth discussion. We hope that this week will see attempts to define meaningful human control and will foster a strong desire to pursue discussions towards a new legal instrument that places meaningful human control at the centre of all decisions to use violent force.

Thank you.