Delivered by Paul Hannon, Executive Director
Thank you, Mr. Chairman. As a co-founder of the Campaign to Stop Killer Robots and a long-time advocate for humanitarian disarmament, Mines Action Canada supports the statement delivered by the Campaign’s Coordinator.
In many ways, 2017 was a lost year for efforts here to prohibit autonomous weapons, so we are hoping to see significant progress at the CCW in 2018.
Outside of these walls, though, the conversation about autonomous weapons progressed at the end of 2017 and the start of 2018.
In November, over 200 Canadian Artificial Intelligence experts released an open letter to Prime Minister Justin Trudeau calling for Canadian leadership on autonomous weapons systems. These Canadian experts are still waiting for a response from the government of Canada. Similar national letters have been released in Australia and Belgium.
Two weeks ago, the G7 Innovation Ministers released a Statement on Artificial Intelligence which cited the need to increase trust in AI and included a commitment to “continue to encourage research, including […] examining ethical considerations of AI.”
This week should provide an opportunity for states to share and expand on their positions with regard to autonomous weapons systems and the need for meaningful human control. States should not overlook the ethical, humanitarian and human rights concerns about autonomous weapons systems as we delve into some technical topics.
Mr. President, CCW protocols have a history of addressing the ethical and humanitarian concerns about weapons. Protocol IV on blinding laser weapons is particularly relevant to our discussions. As a pre-emptive prohibition on an emerging technology motivated by ethical concerns, Protocol IV has been very effective in preventing the use of an abhorrent weapon without limiting the development of laser technology for other purposes, including other military purposes. It is important to note that Protocol IV has among the widest membership of all the protocols, including all five permanent members of the United Nations Security Council, all the states that have chaired the autonomous weapons talks here at the CCW, and most of the states that have expressed views about autonomous weapons. All those states are party to a Protocol that banned, for ethical reasons, a weapon before it was ever deployed in conflict.
Above all, we hope that the states present this week will reflect on the concept of responsibility. The Government of Poland’s working paper which discusses this topic is a useful starting point. We see responsibility as a theme that runs throughout these discussions.
A Canadian godfather of Artificial Intelligence has often spoken of the need to pursue responsible AI. Responsible AI makes life better for society and helps “prevent the misuse of AI applications that could cause harm” as noted in the G7 Annex.
We have been entrusted with a great responsibility here in this room: the responsibility to set boundaries and prevent future catastrophes. We must be bold in our actions, or we could face a situation where computer programmers become de facto policy makers.
Above all, as part of our collective humanity, we must remain responsible for our actions. We cannot divest control to one of our creations, whether in our daily actions or, more crucially for this week’s discussion, in our decisions to use weapons.
In the past, those sitting in these seats have met their responsibility to “continue the codification and progressive development of the rules of international law applicable in armed conflict” by negotiating new protocols and, in the case of blinding laser weapons, a pre-emptive protocol. Now it is our turn, and this is our issue to address.