We shouldn't let killer robots target development

On Day Two of International Development Week, let’s change it up and look at a weapon that is deceptively futuristic. Autonomous weapons, or killer robots, aren’t here yet, but their threat to sustainable development is. Killer robots pose a serious threat to countries and individuals around the world, many of which will never have the technology to defend against these weapons. You may think the impact of autonomous weapons won’t be felt until they are actually used, but that is not the case. UN Sustainable Development Goal #10 calls for reduced inequalities; the development of autonomous weapons will only deepen existing inequalities and create new ones.

Scientists see real potential for robots and AI to help with sustainable development, but if autonomous weapons are created, many people will lose trust in AI that could otherwise be used for helpful purposes. We aren’t advocating for the end of artificial intelligence and robots - we are saying there is a clear line between what robots should and shouldn’t be able to do, and killing people crosses that line. Technology can help the world in many ways, including by furthering the Sustainable Development Goals. If people lose trust in this technology because it is also being used to kill them, we lose a chance to advance development.

Further, the amount governments spend on military equipment is already exorbitant. Add killer robots to the arsenal, and the costs will be astronomical. In 2022, the Canadian Department of National Defence announced 8 billion dollars in new military spending over the next five years. We don’t want to see that number grow to pay for autonomous weapons when the money could be better spent on the Sustainable Development Goals. The SDGs have a real and direct impact on people’s lives, including Canadians and especially Indigenous communities in Canada. Developing autonomous weapons would be a distraction from the good work that can be done to improve the lives and livelihoods of people across Canada and the world. Let’s invest our money in improving lives and meeting development goals, not in creating a dangerous new weapon.

Finally, one of the main concerns Stop Killer Robots has with autonomous weapons is the discrimination that would be built into the artificial intelligence systems behind them, also known as “algorithmic bias”. Although artificial intelligence can act on its own, it is still programmed and created by humans. Humans have biases, and without a diverse team that takes differences into account, those biases are inevitable in the technology. People with disabilities worry that AI would fail to interpret their movements correctly and flag them as a threat. Racialized communities worry that AI would read their darker skin colour as a threat. This is already happening: AI in use today (e.g. motion sensors and photo-labelling software) often fails to recognize darker skin. If AI already deployed at mass scale has these biases, it is unlikely that autonomous weapons will be developed without them. This brings us back to Sustainable Development Goal #10, which strives to reduce inequalities. If killer robots are more likely to target a disabled or darker-skinned person, existing inequalities will be taken to a new and deadly level, impeding sustainable development. From an international security perspective, autonomous weapons will also widen inequalities between states that have these weapons and states that do not.

Canada, we shouldn’t let killer robots target sustainable development. Let’s push for a treaty banning autonomous weapon systems!