Today the first half of the 2023 Convention on Conventional Weapons (CCW) meeting on autonomous weapons ended. Once again, little to no progress was made towards a new legally-binding instrument on autonomous weapons. Despite growing international momentum for a treaty and excellent statements from some states this week, we are not hopeful those statements will translate into action because the structure of the CCW allows for just one state to block all progress. We need to get out of this stalemate, and make real progress towards a treaty. And you can help!
While we cannot do much about the structure of the CCW, we can call on Canada to lead negotiations under the United Nations General Assembly, and for that reason…
We are launching a new autonomous weapons petition to Parliament!
The new petition calls upon the Government of Canada to:
- "Prohibit the domestic development, importation and use of autonomous weapon systems that do not allow for meaningful human control.
- Develop national regulations so that other autonomous weapon systems will be used only with meaningful human control.
- Take an active leadership role in international negotiations to prohibit autonomous weapons systems through new international law under the auspices of the United Nations General Assembly or another inclusive multilateral forum.”
If you want our government to take action against killer robots, print out the attached paper petition and encourage 25 people to sign. You can ask your friends, family, co-workers, neighbours - anyone who doesn’t want to see a world where autonomous weapons exist.
Once you have 25 signatures, you can mail the petition to your Member of Parliament for them to present in the House of Commons. If you do not know who your MP is, you can find their name and phone number here. It’s always a good idea to call their constituency office to ask if they will support the petition. If they won’t or you do not want to call them, that’s ok. Elizabeth May, MP for Saanich-Gulf Islands, is sponsoring this petition and will present any submissions in the House of Commons.
Mailing letters to MPs is free; no stamp is required. Just put the MP’s name and the following address on your envelope and drop the letter in your local mailbox.
House of Commons
Ottawa, Ontario K1A 0A6
Petitions are a great way for supporters, like you, to get involved with our work on killer robots. By signing and sharing a petition, you are making your voice heard to decision-makers. Members of Parliament want to know what their constituents care about, and petitions are an excellent way of showing them that Canadians care about banning autonomous weapons!
We are doing this as a paper petition and not an e-petition because an e-petition is presented to Parliament only once, while a paper petition can be presented every time at least 25 people sign a copy.
Get involved with Mines Action Canada by sharing this petition and collecting 25 signatures. Together, we can show the Government of Canada that Canadians are saying NO to autonomous weapons.
Don’t think you can get 25 signatures? That’s ok, get as many as you can and send them in to MP Elizabeth May. Her office can combine your signatures with another petition to make sure your voice is heard!
On Day Two of International Development Week, let’s change it up and look at a weapon that is deceptively futuristic. Autonomous weapons or killer robots aren’t here yet, but their threat to sustainable development is. Killer robots pose a serious threat to countries and individuals around the world, many of which will not have the technology to counter these weapons. You may think the impact of autonomous weapons will not be felt unless they are used, but you might be surprised to know that is not the case. UN Sustainable Development Goal #10 is reduced inequalities; the development of autonomous weapons will only increase existing inequalities as well as create new ones.
Scientists see the potential for robots and AI to help with sustainable development, but if autonomous weapons are created then many people will lose trust in AI which could otherwise be used for helpful purposes. We aren’t advocating for the end of artificial intelligence and robots - we are saying there is a clear line of what robots should and shouldn’t be able to do, and killing people crosses that line. Technology is helpful, and can help the world in many different ways including by furthering the Sustainable Development Goals. If people lose trust in this technology because they are also being killed by it, we lose a chance to advance development.
Further, the amount that governments spend on military equipment is already exorbitant. Add killer robots to the arsenal, and the costs are going to be astronomical. In 2022, the Canadian Department of National Defence announced 8 billion dollars of military spending over the next five years. We don’t want to see this number increase to pay for autonomous weapons when this money can be better spent investing in the Sustainable Development Goals. The SDGs have a real and direct impact on people’s lives, including Canadians and especially Indigenous populations in Canada. Developing autonomous weapons would be a distraction from good work that can be done to improve the lives and livelihoods of people across Canada, and the world. Let’s invest our money in improving lives and meeting development goals, not in creating a dangerous weapon.
Finally, one of the main concerns Stop Killer Robots has with autonomous weapons is the discrimination that will be built into the artificial intelligence system, also known as “algorithmic bias”. Although artificial intelligence has the ability to act on its own, it has still been programmed and created by humans. Humans have biases, and without a diverse team that takes differences into account, biases are inevitable. There are concerns from the disability community that AI would not detect their movements correctly and would view them as a threat. There are concerns from racialized communities that AI would recognize their darker skin colour as a threat. This is already happening: AI in use right now (e.g. motion sensors and photo labeling) often fails to recognize darker skin. If AI already deployed at mass scale has biases, it is unlikely that autonomous weapons will be developed without them. This brings us to Sustainable Development Goal #10, which strives to reduce inequalities. If killer robots are programmed in ways that make them more likely to target a disabled or darker-skinned person, existing inequalities will be taken to a new and deadly level, impeding sustainable development. From an international security perspective, autonomous weapons will also increase inequalities between states that have these weapons and states that do not.
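The sensor and photo-labeling failures described above can be sketched in a toy simulation. Everything here is invented for illustration - the single “feature”, the group sizes, and the one-threshold “model” are all hypothetical stand-ins for a real detector - but the mechanism is the same: a system that learns from training data dominated by one group performs far worse on the under-represented group.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sensor: "detect a person" from a single feature value.
# Group A dominates the training data (900 samples vs 100),
# and group B's feature distribution is shifted lower.
def sample(group, n):
    centre = 0.8 if group == "A" else 0.3  # hypothetical feature means
    return rng.normal(centre, 0.1, n)

train = np.concatenate([sample("A", 900), sample("B", 100)])

# The "model" is a single learned threshold: anything below half the
# mean training signal is treated as background, not a person.
threshold = train.mean() / 2

def detected(x):
    return x > threshold

# Evaluate detection rates per group on fresh samples.
rate_a = detected(sample("A", 1000)).mean()
rate_b = detected(sample("B", 1000)).mean()
print(f"detection rate, group A: {rate_a:.0%}")
print(f"detection rate, group B: {rate_b:.0%}")
```

Because the threshold was fitted almost entirely to group A’s data, the detector finds group A reliably while missing group B most of the time - no malice required, just an unrepresentative dataset.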
Canada, we shouldn’t let killer robots target sustainable development. Let’s push for a treaty banning autonomous weapon systems!
Learn more about autonomous weapons and the Campaign to Stop Killer Robots.
- Killer Robots: Questions and Answers
- A Precedent for Prevention – Canada
- A Precedent for Prevention – International
For years, “killer robots” have been a recurring figure in entertainment and fiction. Over the past decade, the idea that fully autonomous weapons could exist has moved closer to reality. Indeed, we have recently seen a significant increase in weapons that do not require human control, which have considerably changed the face of war. New technologies are enabling serious advances in the development of fully autonomous weapons. These robotic weapons would be able to select and fire on their targets on their own, without any human intervention. That capability would pose a fundamental challenge to the protection of civilians on the one hand, and to compliance with human rights law and international humanitarian law on the other. It is also important to be clear that fully autonomous weapons are not drones: drones are piloted remotely by a human being, while fully autonomous weapons are not, and their implications therefore go well beyond those of armed drones.
Although the question of when fully autonomous weapons will be available is still debated, we know that some highly advanced military powers, including China, Israel, Russia, the United Kingdom and the United States, are increasingly turning to technologies that give machines greater autonomy in combat. If some states deploy fully autonomous weapons, others may feel compelled to maintain or adjust their arsenals to match, which risks leading to a robotic arms race. Action is needed now to establish control over these weapons, before investment, technological progress and new military doctrine make the course of events difficult to change.
Allowing machines to make decisions over the life or death of individuals crosses a fundamental moral line. Autonomous robots would lack the judgment and skill needed to understand the context in which they operate. Those qualities are essential for making complex ethical choices on a dynamic battlefield, for distinguishing between soldiers and civilians, and for assessing the proportionality of an attack. As a result, fully autonomous weapons would not meet the requirements of the laws of war.
Replacing human troops with machines could also make the decision to go to war easier, increasing the burden that conflict places on civilians. Moreover, the use of fully autonomous weapons would create an accountability gap, since there are no clear rules about who would bear responsibility for a robot’s actions: the commander, the programmer, the manufacturer, or the robot itself? Without a clearly responsible party, all of these actors would have less incentive to ensure that robots pose no threat to civilians, and victims would struggle to see anyone punished for the harm done to them.
The Convention on Conventional Weapons is meeting this week for its 6th Review Conference at the United Nations in Geneva.
This meeting happens every five years and offers states the opportunity to assess progress made under this treaty and to set plans for the next five years.
Today, MAC's Military Advisor delivered our general statement at the Review Conference of the Convention on Conventional Weapons commenting on autonomous weapons, incendiary weapons and the protocols on landmines and explosive remnants of war.
Building on our 2019 statement, MAC asked states whether they will take a direct route towards peace and disarmament or continue to wander aimlessly through the diplomatic woods.
Read the full statement here.
By Tara Osler
There are countless issues facing youth worldwide today. We are on the forefront of several causes – climate change, gun violence, women’s rights, racial equality, freedom of expression, and economic issues all have made headlines recently as young people took to the streets in droves as advocates for their futures. My generation has the ability to plan and instigate mass demonstrations (due to our access to internet communication networks globally), and we have put our resources to good use. Even during the pandemic, youth demonstrations have continued in several countries, both in-person and over social media. Our access to communications technology and higher education is a privilege no previous generation has benefited from, and now more than ever young people are using it to organize and make their voices heard.
However, being the most technologically-advanced generation is a double-edged sword. Our personal data is permanently available on the internet. For many of us, this started in childhood, long before there were any legal regulations restricting the use of data on the internet. Today, our virtual vulnerability is beginning to inform the issues most important to youth. There is still a relative lack of legal protections that could keep our personal data safe, meaning we are at risk of all manner of technological threats. This year, activists in several countries were tracked using social media data and facial recognition software after attending protests. It is a very dystopian reality: our faces do not belong to us, and can be used against us at any time. I still remember going to Fridays for the Future marches in 2019 – not carrying identification or money, wearing a hat and large sunglasses. As a white woman living in Vancouver, Canada, I am certainly not at the same level of risk as my fellow protestors in the United States or Hong Kong – but the looming threat of data still hangs over me. I (along with all other young activists) live with the reality that I cannot be anonymous – a fact that makes many of us hesitant to speak freely about causes that matter to us. This vulnerability has also become uncomfortably apparent in light of recent developments in weapons manufacturing.
The production of fully autonomous weapons systems (or “killer robots”) is fundamentally an issue of human rights in a digital age. The potential hazards of these systems are beyond comprehension – and beyond the scope of international law. For my generation, who have lived our entire lives with the internet, this is a serious issue, the severity of which we are only beginning to grasp.
In December of 2020, youth activists from 20 countries gathered for a virtual conference, to share their individual concerns regarding the development of killer robots. As the youth representative for Canada, I logged on to my laptop at 2:00am to join the panel, which encompassed 10 different time zones (mine being the biggest time difference). Though we suffered, at times, through internet connectivity issues and the general inferno that is the Zoom webinar platform, we persevered. I believe we all understood it was the only way – though the impact of a virtual conference may be lesser than that of a physical event, we felt a certain sense of duty to at least try.
Several risks were raised – the representative from Argentina shared her fear that Indigenous communities would be oppressed further by governments and private corporations armed with killer robots. The representative from Vietnam expressed concern over the perilous situation facing his country in the South China Sea, and the danger they would face should other militaries in the region implement these weapons. Another concern, raised by the representative from the United States, was focused on the financial cost of producing the weapons in his country, where rates of unemployment and poverty are climbing. Several of the concerns raised by my fellow speakers reflect other issues that matter to our generation: the rights of minority communities (who will be at great risk if their oppressor gains access to these weapons) and freedom of expression (concerns have been raised over the safety of activists should their governments implement killer robots).
One of the concerns raised was the central point in my own speech: the issue of technological advancement and legality. Currently, international law does not have any precedent that could address the legality of killer robots. Technological development worldwide continues to be somewhat of a legal “no man’s land”, in that much of it has occurred without any restrictions to limit its scope. For killer robots, this means there is nothing regulating their development or implementation as of yet. Combined with the lack of legal regulations over the use of personal data, the risk becomes clearer. Killer robots will almost certainly rely on artificial intelligence to function, and artificial intelligence requires data to “think”. For those of us whose personal data has been available on the internet since we were children, this is a massive risk to our safety and freedom.
The most common thread among the speeches from my fellow activists was the act of calling upon our governments to take the threat of killer robots seriously. We feel that, like many issues youth are facing today, it is pushed aside in favour of more immediate concerns. Like climate change, the dangers of killer robots seem far-off to many people – they will happen someday, but that day is part of a distant, imagined future that does not concern them. For my generation, that future does not feel so distant. If killer robots are implemented as far in the future as 2040, I’ll be forty-one years old – young enough to be middle-aged, and spend the last half of my life in a world with autonomous weapons. Like climate change, the existence of killer robots is closer than we think, and young people like me know that it will affect us.
Attending the youth conference gave me a strange sense of optimism. For many of us, 2020 was a difficult year. I felt the weight of it on my shoulders as I sat in my desk chair in the middle of night to sign into Zoom, knowing that had things been different I might have been able to travel and see my fellow speakers in person. Yet somehow, I felt lighter when I finally closed my laptop screen at 6:00am. I think it was the lightness that comes from being among likeminded people – from not being alone. I was in a (virtual) room full of people my age who know about an issue that I am passionate about, and who are just as passionate as I am. The lack of human connection that comes with virtual events was a barrier, but I think we broke through it. To me, the youth conference felt like the start of something – I don’t know exactly what, but I like to think it will be good.
Tara Osler was Mines Action Canada’s Summer 2020 Research and Communications Assistant and is currently studying at the University of British Columbia.
Delivered to the Convention on Conventional Weapons' Group of Governmental Experts on Lethal Autonomous Weapons Systems by Executive Director, Paul Hannon, from our office in Ottawa via the online platform Interprefy.
Thank you Chair and thank you to Germany and the ODA for their support in permitting remote participation. I would like to acknowledge that I am speaking to you today from the traditional and unceded territory of the Algonquin people. Recognizing the indigenous nations upon whose land we are working is an excellent reminder of the need to ensure that these discussions are inclusive and grounded in humanity.
As a member of the Campaign to Stop Killer Robots, Mines Action Canada encourages the High Contracting Parties to ensure that CCW continues to draw on the expertise of civil society and the private sector. Civil society and the private sector have made significant contributions to the discussion since the beginning, and High Contracting Parties have frequently commented, including today, on the importance of those contributions. Our role should be safeguarded in any future work streams to ensure that all these discussions are inclusive.
To keep our conversations grounded in humanity, we recommend adding a work stream on moral or ethical concerns. A technocratic debate is insufficient to deal with the challenges posed by autonomous weapons systems. We need to be able to answer questions like “how can one test the humanity of an algorithm?” or “what is the relationship between explainability and ethics?” Explainability should never be treated as a synonym for ethical.
We are pleased to hear many delegations express a desire to move on from the discussion of definitions and characteristics. We note that CCW Protocol IV does not contain a definition of a blinding laser weapon. It prohibits “laser weapons specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness to unenhanced vision, that is to the naked eye or to the eye with corrective eyesight devices.” While Article 4 defines permanent blindness, the Protocol contains no definition of a laser weapon. That lack of a definition obviously did not prevent negotiations, nor did it stop the Protocol from being effective.
We appreciate the robust debate this week and would like to direct specific attention to Austria’s comments outlining the need to show that this GGE is not an isolated diplomatic silo. The work here must reflect the situation outside of CCW, where scientists, experts and industry are calling for action, where the public wants to prohibit autonomous weapons, and where political leaders are stepping up. Ambitious guidance at the political level, such as the mandate letter for Canada’s Minister of Foreign Affairs, which instructs him to “Advance international efforts to ban the development and use of fully autonomous weapons systems”, is not being matched by ambition in the CCW.
New international law is needed to address the multitude of concerns with autonomous weapon systems. It is time to negotiate another legally binding instrument, either here or elsewhere. That instrument should include:
- A general obligation to maintain meaningful human control over the use of force;
- Prohibitions on weapons systems that select and engage targets and by their nature pose fundamental moral or legal problems; and
- Specific positive obligations to help ensure that meaningful human control is maintained in the use of all other systems that select and engage targets.
Mines Action Canada appreciates the guiding questions put forward by the chair in his non-paper and we would like to present some additional questions for delegates to consider today and in future meetings:
- Considering what you have heard about data bias, are these conversations inclusive?
- Do our statements reflect the public conscience and political will of our citizens?
- Are we being as ambitious as those inventing new technology?
- What or who is missing from these conversations?
- Will a future generation of diplomats need to negotiate a treaty to protect the rights, lives and livelihoods of civilian victims of autonomous weapons systems because this generation did not seize the chance to negotiate a pre-emptive ban?
We do not want to be the people who let the world sleepwalk into another humanitarian crisis. It is time for ambition and for taking the next step.
Right before the holidays, the Prime Minister's Office published the mandate letters for all the Cabinet Ministers and from Mines Action Canada's perspective there are a couple very interesting items in these letters. With Parliament resuming in less than two weeks, let's dig into the mandate letters and see what we can find.
First is the big news: the mandate letter for the Minister of Foreign Affairs, François-Philippe Champagne, includes the instruction to "advance international efforts to ban the development and use of fully autonomous weapons systems". You read that right - Canada's Foreign Minister is to help ban the development and use of killer robots. That is pretty big news. Canada has been waffling on the issue of autonomous weapons for years now. In diplomatic talks at the United Nations, Canada would occasionally give a statement on the importance of international humanitarian law and the role of weapons reviews in preventing the use of indiscriminate weapons, but no one would consider Canada a leader on this issue. Now Canada needs to join the likes of Austria, Chile, and Brazil in not only calling for a ban on autonomous weapons systems but actively working for one. This addition to the mandate letter has definitely been noticed internationally, and states will be looking for a change in Canada's position at the United Nations. We will be watching closely to see how Global Affairs implements this instruction from the mandate letter. We will be looking to see whether Minister Champagne is working with his counterparts in National Defence; Innovation, Science and Industry; Public Safety; and Justice to formulate a strategy to bring Canada and the world towards a ban on autonomous weapons systems. Canadian diplomats will need the time and resources to make this ban a reality, but with support and political will it can be done in the next two to three years.
Next up, both Minister Champagne and Minister of National Defence, Harjit Sajjan, have instructions related to the women, peace and security agenda in their mandate letters. This is more great news for our work. Mines Action Canada knows that humanitarian disarmament and the women, peace and security agenda are closely linked. Better implementation of disarmament treaties like the Ottawa Treaty banning landmines protects girls and women in conflict affected areas while better implementation of the women, peace and security agenda increases women's participation in disarmament decision making resulting in better outcomes for us all.
Finally, there is a strong focus on the Sustainable Development Goals and the effectiveness of international assistance in the mandate letter for Minister of International Development, Karina Gould. That is important because there are significant links between the Sustainable Development Goals and disarmament, whether it is nuclear disarmament or clearance of landmines, cluster munitions and explosive remnants of war. Landmine clearance alone is linked to progress on 12 Sustainable Development Goals. The focus on effective international assistance is welcome because we know that supporting mine action (clearance of contaminated land and victim assistance) provides exceptional value for money. Landmines, cluster munitions and explosive remnants of war are lethal barriers to development, so support for mine action allows all other development work to happen. When land is cleared and survivors are assisted, communities can safely grow food, refugees and displaced persons can return home and trade can flow smoothly. A recent report showed that every dollar invested in mine action in Lebanon resulted in an economic benefit of $4.15. If Canada is looking for development projects that promote the Sustainable Development Goals and exemplify effective international assistance, mine action is the way to go. Plus, we would be finishing what Canada started in 1997 with the Ottawa Treaty banning landmines.
Based on these mandate letters, there is a lot of potential for Canada to resume its position as a champion of humanitarian disarmament and help make the world a safer place for us all. Let's hope the Ministers have the courage to see these commitments through.
Mines Action Canada's Program Manager addressed the Convention on Conventional Weapons today.
Statement of Mines Action Canada to CCW Meeting of High Contracting Parties
Thank you Chair. For more than two decades, Mines Action Canada has seen that CCW has the potential to create new international law but too often this body has lacked ambition and allowed a few states to hinder progress.
This week CCW will choose a path to the 2021 Review Conference. States need to ensure that this path is direct and efficient. We do not have the time to continue wandering aimlessly through the diplomatic woods unable to see the forest for the trees. CCW must live up to its potential and undertake real action to protect people from indiscriminate weapons. As this is the only time we will take the floor this week, I would like to speak to three topics.
We echo the calls by Human Rights Watch for high contracting parties to insist on dedicated time to discuss Protocol III in 2020. The ongoing use of incendiary weapons in Syria is abhorrent and must stop. The only possible response to the pain and suffering caused by these weapons is to strengthen the protocol and close the loopholes. We cannot stand idly by any longer.
Similarly on autonomous weapons systems, CCW is in danger of being caught standing still while technology advances in leaps and bounds. States need to focus on action not more discussion.
As a member of the Campaign to Stop Killer Robots, we are pleased to see autonomous weapon systems gaining importance at the national level. During the Canadian election this fall, two of the political parties pledged in their election platforms to work for a ban on autonomous weapons, including the Liberal Party, which will form the next government. We note with interest the Swedish foreign minister's recent comments in support of a ban on killer robots. This week the CCW needs to be as ambitious as our politicians. It is time to adopt a new CCW mandate to negotiate a legally binding instrument to prohibit lethal autonomous weapons systems and ensure meaningful human control over the use of force.
Finally, we would like to remind states that there is a robust framework of international law that applies to improvised explosive devices when they fit the definition of mines under the Ottawa Treaty and CCW AP II. Discussion of IEDs must be grounded in the existing international law.
The choices you make here this week will set a course for the next two years. Will you take a direct route towards peace and disarmament or will you continue to aimlessly wander through the diplomatic woods?
By Tyler Bloom
The 21st century is marked by technological advancements – like the internet – that have changed the way we live and have made the world more connected. As a result of these advancements, particularly with the internet, privacy laws have changed to adapt to the new norms in order to ensure that technology does not harm citizens. While previously we have had to create laws to adapt to new technologies, in the case of the development of autonomous weaponry it is important that we pre-emptively create laws to limit the negative impacts associated with these technologies. This requires lawmakers to have an in-depth understanding of the technologies and their implications, otherwise they will not be able to effectively legislate.
“How does [hateful information about me] show up on a 7-year-old’s iPhone?” asked Iowa representative Steve King (R) to Google CEO Sundar Pichai during his congressional testimony.
The quote above demonstrates a blatant lack of understanding of technology by a U.S. lawmaker. Representative Steve King, 69, asked the CEO of Google about an iPhone problem, even though the iPhone is a product of a different company: the epitome of not understanding how technology works. This fundamental lack of understanding is troubling. How can we trust lawmakers to create effective laws to limit the negative impacts of more advanced technology such as artificial intelligence (AI) when they don’t even understand basic technology like their cellphones?
The fact is, we have a role to play too. Modern issues require modern solutions, and it’s up to us to hold lawmakers accountable and ensure that they understand the implications of new technologies.
One such technology under development is fully autonomous systems. These systems could be used as weapons of war and for surveillance, policing, and border control, giving governments more control over the lives of citizens. Fully autonomous weapons could be drones and other vehicles programmed to target certain groups indiscriminately without human intervention. While these systems aren’t in use yet, it is urgent that we create legislation to limit their negative impact. If we do not, the ramifications of this technology would be widespread. Employing autonomous weapons systems increases the possibility that innocent civilians would be harmed during wartime and expands a government’s capacity to surveil its population. It is important that we maintain a human perspective in times of conflict to limit these ramifications.
The Campaign to Stop Killer Robots is an international coalition of more than 100 non-governmental organisations across 54 countries working to pre-emptively ban the use of fully autonomous weapons systems. As this technology is developed, international and national lawmakers must work to create a legal framework to ensure that its applications are limited.
This is where students come in. We have grown up in the age of technology and typically have a strong grasp of new technologies and their implications. We are the students of today, but we’ll also be the lawmakers of tomorrow. Because of this, it makes sense to involve us in decision making at this early stage, to ensure that a comprehensive legal framework is implemented for this new technology.
You may be wondering, “How can I get involved?” It’s simple! There are lots of ways that students can effectively get involved in spreading the word about the Campaign to Stop Killer Robots, many of which are enabled by modern technology.
Reaching out to your Member of Parliament (MP)/Representative: Writing letters can be a great way to make your MP aware of an issue that you are passionate about. Students can outline their concerns about this new technology and share them with their MP. A well-written letter or email forces the MP to take a critical look at the issue, and if enough people reach out, then they will realize the importance of the issue and act.
Social Media: In the 21st century, social media has been very effective in spreading messages and pushing through change. Media sites such as Twitter or Facebook are very useful tools in spreading information on the Campaign on a national level, which would help more students and citizens get involved. As a first step, you can follow the Campaign to Stop Killer Robots on Twitter at @BanKillerRobots, on Facebook or Instagram at @StopKillerRobots. You can also use the hashtags #KillerRobots #StudentsAgainstKillerRobots #AutonomousWeapons to get involved in the conversation! You can also follow the Campaign on YouTube!
On-Campus Events: What better way to get students engaged than by offering them food? Students who wish to get involved can get together and host a bake sale or a pizza party to raise awareness for the Campaign on campus. Longer term, this could grow into a network of on-campus Campaign representatives across Canada. These reps would promote the Campaign on campus through events, flyers, class engagements, or other activities. This would be an excellent method to ensure that hundreds of students have daily exposure to the Campaign.
The Campaign to Stop Killer Robots has been working hard to spread their message and to get support for a ban on fully autonomous weaponry. As students, we are the future of our world and are also well versed in technology, and as such, we should be involved in the decision making. We will make a difference in creating effective policy to ensure that this ever-evolving technology is properly regulated. If you would like more information please do not hesitate to reach out to the Campaign on social media!
Tyler Bloom interned at Mines Action Canada as the Campaign Assistant for the Campaign to Stop Killer Robots between January-April 2019. Tyler is an undergraduate student studying Political Science at the University of Ottawa.