War, AI/ML & Big data
I contributed to the following snippet of text, which was part of a longer read published in the BSC Earth Science Equity Newsletter in March 2024 and is also available here.
War and AI
The role of science and new technologies should also be considered when discussing the impacts of war on society and on social inequity. In particular, AI (Artificial Intelligence) is increasingly being developed and adapted for military purposes, where its capabilities are being harnessed for autonomous targeting and engagement systems.
These developments raise significant ethical, legal, and security concerns, primarily because these systems can make lethal decisions without human intervention [15, 16]. The issue is further magnified by the fact that while these autonomous systems are programmed to follow rules strictly, they cannot make moral judgments, even when the laws of war themselves might be ethically questionable.
Such autonomous systems are already being developed and tested in military applications, transforming modern warfare [17]. Examples include robots with automatic machine guns featuring target recognition and engagement capabilities [18], autonomous drones, robot dogs carrying machine guns and explosives [19], AI-controlled machine guns used in assassinations [20], and Israel’s deployment of AI-based rapid target identification systems in the Gaza Strip [21, 22].
The ethical and legal complexities of AI in warfare demand a robust international framework focused on accountability, meaningful human oversight, and responsible development of autonomous weapons systems (AWS) [23]. The scientific community must play an active role in these discussions, as researchers and developers are essential contributors and enablers of this expanding application of AI. Many researchers at the forefront of these advancements are concerned and actively sharing their worries [24, 25].
Our expertise is vital to ensure these weapons are developed with clear ethical guidelines and rigorous safeguards. Through international collaboration among scientists, policymakers, and ethicists, and with adequate vigilance, we can ensure AI technologies are deployed responsibly and ethically, balancing innovation with global peace and security imperatives.
Recommended reads:
Gendered impacts of armed conflict and implications for the application of international humanitarian law (https://blogs.icrc.org/law-and-policy/2022/06/30/gendered-impacts-of-armed-conflict-and-implications-for-the-application-of-ihl/)
Sex and drone strikes: gender and identity in targeting and casualty analysis (https://www.reachingcriticalwill.org/images/documents/Publications/sex-and-drone-strikes.pdf)
Stop Killer Robots (https://www.stopkillerrobots.org/)
Israel’s Killer AIs (https://stopkiller.ai/)
The Era of Killer Robots Is Here (https://www.nytimes.com/2024/07/09/podcasts/the-daily/the-era-of-killer-robots-is-here.html)
Science and war (1983) (https://documents.uow.edu.au/~/bmartin/pubs/83Birch.html)
Science and war (2014) (https://www.theguardian.com/science/life-and-physics/2014/jul/26/science-and-war)
References:
[15] Christie, E.H., Ertan, A., Adomaitis, L. et al. (2024) Regulating lethal autonomous weapon systems: exploring the challenges of explainability and traceability, AI Ethics 4, 229–245. https://doi.org/10.1007/s43681-023-00261-0
[16] Reichberg, G.M., Syse, H. (2021). Applying AI on the Battlefield: The Ethical Debates. In: von Braun, J., S. Archer, M., Reichberg, G.M., Sánchez Sorondo, M. (eds) Robotics, AI, and Humanity. Springer, Cham. https://doi.org/10.1007/978-3-030-54173-6_12
[17] Ashby, H. “From Gaza to Ukraine, AI is Transforming War”, Inkstick, March 6th, 2024 https://inkstickmedia.com/from-gaza-to-ukraine-ai-is-transforming-war/
[18] Kumagai, J. A Robotic Sentry For Korea’s Demilitarized Zone, IEEE Spectrum, March 1st, 2007 https://spectrum.ieee.org/a-robotic-sentry-for-koreas-demilitarized-zone
[19] Vincent, J. They’re putting guns on robot dogs now, The Verge, October 14th, 2021 https://www.theverge.com/2021/10/14/22726111/robot-dogs-with-guns-sword-international-ghost-robotics
[20] Mohsen, F. ‘Machine-gun with AI’ used to kill Iran scientist, BBC, December 7th, 2020 https://www.bbc.com/news/world-middle-east-55214359
[21] Davies, H., McKernan, B. and Sabbagh, D. ‘The Gospel’: how Israel uses AI to select bombing targets in Gaza, The Guardian, December 1st, 2023 https://www.theguardian.com/world/2023/dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets
[22] Abraham, Y. ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza, +972 Magazine, April 3rd, 2024 https://www.972mag.com/lavender-ai-israeli-army-gaza/
[23] Adam, D. (2024) Lethal AI weapons are here: how can we control them?, Nature 629, 521–523. https://doi.org/10.1038/d41586-024-01029-0
[24] We work for Google. Our employer shouldn’t be in the business of war: Open letter signed by Google employees, The Guardian, April 5th, 2018 https://www.theguardian.com/commentisfree/2018/apr/04/google-ceo-drones-ai-war-surveillance
[25] Roose, K., The Shift: OpenAI Insiders Warn Of a ‘Reckless’ Race for Dominance, The New York Times, June 5th, 2024 https://www.nytimes.com/2024/06/04/technology/openai-culture-whistleblowers.html