AI isn't a dual-use technology; it is inherently violent

AI erases the line between civilian and military life

When the Pentagon branded Anthropic CEO Dario Amodei “a liar with a god complex” over fears that his company’s AI could be used for weapons and surveillance, it exposed a deeper truth: the boundary between civilian and military technology no longer exists. The same systems that power translation, logistics, and digital assistants can just as easily identify targets or manipulate populations. Thomas Christian Bächle and Jascha Bareis argue that today’s AI is not simply “dual use” — it is inherently violent in design. Adaptive, autonomous, and globally networked, these machines fuse daily life with geopolitics, making peace itself a fading abstraction.
Drones have become an uncanny threat, not least given the cost in human life and the scale of suffering and destruction they have inflicted in Russia's war on Ukraine. In many European countries they have been sighted near critical infrastructure or military sites, used for reconnaissance or sabotage, at times causing major disruptions to civilian air travel. Drones unsettle a population fearful and wary of the brutality of war at its doorstep. They have become a major element of what is labelled hybrid warfare, fought beyond the conventional means of violence.

But this is not the whole picture. For years, drones have also been envisioned as a technology with the potential to bring about major changes for the better: more efficient disaster relief, medical supply chains reaching even the remotest areas, optimized logistics and transportation. Drones have also introduced a new bird's-eye aesthetic, a new way of seeing the world.

This ambiguity is characteristic of any technology. Technology is never distinct from social processes; it shapes and is shaped by them, and the two are inextricably linked. Drones are no different in this regard. What makes them stand out at present, however, is that they are emblematic of what elsewhere we have called the realities of autonomous weapons: they symbolize a complex mix of meanings, a diffuse idea of destructive potential combined with a presumed human-like agency and artificial intelligence (AI). These realities blend existing military technologies with visions of their future capabilities, and in so doing point to the interplay between fact and fiction, actual developments and creative imagination.

___

Digital technologies are bound—even designed—to defy regulatory and ethical stances.

___

This piece argues that recent technological developments always carry within them latent capacities to inflict harm, violence, and aggression. Yes, technology has never been neutral; and, yes, these technologies can also serve benevolent purposes in society, contributing for example to medicine, research, industry, or education. But digital, networked, and AI-enabled technologies carry a particularly harmful and violent potential, for at least three reasons.
