AI isn't a dual-use technology; it is inherently violent

AI erases the line between civilian and military life


When the Pentagon branded Anthropic CEO Dario Amodei “a liar with a god complex” over fears that his company’s AI could be used for weapons and surveillance, it exposed a deeper truth: the boundary between civilian and military technology no longer exists. The same systems that power translation, logistics, and digital assistants can just as easily identify targets or manipulate populations. Thomas Christian Bächle and Jascha Bareis argue that today’s AI is not simply “dual use” — it is inherently violent in design. Adaptive, autonomous, and globally networked, these machines fuse daily life with geopolitics, making peace itself a fading abstraction.


Drones have become an uncanny threat, not least given the cost in human life and the suffering and destruction they have inflicted in Russia's war on Ukraine. In many European countries they have been sighted near critical infrastructure or military sites, used for reconnaissance or sabotage and at times causing major disruptions to civilian air travel. Drones unsettle a population that is fearful and wary of the brutality of war at its doorstep. They have become a major element of what is labelled hybrid warfare, fought beyond the conventional means of violence.

But this is not the whole picture. For years, drones have also been envisioned as a technology with the potential to bring about major changes for the better: more efficient disaster relief, medical supply chains that reach even the remotest areas, optimized logistics and transportation. Drones have also introduced a new way of seeing the world, a bird's-eye aesthetic.

This ambiguity is characteristic of any technology. Technology is never distinct from social processes; it shapes them and is shaped by them, and the two are inextricably linked. Drones are no different in this regard. What makes them stand out at present, however, is that they are emblematic of what we have elsewhere called the realities of autonomous weapons: they symbolize a complex mix of meanings, a diffuse idea of destructive potential combined with presumed human-like agency and artificial intelligence (AI). These realities blend existing military technologies with visions of their future capabilities, and in doing so point to the interplay between fact and fiction, between actual developments and creative imagination.

___

Digital technologies are bound—even designed—to defy regulatory and ethical stances.

___

This piece argues that recent technological developments always carry within them a latent capacity to inflict harm, violence, and aggression. Yes, technology has never been neutral; and, yes, these technologies can also serve benevolent purposes in a society, contributing, for example, to medicine, research, industry, or education. Still, digital, networked, and AI-enabled technologies bear a particularly harmful and violent potential, for at least three reasons.


Join the conversation

Brian Balke 6 March 2026

In the case of military conflict, what we should contemplate is that "winning a war" implies goals. If not supplied a coherent statement of goals (such as "why did we attack Iran?"), the AI engine must supply its own. What proponents of this technology should contemplate is that THEY may be identified as part of the problem. In other words, those initiating the application of violence may be eliminated as a precondition for ending the violence.

As for the commercial conflict, I have long been repelled by the practice of the psychopharmacology industry to defend classification of their "medications" as life-saving interventions similar to cancer drugs, allowing people suffering from mental discomfort to be subjected to the most aggressive side-effects. This is coupled with the addictiveness and lack of therapeutic mechanism that prevents the FDA from allowing them to be marketed as "medications."

Similar behavior is seen from those who have poisoned the world with PFAS and who insist on subsidies for the fossil fuels that drive global climate change. AI is merely the visible bugaboo; it is human greed that is the universal seed of violence.
