Technology is often seen as the engine of social change. But this ignores the cultural forces and changes that enable technological shifts, as well as the fact that technology is often used to preserve the status quo, rather than usher in change, argues Lelia Green.
We have just experienced one of the most significant social upheavals in living memory. The COVID-19 pandemic made us review everything we took for granted: personal freedoms, community engagement, work, leisure, shopping, travel. It struck at the heart of consumer society, and it demanded to be taken seriously. And we changed: quickly, and dramatically.
In the rich countries of the global north and its wealthy, educated, privileged outcrops, we have the luxury of looking back on the worst of the pandemic. There’s a sense that normal, pre-COVID life is resuming, but the truth is we all experienced a huge cultural shift. Our societies changed, culturally, with unexpected speed. Some would argue that it was technology that enabled the cultural shift that COVID required, as well as our slow return to “normal life”. From the means of detecting and diagnosing the disease, through to treatment options and on to vaccines, the north’s technological advantages and its capacity for production at scale were crucial. Digital connectivity expanded at lightning speed, creating new ways for people to connect in virtual groups; contactless shopping became a thing and didn’t blunt the desire to consume. But is technology really the driver that makes it all happen? With over thirty years as a student of the interplay of technology and society to draw on, I say: ‘no’. Technology is only ever a second-order factor; culture is always the key to change.
___
Making a technologically determinist statement, such as ‘Computers have changed the world’, misses the point that it was cultural forces at work in the world that resulted in computers.
___
Back in the 1980s, like other researchers in those days, I had access to what we thought of as a state-of-the-art personal computer. The story of the PC can be seen as the paradigmatic case of a piece of technology changing culture forever. Alan Michael Sugar’s electronics trading company, Amstrad, which he set up in 1968, is usually pointed to as a key part of the story of how the computer became a consumer good. This narrative ties in with the myth of the brilliant visionary who changed the world by imagining an industrial machine as having domestic use. But beginning the story at the domestication of computing sidesteps the entire cultural zeitgeist that underpinned the information revolution. It ignores the shift in social and economic priorities which accompanied the huge investments in technology during and after the Second World War.
The political concerns of the late 1940s and 1950s directly funded a Cold War arsenal and a space race between the USA and the Soviet Union. Together, these powers had defeated Nazi Germany and Imperial Japan, but in doing so they had become suspicious of each other. Their cultural anxieties gave rise to an unprecedented scale of technological investment. With growing experience of computing, and with knowledge of game-changing wartime interventions such as Alan Turing’s leadership in cracking the Enigma code, the US Department of Defense supported experiments to get huge, stand-alone computers ‘talking’ to each other.