With technology developing faster than anyone can publish research into its harms, and with children spending hours online each day, we are leaving young people completely unprotected from a tech industry whose products are designed to be addictive and remain entirely unregulated. How much gold-standard evidence do we need before we act? asks Bernadka Dubicka.
Earlier this year I took part in an IAI debate about social media and addiction, speaking as a child psychiatrist. As happened with Covid-19, and previously with the climate debate, the academic arguments rage on while the research lags far behind the growth of technology. In the years it takes to complete a study, the tech industry will have moved on to the next generation of platforms, apps, and young users. The research is out of date almost before it has started.
How long do we wait for sufficient gold-standard evidence of the harms of tech on children before we act? Ten years? Twenty? In the meantime, do we believe that tech is a force for good for children or a cause of harm? That depends on which camp you stand in. And if you have a foot in each camp, this is not a dichotomous argument: we want our kids to benefit from technology but have many valid concerns. Recently the WHO invoked the precautionary principle to advise no screen use for babies and minimal use for toddlers, on the basis of emerging evidence of harm, and because the potential damage to development outweighed any benefits.
If we do not have gold-standard academic data, where do we start to assess the exponential growth of tech and its impact? How about with the tech execs themselves? What advice do tech execs give to their own children? Bill Gates insisted on no smartphones for his kids until they were 14, with limits on screen time; Snapchat founder Evan Spiegel limited screen time to 1.5 hours per week (for comparison, UK kids spend about two hours a day online, plus another couple of hours watching TV); Steve Jobs imposed evening screen-time bans and allowed no iPad. The list goes on.
They clearly see reasons to limit the use of tech. So, what do they know that we do not? And why can't they share their reasons? This would be especially valuable for the most vulnerable children, those living in poverty whose parents do not have the time or knowledge to monitor their online lives. If tech were a force for good that promotes healthy child development, there would be no need for such draconian measures. Or do you sense a whiff of hypocrisy?
In an area with many unanswered questions, these executives are precisely the ones who could give us some answers. Some of them already have. Jennifer Zhu Scott, a tech company investor, TED speaker, and big data expert, has devised a family contract to teach her children about being online: nothing is private online, what goes online stays online, their personal data is their most valuable asset, and many free apps are free because they take and sell personal data. In her words, 'a cell phone is more than a piece of technology. If used wrongly, it can be a weapon that puts your safety or future reputation at risk'.
We need more people with this knowledge to speak openly so that we can ensure digital rights and safety for our children. Tech companies are international, so this cannot be achieved without cooperation between governments. Here is even more hypocrisy. In the UK, as in many other countries, there are legal safeguarding standards for children. But online anything goes, as if it were a parallel universe. Offline, we have 9 o'clock watersheds, cinema age ratings, age restrictions on buying cigarettes, alcohol and porn, no gambling for under-18s, child actor legislation, and the list goes on... Online, who's checking?
Most parents do not have a clue about their child's online life, especially in the early hours of the morning, when children are scrolling through their likes and following recommendations that, by serving up more of the same, compound whatever messages they receive, whether that is suicide content, extreme dieting, loot boxes or porn. We have laws against assisted suicide, but just a few clicks can show a vulnerable child the best way of taking their own life.
The UK government has taken the initiative and produced a white paper on online harms, proposing an independent regulator. It is a step in the right direction, but the proposals are limited. They will not touch the big companies, and they do nothing to address persuasive design, the cornerstone of their business model. Persuasive design aims to get our children to look at more and more content, however pernicious, and is driven by algorithms engineered to be addictive. For the tech companies, more views equal more money.
Why on earth would they do anything different? Tech is the only big industry that is not regulated. Imagine having no pharmaceutical regulation: companies would sit back and watch exponential profits from the sale of addictive drugs, giving no information on side effects, with no transparency and no accountability. Some argue that this is still the case with pharma companies. But although their regulation may not be perfect, in the UK we have the MHRA, the US has the FDA, and other countries have similar organisations to scrutinise new drugs and monitor the industry.
It certainly looks like tech companies want us to have less knowledge of, and less control over, our content and personal data. Default privacy settings are set to the lowest level possible rather than the highest. If parents want to block adult content or change privacy settings, it is made purposefully difficult. It is similarly difficult to take content down: a child's digital footprint is there for life. Remember what you used to do as a teenager. Would you still want the world to know about it?
It took decades to impose any regulation on the tobacco industry, even though the harms were known, and tobacco regulation still does not exist in all countries. Companies continue to profit from young children in low-income countries: get them addicted young, ensure the habit of a lifetime, and so secure enduring profits. It is estimated that one young life is worth $10 000 to the tobacco industry. How much is a child's life worth to the tech companies? We know that children and adolescents are impulsive, less able to self-regulate, and much more vulnerable to instant gratification and reward. This is perfect fodder for any industry built on developing habitual behaviours, especially because we also know that habitual behaviours, once established, are not easy to change.
This takes me back to beliefs. Of course, there is a myriad of benefits from technology. That is not in dispute. But the oft-cited belief that social media is just 'the new rock n' roll', creating a moral panic as new music did in earlier generations, is a myth that grossly underestimates the reach of tech into every aspect of our children's lives. It is shaping young brains and affecting development, now and into the future. Rock n' roll was music; it did not take personal data without consent, nor did it use that data to shape behaviours, employing artificial intelligence that is developing exponentially.
Let’s hope that reason prevails and that these small, regulatory steps in the UK and elsewhere will lead to a regulatory framework that will give our children both digital rights and safety. We need a tech-literate society with equality of access, but our children need to be in charge of their digital lives and futures, not the tech giants.