Debate about online harms has tended to focus on abusive and hateful content. But the form in which content is delivered is at least as important. That point is central to this week's momentous decisions against Meta and YouTube by two US juries. It will take more than these cases to loosen big tech's tight grip on much of the world's attention. But the fact that both companies were found liable in California for deliberately designing addictive products that harmed a child is a massive win for the coalition of campaigners aiming to use the US courts to force the platforms to change their products.
The second case against Meta, in New Mexico, found it liable over the use of Facebook and Instagram for child sex trafficking, with a Guardian investigation cited in the complaint. The jury ordered it to pay $375m in civil penalties; the state's attorney general is seeking platform changes and further financial sanctions.
Both verdicts are expected to be appealed. But the acceptance by juries of evidence about the damage caused by these businesses, much of it derived from internal documents, reveals shifting attitudes. Documents exposing executives’ shockingly cavalier approach to young people’s safety are now in the public square, and will help the industry’s critics in future. One email from a Meta employee said “targetting [sic] 11 year olds feels like tobacco companies a couple decades ago”.
It is too soon to declare the current push towards stronger regulation to be something like the reckoning faced by big tobacco in the 1990s. If there is to be a meaningful pushback from governments and civil society against the tech companies’ colossal sway over our lives, it is still in its earliest stages. One danger is that the pace of digital innovation outstrips the capacity of legislators to keep up – an especially chilling thought given the AI revolution that is under way. Democratic deliberation takes time. Digital capitalists don’t wait.
What has become increasingly clear is that a precautionary approach to children ought to have been adopted. Young minds are malleable and the attention economy’s assault on them has been unforgivably cynical.
Fortunately, and as governments and now courts are showing, it is not too late. Social media companies can be forced to take responsibility for their impact on public health, specifically mental wellbeing and relationships. In Australia, they have been told to leave children alone. In the UK, the government has issued belated guidance on screen time, and is considering restricting children's use of social media.
Design features such as infinite scroll and gaming-type rewards have attracted less attention than disturbing and damaging content, but they are why so many of us find it hard to put down our devices – and why vulnerable young people can get caught in toxic spirals.
Social media companies are not the first businesses to compete for human attention. For billions of people, their tools have become valued necessities. But as Cory Doctorow argues in a recent book, the degree of control exercised by the biggest platforms is unprecedented. It will take a whole-society approach to reduce our dependence on them, and to work out what safeguards are needed by adults as well as children. The events of the past week make that a bit more likely.