Picture this: you collapse on the couch after a long day, and open Instagram or TikTok. Finally, you can flick through your feed for a few minutes before bed. Before you know it, three hours have gone by and you’re still on your couch, staring at your phone and feeling a little worse about yourself and your life. Now imagine you are 14 and this is what you do each evening.
In what has been hailed as a bellwether case, a Los Angeles jury on March 25 decided against Meta (Facebook’s parent company) and Google (YouTube’s parent company), awarding damages of $3m to a young woman who has suffered from social media addiction since childhood. This comes shortly after a jury in New Mexico returned a $375m verdict against Meta in a lawsuit that alleged the company knew its products were harmful to children.
The verdict matters in South Africa because the case relied on a legal route that may also be available locally. In the US, the plaintiff’s case was framed not principally as a dispute about speech or content, but as a product-liability claim: the argument was that social-media platforms are products, that they are defective because they are designed to be addictive, and that the resulting addiction causes harm to users, particularly children.
Retrospective regulation
Among experts and parents, concerns about children’s use of social media have been growing for years. Social psychologist Jonathan Haidt spells these out in his bestselling book, The Anxious Generation. He attributes sharp increases in the rates of mental illness among teens in America to social media use. His findings underpinned the Australian government’s decision to ban social media for under-16s – a move that is expected to be replicated in jurisdictions across the world. The conclusions in Haidt’s book also seem to prove what we instinctively feel is true: social media is not good for us or our children.
Public opinion and a growing body of scientific research pose an issue for lawmakers: how are governments to retrospectively regulate the social media giants?
So far, little has worked. Legislative carve-outs mean that, unlike more traditional media, the platforms cannot be held liable for content published (even if it is harmful). Negative press attention also does not seem to stick. This is unsurprising, given the resources available to these companies, coupled with the fact that they own an entire media ecosystem in which they can control the message about the harm their products inflict.
Against this backdrop, the decision by the jury in the matter – KGM vs Meta Inc and others – is all the more surprising.
Addictive on purpose?
The decision is being lauded as a turning point because of the novel product-liability claim. This brought the case within the ambit of consumer-protection legislation.
The evidence presented in support of the argument is as compelling as it is distressing. To highlight a few allegations:
- Social media platforms use the same techniques as slot machines to keep users engaged. The apps manipulate dopamine release through features like the infinite scroll function and the ability to receive “likes” for the content posted.
- The apps purposefully promote negative social comparison, which leaves users feeling worse about themselves and their own lives.
- The endless barrage of notifications encourages repetitive account checking.
- The apps exploit social reciprocity to keep users engaged (for example, Instagram notifies a sender when a message has been read, which pressures the recipient to respond as soon as possible and keeps both parties engaged on the app).
- Inadequate age-verification protocols and insufficient parental controls leave children vulnerable to exploitation.
The plaintiff alleged that these practices make the apps deliberately addictive, and open children up to exploitation and abuse. In the case of Meta, there is evidence that the company knew these features were harmful to children but decided not to address the risks.
Will it work in South Africa?
While both YouTube and Meta have indicated they will appeal the decision, the case has significant implications globally. The reasoning in this matter could snowball and be replicated in all jurisdictions with similar consumer-protection legislation.
In South Africa, a case could be made that child users of social media who have suffered harm are entitled to compensation under the Consumer Protection Act of 2008 (CPA). Section 61 provides that a producer, importer, distributor or retailer of goods may be liable for harm caused wholly or partly by unsafe goods, a product failure, defect or hazard, or inadequate instructions or warnings. This liability applies irrespective of negligence.
The same section states that compensable harm includes death, injury, illness, damage to property and economic loss flowing from that harm. That means South African law already recognises a strict-liability route for damage caused by defective goods.
The relevance to social media lies in the CPA’s definitions. The CPA defines “goods” broadly, expressly including data, software, code and other intangible products encoded on a medium, as well as a licence to use such a product.
It also defines “services” broadly, including the provision of information and access to facilities or benefits. That creates at least a plausible statutory basis for arguing that a downloaded social media app, or access to it, falls within the kinds of products or consumer offerings regulated by the CPA.
The legal question would not be whether the act contains a product-liability section – it plainly does – but whether a court would accept that a social media platform, or its app-based software, is a “good” under section 61.
Not so simple
The act has a relatively wide territorial reach. It applies to transactions and the promotion of goods or services in South Africa, irrespective of whether the supplier resides or has its principal office inside or outside the country. That is important because the major social media platforms are foreign-based companies that market their products and services directly to South African users.
It is also not limited to individual claims. Section 4 allows a person to approach a court, the consumer tribunal or the consumer commission on their own behalf, on behalf of a person who cannot act in their own name, as a member of or in the interest of a group or class of affected persons, or, with leave of the court, in the public interest. The act further instructs courts and the tribunal to prefer interpretations that best promote the act’s purposes and consumer rights, and it expressly allows consideration of appropriate foreign and international law. That means a local court would be permitted to look at the reasoning developed in foreign litigation, though it would not be bound by it.
Still, the case wouldn’t be straightforward. A claimant would still have to establish that the platform or app qualifies as “goods” under section 61, or that its provision of services brings it within section 61 through the supply of goods in conjunction with those services.
There would also be questions about whether use of a free platform amounts to a “transaction”, as the statutory definition of a transaction generally turns on supply for consideration. And, as in any product-liability case, a claimant would need to prove causation: that the design features caused legally recognisable harm to a particular user.
Forcing change
The South African position, then, is not that social media addiction claims are already clearly established under local law. It is that South Africa already has consumer-protection and product-liability legislation broad enough to make the argument legally conceivable. The key statutory building blocks exist: a strict-liability provision, broad definitions of goods and services, application to conduct in South Africa even by foreign suppliers, and procedural room for class- or public-interest litigation. Whether that framework can be extended successfully to social media addiction would depend on how a court resolves the classification, consideration and causation questions in a concrete case.
And, while one $3m award is a rounding error to the social media giants, several hundred thousand such claims globally may not be.
The threat of this type of litigation may also force social media companies globally to address these issues in a meaningful way instead of paying lip-service to parental concerns. The addictive nature of the apps must be tempered, and proper age verification protocols must be adopted. The tech companies must address the risks posed by sexual predators on the apps, and eliminate the creation and sharing of child sexual abuse material.
With the advance of AI and the trillions of dollars being spent on it, this is surely not too much to demand from the providers of a dangerous product.
Kat Gersbach is a South African lawyer living and working in Dublin, Ireland. She writes about law, technology and business.
