Since their inception, social media platforms have faced considerable controversy over their nature, use and accountability, as well as public perception, harm and abuse. March 2026 ended with a piece of news that could completely redefine them from a legal perspective and perhaps, in future, a legislative one: it is not their users who are under scrutiny, but their very creators, designers, CEOs, technicians and philanthropists, who are alleged to have developed them with the intention of causing harm to the most vulnerable and defenceless group: minors. And this is happening not in Europe, but in America.
On 25 March in Los Angeles, after just a few hours of deliberation, a jury ordered Meta (Facebook, Instagram and WhatsApp) and Google (YouTube) to pay three million dollars in compensation to a 20-year-old woman for anxiety and depression caused by social media addiction. K.G.M., the user’s initials, had started using the two platforms between the ages of 6 and 9, subsequently experiencing mental health issues. Although for the two companies this is a paltry sum compared to their market capitalisation and annual investments, the symbolic value appears enormous: this is the first significant ruling in an individual case in which a US court recognises that the design of social media can pose a risk to minors, thereby paving the way for new legal liabilities.
The allegations relate to the infinite scroll feature, the recommendation system and the continuous notification mechanisms, which have been identified as tools designed to prolong usage, including among minors.
Whilst the two tech giants have already made it clear that they disagree with the verdict and intend to appeal, a number of commentators and industry experts have shifted the blame onto the parents for allowing a child of such a young age to use the platforms. Yet the jury itself, faced with the same arguments put forward by the lawyers for the two platforms, remained unconvinced in light of internal documents presented by the girl’s lawyers. These documents reportedly show that both companies monitored the behaviour of young users, exploring ways to make them more active and loyal. Among the most hotly debated moments of the trial was the testimony of Meta’s CEO, Mark Zuckerberg, who was called upon to explain the decision to remove certain temporary restrictions on filters and features considered critical for teenagers. Zuckerberg defended the decision in the name of ‘users’ freedom of expression’.
Whilst the US Congress has not yet managed to pass comprehensive legislation on the matter, at least 20 US states have introduced various regulations governing minors’ use of social media, imposing, amongst other measures, restrictions on smartphone use in schools and mandatory age verification for account creation – measures largely contested by organisations such as NetChoice — an association also supported by companies such as Meta and Google — which consider them to be in breach of constitutional rights to privacy and freedom of expression.
Europeans have set the standard in the regulatory sphere, launching the Digital Services Act (DSA) in 2022, which came into full force in 2024 and indirectly impacts design by imposing obligations to assess and mitigate risks. And last year, the European Parliament tabled a motion proposing a minimum age of 16 for using social media without parental consent.
Meanwhile, several European countries have already taken action – Spain is reportedly following suit by introducing a ban on users under the age of 16, and France has initiated the legislative process for a bill that would prohibit the use of such platforms by children under the age of 15 in the country of Liberty.
What is Italy doing about this? There was a first proposal in 2023 to regulate minors’ use of social media platforms, put forward by the Azione party; then a further initiative launched in 2024 by the Democratic Party and Fratelli d’Italia; and finally the so-called ‘Social Bill’, Bill No. 1136 (Provisions for the protection of minors in the digital sphere), presented last October by Fratelli d’Italia and subsequently blocked over issues involving the Data Protection Authority. Then, a week after the Californian case, the Democratic Party tabled a new bill that reverses liability by shifting the burden of proof: it will not be the user but the platform itself that must demonstrate it has acted correctly, by proving it has taken all appropriate measures to prevent harm. If it fails to do so, it pays. Furthermore, trade secrets will not be enforceable: a judge will be able to order the platform to hand over the technical documentation of its algorithms.
Last week, Moige (the Italian Parents’ Movement), together with ANFN (the National Association of Large Families) and AGE (the Parents’ Association), alongside experts on digital addiction and children, called on the Government and Parliament to set the minimum age for accessing social media at 16, raising the threshold from the current age of 14 set by the Italian Data Protection Code. The request is based on the GDPR, which already sets 16 as the age for autonomous digital consent, and is in line with decisions already made or announced by Australia, Malaysia, Spain, New Zealand and Indonesia. Furthermore, it calls for the inclusion of “full civil and criminal liability for social media platforms regarding the commission or receipt of actions harmful to the psychological and physical well-being of minors aged between 16 and 18”.
“Meanwhile,” the associations point out, “the first European injunction class action, brought by Moige with the support of Aafe, Anfn and the Forum delle Famiglie, is pending against Meta and TikTok’s social media platforms: the Milan Commercial Court has set the first hearing for 14 May 2026.”
The campaign calls on platforms to: verify users’ ages and block access to minors under the age limit; remove infinite scrolling, intrusive notifications and addictive algorithms; and ensure transparent information on the risks to minors’ mental health.
On Wednesday, during a press conference, the President of the European Commission, Ursula von der Leyen, stated that the European age verification app is ready and that there are now “no more excuses”. It will now be up to the Member States to adopt the app.
Among the various technical, administrative, legal and regulatory issues surrounding the case involving Meta and Google – and all similar platforms – such as whether responsibility lies solely with parents and whether the measures infringe on freedom of expression and privacy, the focus of the debate has increasingly shifted towards the harm caused to the victims – minors – thereby shifting the discussion from individual responsibility to systemic responsibility.
That is why, at Startupbusiness, we decided to ask psychiatrist Tonino Cantelmi – currently president of the Italian Institute of Cognitive-Interpersonal Therapy – a few questions.
Is there such a thing as algorithmic influence and addiction, and could it be comparable to traditional addictions?
Digital technology strongly stimulates the limbic brain and the nucleus accumbens. This triggers the release of dopamine, pleasure and addiction. I have drafted a pro-veritate expert opinion which forms the basis of a lawsuit brought in Italy by parents’ associations against Meta and TikTok. The first hearing will take place in May. This case marks a turning point in Italy: it represents a realisation that children’s use of technology must be regulated and that we cannot remain indifferent or superficial.
What psychological mechanisms make children more vulnerable to algorithms than adults?
Children’s brains are still developing: brain maturation is not complete until after the age of 20, but childhood is a phase of extreme plasticity and significant vulnerability. Early and pervasive exposure to technology is linked to an imbalance in brain activity, with a predominance of the limbic system and rapid, intense activity. Ultimately, this means predisposing our children’s brains to a massive tendency towards addiction.
In Italy, some people are suggesting that parents should be fined as a solution. Should the responsibility lie solely with the parents? How much responsibility do they actually bear in such cases?
Placing the burden on families is dangerous: we need to regulate children’s access to social media and support families. I would just like to highlight one statistic: the age of first access to pornography has now fallen below eleven, and the biggest consumers of pornography are minors. This leads to a phenomenon known as ‘early sexualisation of childhood’, which is linked to many psychopathological issues. Users must be identifiable, and this is achievable. Access to content must be regulated so that it is appropriate to the user’s age.
The Californian ruling is not the first such case in the US – there are several precedents – but it is the first trial to result in a conviction for ‘social media addiction’, with liability attributed to the design of the platforms rather than their content: it circumvents US legal protection (Section 230) because it targets the product itself, not what users post. In Europe, however, there have not yet been any similar convictions against social media platforms for ‘addiction’, nor any case comparable to the Californian one; European courts have not yet recognised ‘social media addiction’ as an independent basis for civil liability. The technical and civil debate therefore remains open, with measures still to be adopted and associations pressing for a concrete and definitive solution.
ALL RIGHTS RESERVED ©