Meta recently faced significant legal challenges, with two major court decisions that could have widespread implications for the tech giant. In New Mexico, a historic ruling held Meta liable for compromising child safety and ordered the company to pay up to $375 million for violations of the state's Unfair Practices Act. Shortly afterwards, a Los Angeles jury found that Meta had designed its apps to be addictive to minors; in the case brought by a plaintiff known as K.G.M., Meta and YouTube were found jointly liable for $6 million.
These rulings could unleash a surge of lawsuits against Meta, as numerous similar cases, backed by 40 state attorneys general, are currently in litigation. Unlike earlier claims that foundered on protections shielding social media platforms from responsibility for user content, these lawsuits target design elements, such as persistent notifications and endless scrolling, that are argued to drive addiction and mental health problems among users, particularly teens.
Legal experts, including digital media lawyer Allison Fitzpatrick, suggest that this shift in focus, akin to tactics used against the tobacco industry, may create a compelling legal strategy. The outcome of these cases points to a monumental shift in how tech companies can be held accountable.
Internal Meta documents released during the trials indicate the company was aware of the adverse effects of its platforms on young users yet pursued strategies to captivate teenage attention, often at the cost of their mental well-being. Previous reports have highlighted that around 12.5% of users engage with the platform in ways flagged as problematic, with evidence showing that Facebook's impact on well-being tends to be negative.
Meta maintains that the recent verdicts oversimplify the complexities of teen mental health, with a spokesperson arguing that teens benefit from online communities. However, critics point to internal communications displaying a culture focused on maximising user engagement, including references to achieving increased usage even in school settings.
Reactions to the cases have sparked discussions about children’s online safety regulations, particularly in light of past revelations from whistleblower Frances Haugen. While legislative proposals are underway to strengthen protections, there are concerns over potential resultant censorship and limitations placed on states and parents addressing these issues.
Kelly Stonelake, a former Meta employee and current critic of the Kids Online Safety Act, argues that legislative efforts must respect state rights and provide true solutions rather than merely addressing surface-level concerns. As the legal landscape continues to evolve, these cases could signal the beginning of major changes in how children’s online safety is prioritised and enforced.