Meta, the parent company of the popular social media platforms Instagram, Facebook, and WhatsApp, has been found liable by a jury in New Mexico for compromising children’s safety online. The ruling, announced on Tuesday, orders Meta to pay $375 million in damages for violating New Mexico’s Unfair Practices Act.
The decision is a major blow to Meta, which not only must pay a hefty sum in damages but also faces a serious dent in its reputation. The Unfair Practices Act, which prohibits unfair, deceptive, and misleading business practices, aims to protect consumers from companies that prioritize profit over their customers’ safety and well-being.
The ruling against Meta follows a long-standing concern about the safety of children on social media platforms. With the rise of cyberbullying and online predators, parents have become increasingly worried about their children’s online activities. And with more children gaining access to social media at a younger age, the need for stricter safety measures has become crucial.
The case against Meta was brought by the New Mexico Attorney General’s Office, which accused the tech giant of violating the Unfair Practices Act by not doing enough to protect children on its platforms. The jury found that Meta’s lax approach to safeguarding children had put them at risk, and it held the company responsible for the resulting harm.
The ruling sends a clear message to all social media companies that they must take responsibility for the safety of their users, especially children. With billions of people using social media every day, it is a powerful tool that can influence and shape young minds. As such, it is these companies’ duty to ensure that their platforms are a safe space for everyone.
This is not the first time Meta has faced criticism over children’s safety on its platforms. In 2019, a report by the Pew Research Center found that Instagram, which is owned by Meta, was the second most popular social media platform among teenagers, used by 72% of them. The same report, however, highlighted that Instagram was the platform where teens were most likely to encounter cyberbullying and harassment.
Moreover, in 2020, a report by the National Society for the Prevention of Cruelty to Children (NSPCC) found that Facebook, another platform owned by Meta, was the most popular platform for online grooming and child sexual abuse. The report also stated that Facebook had made little progress in implementing safety measures to protect children on its platform.
The ruling against Meta serves as a wake-up call for all social media companies to prioritize the safety of their users, especially children. It is not enough for these companies to have policies and guidelines in place; they must actively enforce them and take swift action against any violations.
In response to the ruling, a spokesperson for Meta stated, “We are deeply committed to the safety and well-being of our users, including children. We take this responsibility seriously and will continue to invest in measures to keep our platforms safe.”
While it is commendable that Meta has acknowledged its responsibility for user safety, the company must back that acknowledgment with concrete action. This includes implementing stricter measures to prevent cyberbullying, online grooming, and other forms of online abuse, as well as regularly reviewing and updating its policies to keep pace with the ever-evolving landscape of social media and online safety.
The ruling against Meta in New Mexico is a step toward a safer and more responsible online environment. It is a reminder that companies must put the well-being of their users, especially children, ahead of their profits. One can hope the decision serves as a precedent for other states and countries to hold social media companies accountable for their actions.