New Mexico AG Raúl Torrez on Meta's threat to remove Facebook, Instagram and WhatsApp from the state: Company showing the world how little it cares about…
New Mexico AG Raúl Torrez has responded to Meta's threat to pull Facebook, Instagram, and WhatsApp from the state. According to a report by Fortune, Torrez recently said, “Meta is showing the world how little it cares about child safety. Meta’s refusal to follow the laws that protect our kids tells you everything you need to know about this company and the character of its leaders.”
These comments came after Meta said it might block access to its platforms in New Mexico if the state continues with its strict legal demands. The issue is linked to a lawsuit filed by the state against Meta, which is asking for major changes in how the company handles its platforms for minors. The case will go to a bench trial starting May 4, where a judge will decide whether to approve these changes.
The proposed measures include stricter age verification, requiring Meta to block users under 13, linking minor accounts to their guardians, and limiting interactions between adults and minors. The state, for example, is asking the court to restrict features such as recommendation algorithms, autoplay, and notifications during school or sleep hours and to set caps on time spent on the apps.
What Meta said about New Mexico’s proposed child safety changes
However, Meta has pushed back against these demands. In a statement to Fortune, the company said, “Despite Attorney General Torrez’s claims, the State’s demands are technically impractical, impossible for any company to meet and disregard the realities of the internet.” It added that targeting one platform ignores “the hundreds of other apps teens use.”
The company further warned, “While it is not in Meta’s interests to do so, if a workable solution to Attorney General Torrez’s demands is not reached, we may have no choice but to remove access to its platforms for users in New Mexico entirely.”
Torrez brushed off the warning as a “PR stunt,” pointing out that Meta has changed its systems before when required. “This is not a question of technical capability. Meta just doesn’t prioritise children’s safety over engagement, ad revenue and profit,” he said.
The lawsuit, which was filed in 2023, stems from an undercover investigation by the New Mexico Department of Justice. Investigators created a fake profile pretending to be a 13-year-old and said they were immediately bombarded with unsolicited messages and explicit content from adults.
According to the state, these exchanges were not caught by the platform’s safety mechanisms, as state law requires.
In March 2026, a Santa Fe jury found Meta liable for violating the Unfair Practices Act on 75,000 counts, resulting in $375 million in fines for the company. This was reportedly the first time a state had won such a judgment against a tech company on child-safety grounds.
The trial will decide if the court will order structural changes to Meta’s platforms in New Mexico, including oversight by a court-appointed monitor and potential restrictions on algorithmic recommendations for minors.
Meta has argued that the proposed rules could affect user experience and raise broader questions about free speech and parental control. The company said it has already put in place a number of safety measures and continues to be “committed to delivering safe, age-appropriate experiences.”
It’s one case among many as state attorneys general across the country increase regulatory pressure on social media firms over child safety and platform design.