The European Commission announced Thursday it was opening an investigation into U.S. social media giant Meta over potential violations of the 27-nation bloc’s online child safety regulations.
The commission said it decided to go ahead with the investigation over concerns that Meta was not adequately addressing child safety on its platforms Facebook and Instagram. Under the EU’s Digital Services Act, platforms must rigorously police illegal or potentially harmful content or risk heavy fines.
“The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called ‘rabbit-hole effects,’” the European Commission said in a statement.
The commission also said it plans to evaluate the effectiveness of Meta’s age verification systems.
In response to the investigation, Menlo Park, Calif.-based Meta maintained that it wants its platforms to be safe for all ages and that it looks forward to working with the European Commission during its investigation.
Meta has come under increased scrutiny over the past few years for the effect its platforms can have on children. Lawmakers and state attorneys general have sued Meta, alleging that it has failed to limit child exploitation and protect users’ privacy.
In the EU, the company is currently dealing with another investigation by the commission over its policing of content relating to the upcoming U.S. presidential election and the Israel-Hamas war.
• Vaughn Cockayne can be reached at vcockayne@washingtontimes.com.