EU Launches Formal Probe into Meta for Child Safety Concerns
The European Commission has launched an investigation into Meta, the parent company of social media platforms Facebook and Instagram, over concerns that these platforms may be promoting addictive behavior in children and failing to protect their mental health. The probe is centered on potential violations of the Digital Services Act (DSA), which holds digital companies accountable for addressing illegal content, including disinformation, shopping scams, and child abuse.

The investigation will scrutinize the so-called "rabbit hole" effect, in which algorithms may feed young users negative content that could harm their body image and mental well-being. The effectiveness of Meta's age verification tools and its privacy measures for minors is also under examination. This follows the EU's earlier formal proceedings against Meta, which questioned its efforts to counter Russian disinformation ahead of the European Parliament elections.

The EU's inquiry into Meta's adherence to the DSA is part of a broader push to ensure that tech giants comply with the bloc's stringent online safety regulations, especially where children are concerned. If Meta is found non-compliant with the DSA's requirements, it could face substantial fines. The European Commission's action reflects growing regulatory scrutiny of large technology companies' practices around user safety and the integrity of online information.