Meta Proposes EU-Wide Standards for Online Teen Safety


A Call for Unified Age Verification Standards

On Tuesday, Meta, the U.S.-based tech giant, presented the European Commission with a proposal to establish a unified system of age verification and safety standards for apps and online services. The initiative seeks to bolster protections for teenagers throughout the European Union and arrives as incoming Commissioners have signaled that enhancing child safety online is a key priority for their upcoming tenure.

Enhancing Parental Controls and Age Verification

Meta’s proposal calls for app stores to require parental approval before users under the age of 16 can download apps. If a minor attempts to download an app, their parents would receive a notification to approve or deny the request. This strengthens parental oversight and helps ensure that young users engage with age-appropriate content.

Striving for Consistent Industry Standards

Alongside age verification, Meta advocates for EU-wide industry standards that define what constitutes age-appropriate experiences for adolescents. In its statement, the company compares this alignment to existing age classifications in other media formats, such as films and video games. It also emphasizes the need for supervision tools on social media platforms that would allow parents to monitor and control the interactions of their children under 16.

Addressing Fragmented Regulations

Meta’s global head of safety, Antigone Davis, pointed to the current disarray in European regulations surrounding youth safety online. She stressed that the absence of consistent protections across the 27 EU member states creates vulnerabilities for young users. The company calls for comprehensive EU-wide regulations that mandate age verification for minors and parental consent for app downloads, with the aim of providing uniform safety measures throughout the region.

Supporting Initiatives to Protect Minors

While discussions continue in the Council of the EU on the proposed regulation on Child Sexual Abuse Material (CSAM), the focus remains on establishing effective measures for identifying and protecting minors online. Existing regulations, such as the Digital Services Act (DSA) and the Audiovisual Media Services Directive (AVMSD), already point to the need for improved age verification practices. Meta’s push for standardized safety measures underscores its stated commitment to creating a safer online environment for teenagers in Europe.


Image source: Reuters

