Meta is deploying artificial intelligence to detect minors on its platforms by analyzing physical characteristics like height and bone structure in user photos. The system operates as a visual analysis tool that flags accounts potentially belonging to underage users without requiring explicit age verification.

The social media giant rolled out the technology in select countries and plans broader expansion. Meta frames the initiative as a safety measure to enforce age restrictions across Instagram, Facebook, and Threads, where users must be 13 or older.

The approach bypasses traditional age verification methods, which often rely on ID documents or user self-reporting. Instead, the AI examines visual cues in profile pictures and other uploaded content to estimate whether someone falls below the platform's minimum age threshold.

Privacy advocates will likely scrutinize the system. Meta does not obtain explicit consent for this biometric analysis, and the company's track record with age-gating enforcement raises questions about accuracy and false positives. Young-looking adults could face account restrictions, while determined minors might still circumvent detection.

The rollout reflects Meta's broader push toward AI-driven content moderation and user safety. The company already uses machine learning to identify illegal content, self-harm risks, and harassment. This marks the first major public deployment of AI analyzing physical body characteristics for age determination.

Meta hasn't disclosed which countries currently run the system or provided specifics on accuracy rates. The company typically tests safety features in limited geographies before wider release, allowing time to refine algorithms and address edge cases.

The initiative lands as regulators worldwide tighten scrutiny of how tech platforms protect minors. The EU's Digital Services Act, the UK's Online Safety Act, and various US state laws all impose obligations on platforms to safeguard children. Meta's AI system positions the company as proactive rather than reactive to these mandates.

Whether the approach actually prevents harmful underage access or simply creates friction for legitimate young users remains unclear.