Meta Deploys AI to Detect and Remove Users Under 13 on Facebook and Instagram Amid $375M Child Safety Verdict
Summary
Meta is deploying AI on Facebook and Instagram to detect and remove users under 13 by analyzing bone structure, height, and other visual cues in photos. The rollout comes just days after a New Mexico jury ordered the company to pay $375 million for failing to protect minors on its platforms.
Key Points
- Meta is deploying an AI system on Facebook and Instagram that analyzes bone structure, height, and visual cues in photos and videos to detect and remove users under the age of 13.
- Meta insists the technology is not facial recognition because it does not identify specific individuals; the system also scans posts, comments, bios, and captions for contextual clues that a user may be underage.
- The rollout comes days after a New Mexico jury ordered Meta to pay $375 million for failing to protect children on its platforms; Meta is also expanding Teen Account protections for users aged 13 to 17.