AI Deepfake Nude Images Hit 600+ Students Across 90 Schools in 28 Countries as Crisis Spirals
Summary
A crisis of AI-generated deepfake nude images is spiraling across nearly 90 schools in at least 28 countries, affecting more than 600 students since 2023, and UNICEF estimates 1.2 million children were victimized last year as teenage boys exploit 'nudify' apps. Schools and law enforcement are struggling to respond, while new legislation such as the Take It Down Act requires platforms to remove nonconsensual intimate images within 48 hours.
Key Points
- A joint analysis reveals that AI-generated deepfake nude images have affected nearly 90 schools and more than 600 students across at least 28 countries since 2023, with teenage boys predominantly responsible for creating the explicit material using easily accessible 'nudify' apps.
- The crisis is believed to be far more widespread than reported: UNICEF estimates that 1.2 million children had sexual deepfakes created of them last year. Victims suffer severe psychological harm, including humiliation, fear, and reluctance to attend school.
- Schools and law enforcement agencies remain largely unprepared, and their responses are inconsistent. Victims and their families are increasingly fighting back, contributing to new legislation such as the Take It Down Act, which requires platforms to remove nonconsensual intimate images within 48 hours.