Instagram Safety Feature for Minors

Instagram, owned by Mark Zuckerberg's Meta, is launching a new safety feature that blurs nude images sent to minors in direct messages and discourages users from sending them. The feature aims to protect children and create a safer environment on the platform.

Deepfake Images of Underage Celebrities

Deepfake images that place underage celebrities' faces on nude bodies are easily found on top search engines and social media platforms, despite laws banning such material. Tech companies like Microsoft and Google have taken steps to remove specific content, but similar nonconsensual images continue to appear. Legal experts point to challenges in enforcing these laws against the tech companies hosting such material.

Artificial Intelligence-Generated Nude Images of Students in Schools

Nude images of middle school students, created by classmates using artificial intelligence in the Beverly Hills school district, have prompted calls for legislation. Similar incidents have been reported at other schools across the country.