Law enforcement is prohibited from viewing AI-generated leads of alleged child sexual abuse without a warrant
Social media companies relying on artificial intelligence software to moderate their platforms are generating unviable reports on cases of child sexual abuse, preventing US police from seeing potential leads and delaying investigations of alleged predators, the Guardian can reveal.
By law, US-based social media companies are required to report any child sexual abuse material detected on their platforms to the National Center for Missing & Exploited Children (NCMEC). NCMEC acts as a nationwide clearinghouse for leads about child abuse, which it forwards to the relevant law enforcement departments in the US and around the world. The organization said in its annual report that it received more than 32m reports of suspected child sexual exploitation from companies and the public in 2022, comprising roughly 88m images, videos and other files.