A new lawsuit filed in Tennessee is raising urgent questions about what happens when powerful generative AI tools are accused ...
Children’s rights groups and tech companies are fuming after EU legislators could not agree to extend rules allowing tech companies to continue scanning the internet for child sexual abuse material ...
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...
A bipartisan group of senators has reportedly asked Meta to explain Instagram's alleged failure to prevent child sexual abuse material (CSAM) from being shared among networks of pedophiles on the ...
Elon Musk's Grok AI has been allowing users to transform photographs of women and children into sexualized and compromising images, Bloomberg reported. The issue has created an uproar among users on X ...
On Friday, Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent joint letters to Amazon, Google, Integral Ad Science, DoubleVerify, the MRC, and TAG notifying the companies that ...
In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag ...
The European Union has formally presented its proposal to move from a situation in which some tech platforms voluntarily scan for child sexual abuse material (CSAM) to something more systematic — ...