More than 1,000 images of known child sexual abuse material (CSAM) were found in LAION-5B, a massive public dataset used to train popular text-to-image AI models such as Stable Diffusion, Stanford Internet Observatory researchers said in a ...