Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.

In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.

“It made me a lot more upset after I saw the pictures because it made them so much more real for me,” one Lancaster victim, now 16, told Forbes. “They’re very graphic and they’re very realistic,” her mother said. “There’s no way someone who didn’t know her wouldn’t think: ‘that’s her naked,’ and that’s the scary part.” There were more than 30 images of her daughter.

The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”

  • orclev@lemmy.world · 15 days ago

    This has always been one of the problems with CSAM laws. There have been a number of cases now where minors were charged with CSAM possession for either naked pictures of themselves or consensual pictures of their girlfriend or boyfriend, who was also a minor. There’s also the broader discussion about what exactly qualifies as CSAM, with some jurisdictions taking a more maximalist approach that considers things like drawings (even highly unrealistic or stylized ones) of real people or even fictional characters to be CSAM. Some jurisdictions don’t even require the photo or drawing to depict the minor naked or engaging in a sexual act; instead they define it as pornography if the person in possession of it derives some kind of sexual gratification from it. So, for instance, a photo of a minor who is fully clothed and just standing there could be considered CSAM.

    The problem is that it’s hard to draw clear lines about what does or doesn’t qualify without leaving loopholes that can be exploited. This is why many jurisdictions opt for a maximalist approach and then leave it to the discretion of police and prosecutors as to what they do or do not pursue, but of course that approach has the flaw of being entirely arbitrary, and it leaves a lot of power in the hands of prosecutors and police for something widely regarded as an extremely serious crime.