New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”
https://cellebrite.com/en/ai-and-csam-a-look-at-real-cases/
Best I could find about this.
Imo, as long as the AI was not trained on actual CSAM and the output does not depict real people, it shouldn't be illegal, since it isn't hurting anyone, and preventing harm is why we have laws against CSAM in the first place.
What if it normalises CSAM and some people don't discern between real and AI?
And what if video games, movies, and books normalize killing? There is no evidence to show that it does or that it will.
While I’m not going to have this specific topic in my search history, sexually violent porn very likely does nothing to encourage actual sexual violence. Most studies show that it has no effect on sexual violence at all, some show it decreases it, and only a few studies show it increases it (and those ones tend to have smarter people than me saying they are flawed).
While media can have psychological effects, normalizing extreme behavior doesn't seem to be one of them. That said, I wouldn't trust an AI bro or their AI to handle something like that. At best they don't know what goes into their training sets; at worst they would probably deliberately include CSAM.
We have porn games, but we don't have CP games. There's a line between depicting violence and depicting sexual abuse of minors.
Edit: oh wait, Japan might be an example 🙃 and yeah, they got issues.