In this video I discuss how generative AI technology has grown far beyond the government's ability to control it effectively, and how current legislative measures could lead to innocent people being jailed.

  • Ignotum@lemmy.world · 1 year ago

    AI-generated nudes of no one in particular aren't hurting anyone, not directly at least, but AI-generated nudes of a specific person, using that person's likeness and everything, that's much worse

    AI can generate faces of people who don't actually exist; that's what I mean

    The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn't directly harm anyone. But the comments were about AI-generated CSAM depicting a real individual, which is much worse, and also not a problem specific to children

    • Neato@kbin.social · 1 year ago

      AI CSAM is incredibly harmful. All CSAM is harmful. It’s been shown to increase chance of pedophilic abuse.

      Stop defending CSAM, HOLY SHIT.

      • Helix 🧬@feddit.de · 1 year ago

        "It's been shown to increase chance of pedophilic abuse."

        Can you link me a source for that, please?

      • Ignotum@lemmy.world · 1 year ago

        Jeez, calm down

        I am not defending CSAM; I'm just saying that CSAM depicting an actual, existing child is orders of magnitude worse, as is any other kind of fabricated sexual content of real people.

        Take loli porn, for example: it's probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that's much more fucked up. In addition to the "normal" detrimental effects, it also harms that victim in a much more direct way.