AI-created child sexual abuse images ‘threaten to overwhelm internet’
Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law

  • Bye@lemmy.world · 1 year ago

    You don’t need the exact content you want in order to train a model (LoRA) for Stable Diffusion. If you train on naked adults and clothed kids, it can make some gross shit. And there are far more of those safe pictures out there to use for training. I’d bet my left leg that these models were trained that way.

    • MTK@lemmy.world · 1 year ago

      Why? If these people have access to these images, why would you bet that they don’t use them?

      There are dark web sites with huge sets of CSAM; why would these people not use that? What are you betting on? Their morals?