• 0ddysseus@lemmy.world
    9 months ago

    (Apologies if I use the wrong terminology here; I’m not an AI expert, just have a fact to share)

    The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution videos, and they have all this stuff stored and use it to train the algorithms for their AIs. They refuse to delete this material, claiming that they merely find the stuff and aren’t responsible for what it is.

    Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

    • mcgravier@kbin.social
      9 months ago

      Getting an AI image generator to produce CSAM means it knows what to show

      Not necessarily. Part of what these models do is blend different concepts. An AI trained on images of ordinary clothed children and of nude adults should, in principle, be able to combine the two and produce underage nudity without ever having seen such material. This is a side effect of the generalisation built into the AI.