• ssj2marx@lemmy.ml · 6 months ago

    introducing the AI Transparency Act, which requires every generative prompt to be registered in a government database

    • Scrubbles@poptalk.scrubbles.tech · 6 months ago

      and that’s what I loathe about the idiots who are for this stuff. Yes, I want to curb it - but for fuck’s sake, there are ways to do that which don’t amount to “give big government every scrap of data on you”.

      There are ways to prove I’m over 18 without registering my ID with a porn company, and ways to fight CSAM without reading everyone’s private messages. But fuck, we have the overlapping circles of a Venn diagram of idiot and control freak in Congress, and they’ll happily strip all of our rights over some fear of the boogeyman.
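
      A toy sketch of one such way, assuming a blind-signature scheme: an issuer that has already verified your age (say, a government ID service) signs a random token without ever seeing it, and the site verifies the token without learning who you are. The parameters here are made up and insecure, purely to illustrate the flow; a real deployment would use a vetted anonymous-credential library, not textbook RSA.

      ```python
      import hashlib
      import secrets
      from math import gcd

      # Toy textbook-RSA key for the issuer. Illustrative only: real
      # keys are thousands of bits, and real systems use dedicated
      # anonymous-credential schemes rather than raw RSA.
      p, q = 61, 53
      n = p * q                           # public modulus
      e = 17                              # public exponent
      d = pow(e, -1, (p - 1) * (q - 1))   # issuer's private exponent

      def h(msg: bytes) -> int:
          return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

      # 1. The user picks a random token and blinds it, so the issuer
      #    never sees what it is actually signing.
      token = secrets.token_bytes(16)
      r = secrets.randbelow(n - 2) + 2
      while gcd(r, n) != 1:
          r = secrets.randbelow(n - 2) + 2
      blinded = (h(token) * pow(r, e, n)) % n

      # 2. The issuer verifies the user's age out-of-band and signs
      #    the blinded value, learning nothing about the token.
      blind_sig = pow(blinded, d, n)

      # 3. The user unblinds, leaving a valid signature on h(token).
      sig = (blind_sig * pow(r, -1, n)) % n

      # 4. The site checks the signature against the issuer's public
      #    key (e, n) without ever learning the user's identity.
      assert pow(sig, e, n) == h(token)
      print("over-18 token accepted; no ID shared with the site")
      ```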

    • Raphaël A. Costeau@lemmy.ml · 6 months ago

      I don’t see a problem with that. I think this information (both prompt and result) should be public, because:

      • a. The “AI” companies already know it; why shouldn’t everyone else?
      • b. They use public information to train their models, so their results should also be public.
      • c. It would be the definitive way to know whether something was “AI” generated (a sketch of this is below).

      This is a very different subject from giving access to your DMs. The only ones who benefit from this information not being publicly available are those who use “AI” for malicious purposes, whereas everyone benefits from privacy of correspondence.
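
      A minimal sketch of point (c), assuming generators append a hash of each output to a public registry; the names, model id, and in-memory dict here are all hypothetical stand-ins:

      ```python
      import hashlib

      # A dict stands in for the hypothetical public registry; under
      # the scheme above, generators would append one entry per
      # generation (prompt, model, hash of the output).
      registry: dict[str, dict] = {}

      def register(prompt: str, model: str, output: bytes) -> None:
          """Recorded by the generator at generation time."""
          registry[hashlib.sha256(output).hexdigest()] = {
              "prompt": prompt,
              "model": model,
          }

      def lookup(suspect: bytes) -> dict | None:
          """Anyone can check content against the registry."""
          return registry.get(hashlib.sha256(suspect).hexdigest())

      register("a cat in a spacesuit", "example-model-v1", b"<image bytes>")
      print(lookup(b"<image bytes>"))    # found: returns prompt and model
      print(lookup(b"<edited bytes>"))   # None: any edit breaks the match
      ```

      One caveat with this sketch: an exact hash match breaks under any re-encoding or crop, so a real registry would need perceptual hashing or watermarking on top.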

      • ssj2marx@lemmy.ml · 6 months ago

        I suppose you would also be fine with every one of your Google searches being in a database? Every video you’ve ever watched, even the ones in private browser tabs?

        • Raphaël A. Costeau@lemmy.ml · 6 months ago

          No, and that’s why I don’t use Google or anything unencrypted that sends data I consider private to some datacenter. And even when I know the data is encrypted, I’m careful, as anyone should be, about data leaving my computer and going to someone else’s.

          “AI” is not the same thing. Why would I want my prompt to be private if I don’t want to use the result in some malicious way, be it generating CSAM, using it to cheat by having it write an article, or generating a deepfake video of someone for an internet scam?

          • ssj2marx@lemmy.ml · 6 months ago

            Why would I want my prompt to be private if I don’t want to use the result in some malicious way

            Do you think the only things people use AI for are deepfakes and CSAM? AFAIK the most common use is generating porn. Now, I don’t think generating regular porn is “malicious”, but I certainly understand why most people (myself included) want to keep what they generate private.

            • Raphaël A. Costeau@lemmy.ml · 6 months ago

              I don’t think people’s right to generate whatever image they want to jerk off to is fundamental, or more important than preventing “AI” scams and CSAM generation. There are other ways to jerk off: there’s plenty of real porn online, and also lots and lots of hentai, for literally every taste. “AI” porn offers only two things those options don’t. One is generating the exact scene you want - and in the very remote case that what you’ve imagined has never been produced before, you can pay an artist to do it. The other is deepfake porn, which should be a crime whether or not you publish the image.