• Snowclone@lemmy.world · 6 months ago

    Removing them is all well and good, but on its own it solves nothing. At this point every easily accessible AI I’m aware of is kicking back any prompts containing the names of real-life people; they’re already anticipating real laws. Preventing the images from being made in the first place isn’t impossible.

    • Scrubbles@poptalk.scrubbles.tech · 6 months ago

      Agreed. To me, making them is one thing, it’s like making a drawing at home. Is it moral? Not really. Should it be illegal? I don’t think so.

      Now, what this kid did, distributing them? Absolutely not okay. At that point it’s not private, and it can hurt the victim’s reputation.

      This of course ignores the whole fact that she’s underage, which is wrong on its own. AI-generated CSAM is still CSAM.

      • DashboTreeFrog@discuss.online · 6 months ago

        A friend in high school made nude drawings of another mutual friend. It was weird that he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn’t really think much of it. Rather, the creepy part to her was that he showed people.

        I don’t think we can stop horny teens from making horny content about their classmates; heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and the realism) is what turns creepy but kind of understandable teenage behavior into something we need to deal with.

        • jacksilver@lemmy.world · 6 months ago

          All I’m hearing is jailtime for Tina Belcher and her erotic friend fiction!

          But seriously, I generally agree that as long as people aren’t sharing it, it shouldn’t be a problem. If I can picture it in my head without consequence, it seems kinda silly that putting that thought on paper/screen should be illegal.

          • Scrubbles@poptalk.scrubbles.tech · 6 months ago

            Exactly, and it raises the question too: where’s the line? If you draw a stick figure of your crush with boobs, is that a crime? Is it when you put an arrow and write her name next to it? AI just makes it more realistic, but it’s the same basic premise.

            Distributing it is where it crosses a hard line and becomes something that should not be encouraged.

            • retrospectology@lemmy.world · 6 months ago

              It’s not some slippery slope to prohibit people from generating sexual imagery of real people without their consent. The fuck is wrong with AI supporters?

              Even if you’re a “horny teenager” making fake porn of someone is fucking weird and not normal or healthy.

      • Fubarberry@sopuli.xyz · 6 months ago

        “AI-generated CSAM is still CSAM.”

        Idk, with real people the determination of whether someone is underage is based on their age, not their physical appearance. There are people who look unnaturally young who could legally do porn, and underage people who look much older but aren’t allowed. It’s not about their appearance, but how old they are.

        With drawn or AI-generated CSAM, how would you draw that line of what’s fine and what’s a major crime with lifelong repercussions? There’s not an actual age to use, the images aren’t real, so how do you determine the legal age? Do you do a physical developmental point scale and pick a value that’s developed enough? Do you have a committee where they just say “yeah, looks kinda young to me” and convict someone for child pornography?

        To be clear, I’m not trying to defend these people, but trying to determine what counts as legal/non-legal for fake images seems like a legal nightmare. I’m sure there are cases where this would be more clear-cut (if they AI-generate with a specific age, try to do deepfakes of a specific person, etc.), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.

        • Scrubbles@poptalk.scrubbles.tech · 6 months ago

          All good points, and I’ll for sure say that I’m not qualified to answer that. I also don’t think politicians or moms’ groups or anyone else are.

          All I’ll do is muddy the waters more. The vast majority of humanity thinks CSAM is sick and that those who consume it are not healthy. I’ve read that psychologists are split. Some think AI-generated CSAM is bad, should be illegal, and only makes those who consume it worse. Others, however, suggest that it may actually curb urges, and ask why not let them generate it, since it might actually reduce the number of real children being harmed.

          I personally have no idea, and again am not qualified to answer those questions, but goddamn did AI really just barge in without us being ready for it. Fucking big tech again. “I’m sure society will figure it out”

      • bane_killgrind@slrpnk.net · 6 months ago

        Reputation matters less than harassment. If these people were describing her body publicly it would be a similar attack.

      • Snowclone@lemmy.world · 6 months ago

        If you really must, you can simply have the AI auto-delete NSFW images; several services already do this. And if you argue that a service can’t simply refuse to generate or hand out NSFW images at all, you can also gate NSFW content generation behind any number of hindrances that are highly effective against anonymous or underage use.

        • i_am_not_a_robot@discuss.tchncs.de · 6 months ago

          Modern AI is not capable of this. Accuracy at detecting NSFW content is poor, and these systems are completely incapable of judging when NSFW content is allowable, because they have no morals and understand nothing about people or situations beyond appearance.

          • jacksilver@lemmy.world · 6 months ago

            These models are also already open-sourced; you can’t stop it. Additionally, all of those “requirements” would just mean that AI would only be owned by big corporations.

            • reev@sh.itjust.works · 6 months ago

              I think that’s one thing that’s often forgotten in these conversations. The cat’s out of the bag; you will never be able to stop the generation of things as they are right now. You’ll just be able to punish people for doing it.

      • Snowclone@lemmy.world · 6 months ago

        The current method is auto-deleting NSFW images. It doesn’t matter how you got there: if the service detects NSFW, it dumps the output and you never get an image. Beyond that, gating NSFW content generation behind a paywall or ID wall would stop a lot of teenagers. Not all, but it would put a dent in it. There are also AI models that will allow some NSFW if it’s clearly in an artistic style, like a watercolor painting, but will reject NSFW realism: photography, rendered images, that sort of thing. These checks usually apply across prompt mode, paint-in/out, and image-reference mode, both blocking generation of likely-NSFW images and running an NSFW check after generation, before delivering the image. AI services are anticipating full-on legal consequences for allowing any NSFW, or any realistic, photographic, or CGI image of a living person without their consent; it’s easy to see that’s what they’re prepared for.
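        The generate-then-check-then-dump flow described above can be sketched roughly like this. This is only an illustration of the idea, not any real service’s API; `generate_image` and `nsfw_score` are hypothetical stand-ins.

```python
# Rough sketch of a post-generation NSFW gate, as described in the comment above.
# nsfw_score and generate_image are hypothetical stand-ins, not a real service's API.

def nsfw_score(image: dict) -> float:
    """Stand-in classifier: returns a confidence in [0, 1] that the image is NSFW."""
    return image.get("nsfw", 0.0)

def serve(prompt: str, generate_image, threshold: float = 0.5):
    """Generate an image, then check it before delivery.

    If the output is flagged as NSFW, it is dumped and the user never
    receives it, regardless of how the request was phrased (plain prompt,
    paint-in/out, image reference, etc.).
    """
    image = generate_image(prompt)
    if nsfw_score(image) >= threshold:
        return None  # auto-deleted; nothing is delivered to the user
    return image
```

        A real service would pair a check like this with prompt-side filters, and, as noted above, some add style-aware rules (allowing painterly nudity while rejecting photorealism).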

        • cryptiod137@lemmy.world · 6 months ago

          I assume someone who is currently generating AI porn is running a model locally and not using a service, as there are absolute boatloads of generated hentai getting posted every day?

        • i_am_not_a_robot@discuss.tchncs.de · 6 months ago

          Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals, so they’re constantly blocking things that may be inappropriate in some situations, to the point where those services are incapable of performing a great many legitimate tasks.

          Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.

        • bane_killgrind@slrpnk.net · 6 months ago

          Sure for some tools. There are other tools that don’t do that.

          Chasing after the tools and services is a waste. Make harassment more clearly defined, go after people that victimize other people.

    • retrospectology@lemmy.world · 6 months ago

      Depending on the AI developers to stop this on their own is a mistake. So is preemptively accepting child porn and deepfakes as inevitable rather than attempting to stop or mitigate them.