A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, underscoring how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led the uniformed McCorkle out of the theater in handcuffs.

  • aesthelete@lemmy.world · 2 months ago

It’s already illegal to do most of the things leading up to a murder. You’re not allowed to conspire to commit one, stalk your target, break into a building, torture animals, kidnap, hire a hitman, etc.

    • garpujol@discuss.online · 2 months ago

      TV and movies should not be able to show crimes, because images depicting crimes should be illegal.

      (I’m just illustrating the slippery slope criminalizing “artistic” renderings)

      • aesthelete@lemmy.world · 2 months ago

        I’m not advocating for what you’re saying here at all.

        So there you go, your slope now has gravel on it.

        EDIT: This dude was arrested using today’s laws, and I’m pretty sure the series Dexter is still legal to write, direct, film, and view. So your slippery slope is a fallacious one (as most of them tend to be in my experience).

          • aesthelete@lemmy.world · 2 months ago

            It should be illegal for a number of reasons. One is simply practical: as the technology advances toward increasing levels of realism, it’ll become impossible for law enforcement to determine which material is “regular” CSAM and which is “generated” CSAM.

            So, unless you’re looking to repeal all laws against possession of CSAM, you’ll have a difficult time crafting a cut-out for generated CSAM.

            And honestly, why bother? What’s the upside here? To have pedos get a more fulfilling wank to appease them and hope they won’t come after your kids for real? I really doubt the premise behind that one.

            • Cryophilia@lemmy.world · 2 months ago

              And honestly, why bother? What’s the upside here?

              Allowing for victimless crimes simply because a group is undesirable is a terrible precedent. We can’t ban things just because they make us uncomfortable, or because doing so makes law enforcement’s job easier.

              • aesthelete@lemmy.world · 2 months ago

                Allowing for victimless crimes simply because a group is undesirable is a terrible precedent.

                I wouldn’t even call it victimless, and we already have all kinds of actual victimless crimes on the books, so I don’t care about supposedly setting a “precedent” that has already been set a million times over.

                • Cryophilia@lemmy.world · 2 months ago

                  and we have all kinds of actual victimless crimes that are already illegal

                  And we should be undoing those laws

                  • aesthelete@lemmy.world · 2 months ago

                    We aren’t though, so it’s frankly pretty odd that you’re fixated on this one.

                    It’s frankly pretty odd that Lemmy in general seems to be fairly pro-generated-CSAM. I’m betting you guys are just afraid of the feds finding your stashes.

                    EDIT: I basically get maybe three replies a week to things I post on here, except when I post something about being okay with generated CSAM or deepfake porn being illegal (in which case I get binders full of creeps in my inbox).

    • homura1650@lemmy.world · 2 months ago

      But you are allowed to stage a murder, hire a film crew to record your staged murder, pay television stations and websites to show a preview of it, and sell a recording of it to anyone who wants to buy one. Depending on how graphic it is, you might be required to put an age advisory on your work and to keep it off public airwaves; but that is about the extent of the regulation.

      You can even stage murders of children and show that.

      Even if the underlying murder is real, there is still no law against possessing a recording of it. Even producing a recording of a murder isn’t technically illegal, although depending on the context you might still be implicated in a conspiracy to commit murder.

      Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

      • aesthelete@lemmy.world · 2 months ago

        Sexual assault of children is the only crime for which the mere depiction of the crime is itself a crime.

        Okay. Why should I care that this is an exception case?

        Laws aren’t derived from clean general principles by the gods on Mt. Olympus.

        • aesthelete@lemmy.world · 2 months ago

          Another thing you “can’t we think of the poor, helpless pedos that want to nut?” people don’t seem to consider: if AI CSAM grows increasingly realistic and we carve out an exception for it, how can you enforce the laws against non-generated CSAM? You’d need some computer-forensics expert involved in every case to prove whether or not the images are generated, which would likely produce a chilling effect on regular CSAM cases over time, and all of this for the “societal benefit” of allowing pedos a more gratifying wank.

          • Cryophilia@lemmy.world · 2 months ago

            For the societal benefit of not ceding our rights to government.

            Today it’s pedophilia, but what about if Trump wins and legally designates trans people as pedophiles?

            This is a power we cannot allow the government to have.

            • aesthelete@lemmy.world · 2 months ago

              For the societal benefit of not ceding our rights to government.

              The right to create realistic looking CSAM is not a right I give a shit about having.

              Today it’s pedophilia, but what about if Trump wins and legally designates trans people as pedophiles?

              What if a lawless dictator does crazy things? I’m not sure the law (or lack thereof) will have anything to do with that scenario.

              The whole idea that “if X is illegal, then the government will use the law against X against us” only matters in the context of a government that follows the law.

              If they’re willing to color outside the lines, and the courts say the law doesn’t apply to them, then it doesn’t matter one fucking bit.

              • Cryophilia@lemmy.world · 2 months ago

                I’m not suggesting they’ll abandon the rule of law. I’m suggesting they’ll use this as precedent to legally oppress us.

                  • Cryophilia@lemmy.world · 2 months ago

                    Which is a terrible excuse for allowing bad laws. Sure, okay, let’s assume you’re right and this specific group will break the law anyway. Maybe let’s prepare for the other tyrannical groups, then?