• ABCDE@lemmy.world · ↑193 ↓24 · 3 months ago

    “It’s just abuse. You deliver something for the production and the story, and then you end up being molested that way,” Jensen said.

    Abuse? Molested? This is a load of shit. If they don’t want to be seen nude, then they shouldn’t be filmed nude.

    • the post of tom joad@sh.itjust.works · ↑85 ↓7 · 3 months ago

      Yeah, sorry, this isn’t molestation; this is how media works in a digitally connected world. The fact that the entertainment industry has been denying the realities of its own business for some 50-odd years leads to head-scratching takes like an actress being “molested” by someone jerking off to her in a scene they didn’t pay for. No, sorry lady, once you expose the flesh to the camera you relinquish the right to keep it to yourself.

      These stupid goddamn articles always trot out some actress or stagehand to humanize the ‘victims’ too, when it’s really just rich production companies losing money. Fuck everything about this.

      • wizardbeard@lemmy.dbzer0.com · ↑65 ↓3 · 3 months ago

        Regretting doing a nude scene is humanizing. Making your regret someone else’s problem is just shitty all around.

    • SirEDCaLot@lemmy.today · ↑46 ↓7 · 3 months ago

      This is also why I am thankful for the American First Amendment.
      It would appear that in that user’s country, it is additionally considered a crime to take the nude scene out of context.
      I think a lot of people online take freedom of speech for granted, not realizing that many supposedly civilized countries have an increasing number of restrictions on unpopular speech, critical speech, or otherwise undesirable speech like this.

      • General_Effort@lemmy.world · ↑21 ↓2 · 3 months ago

        Make no mistake. The US is heading in the same direction. Look at the proposed anti-deepfake laws. That guy could be prosecuted extremely harshly under those.

        • SirEDCaLot@lemmy.today · ↑11 ↓3 · 3 months ago

          It will be interesting to see that tested in court. I don’t think anyone would complain about, for example, a pencil sketch of a naked celebrity; that would be considered free speech and fair use even if it is a sketch of a scene from a movie.

          So where is the line? If the pencil sketch is legal, what if you do a digital sketch with Adobe Illustrator and a graphics tablet? What if you use Adobe’s AI function to help clean up the image? What if you take screen grabs of a publicity shot of the actor’s face and a nude image of someone else, and use them together to trace the image you end up painting? What if you then use AI to help you select colors and handle shading? What if you break the work into those individual steps but have AI perform each of them? That is not functionally very different from giving an AI a publicity shot and telling it to generate a nude image.

          As I see it, the only difference between the AI deepfake and the fake produced by a skilled artist is the amount of time and effort required. And while that definitely makes it easy to turn out an awful lot of fakes, it’s bad policy to ban one and not the other simply based on the process by which the image was created.

          • jacksilver@lemmy.world · ↑3 · 3 months ago

            It’s messy legislation all around. When does it become porn vs. art vs. just erotic or satirical? How do you prove it was a deepfake and not a lookalike? If I use a porn actress to make a deepfake, is that also illegal, or is it about how the original source content was intended to be used/consumed?

            I’m not saying that we should just ignore these issues, but I don’t think any of this will be handled well by any government.

            • General_Effort@lemmy.world · ↑2 · 3 months ago

              That’s easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.

              Y’all need to be more cynical. These lobby groups do not make arguments because they believe in them, but because it gets them what they want.

              • jacksilver@lemmy.world · ↑2 · 3 months ago

                I was responding to an above comment. The guy who was arrested in OP’s article was posting clips from movies (so not deepfakes).

                That being said, for deepfakes you’d need the original video to prove it was deepfaked. Additionally, you’d then probably need to prove they used a real person to make the deepfake. Nowadays it’s easy to make “fake” people using AI. Not sure where the law sits on creating deepfakes of fake people who resemble other people.

                • General_Effort@lemmy.world · ↑2 · 3 months ago

                  I didn’t make the point clear. The original scenes themselves, as released by the studio, may qualify as “deepfakes”. A little bit of digital post-processing can be enough to qualify them under the proposed bills. Then sharing them becomes criminal, fair use be damned.

            • SirEDCaLot@lemmy.today · ↑2 ↓1 · 3 months ago

              Actually I was thinking about this some more and I think there is a much deeper issue.

              With the advent of generative AI, photographs can no longer be relied upon as documentary evidence.

              There’s the old saying, ‘pics or it didn’t happen’, which, flipped around, means that sharing pics proves it did happen.

              But if anyone can generate a photorealistic image from a few lines of text, then pictures don’t actually prove anything unless you have some bulletproof way to tell which pictures are real and which are generated by AI.

              And that’s the real point of a lot of these laws: to try and shove the genie back in the bottle. You can ban deepfake porn and order anyone who makes it to be drawn and quartered, you can make AI watermark its output, but at the end of the day the genie is out of the bottle, because someone somewhere will write an AI that ignores the watermark and the photos will be passed off as real.

              I’m open to any possible solution, but I’m not sure there is one. I think this genie may be out of the bottle for good, or at least I’m not seeing any way that it isn’t. And if that’s the case, perhaps the only response that doesn’t shred civil liberties is to preemptively declare defeat, acknowledge that photographs are no longer proof of anything, and deal with that as a society.

              • jacksilver@lemmy.world · ↑2 · 3 months ago

                One solution that’s been proposed is to cryptographically sign content. This way someone can prove they “made” the content. It doesn’t prove the content is real, but it means you can verify the originator.

                However, at the end of the day, you’re still stuck with needing to decide who you trust.
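
                A minimal sketch of that signing idea, in Python with the widely used “cryptography” package and Ed25519 keys (the key handling and the stand-in content bytes below are illustrative assumptions, not any specific proposal):

                    from cryptography.exceptions import InvalidSignature
                    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

                    # The creator generates a key pair once and publishes the public key
                    # somewhere people already trust (their website, a key directory, etc.).
                    private_key = Ed25519PrivateKey.generate()
                    public_key = private_key.public_key()

                    # Stand-in for the raw bytes of the clip or photo being published.
                    content = b"raw bytes of the published video or image file"

                    # The signature is distributed alongside the file.
                    signature = private_key.sign(content)

                    # A viewer checks that the file is byte-for-byte what the key holder signed.
                    try:
                        public_key.verify(signature, content)
                        print("Signature valid: this is what the creator published.")
                    except InvalidSignature:
                        print("Signature invalid: altered, or not signed by this creator.")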

                • SirEDCaLot@lemmy.today · ↑1 ↓1 · 3 months ago

                  Probably the best idea yet. It’s definitely not foolproof though. The best you could do is put a security chip in the camera that digitally signs the pictures, but that is imperfect because eventually someone will extract the key or figure out how to get the camera to sign pictures of their choosing that weren’t taken by the camera.

                  A creator-level key is more likely, so you choose who you trust.

                  But most of the pictures that would be taken as proof of anything probably won’t be signed by one of those.
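
                  To illustrate the “choose who you trust” part (a hypothetical sketch, assuming the same Ed25519 signatures as above): a valid signature only tells you which key signed the bytes, so in practice a viewer would check it against a registry of creator keys they have decided to trust.

                      from cryptography.exceptions import InvalidSignature

                      # Hypothetical registry: public keys of creators/outlets you already trust,
                      # obtained out-of-band (their website, a directory, a camera vendor, ...).
                      TRUSTED_KEYS = {}  # maps a creator name to an Ed25519 public key object

                      def who_signed(content: bytes, signature: bytes):
                          """Return the trusted creator whose key verifies this content, or None."""
                          for name, key in TRUSTED_KEYS.items():
                              try:
                                  key.verify(signature, content)
                                  return name
                              except InvalidSignature:
                                  continue
                          return None  # unsigned, altered, or signed by a key you don't trust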

              • interdimensionalmeme@lemmy.ml · ↑2 ↓1 · 3 months ago

                I’m fine with photos not proving anything.

                Let the statists cry about that one. Cry, little statists, that you can’t inflict pain in that justified way you love so much.

                • SirEDCaLot@lemmy.today · ↑1 ↓1 · 2 months ago

                  I’m not fine with that, as it will have wide-ranging repercussions on society at large that aren’t all good.

                  But I fully accept it as the cold hard reality that WILL happen now that the genie’s out of the bottle, and the reality that any ham-fisted legal attempt to rebottle the genie will be far worse for society and only delay the inevitable acceptance that photographs are no longer proof.

                  And as such, I (and most other adults mature enough to accept a less-than-preferred reality as reality) stand with you and give the middle finger to the statists, along with everyone else who thinks you can legislate any genie back into its bottle. In the 1990s it was the ‘protect kids from Internet porn’ people, in the 2000s it was the ‘protect kids from violent video games’ and ‘stop Internet piracy’ people; I guess today it’s the ‘stop generative AI’ people. They are all children who think crying to Daddy will remake the ways of the world. It won’t.

            • Rekorse@sh.itjust.works · ↑1 · edited · 3 months ago

              Same as anything else: if it causes someone harm (in America, financial harm counts), it gets regulated.

              There are exceptions that allow people to disregard laws as well. It’s legal to execute a death-row prisoner.

          • Rekorse@sh.itjust.works · ↑2 · 3 months ago

            One is banned because it can affect someone’s earnings and is theft; the other is not banned because no one is harming another party by making a pencil drawing of a celebrity or scene.

            • SirEDCaLot@lemmy.today · ↑1 ↓1 · 3 months ago

              I’m not talking about the copyright violation of sharing parts of a copyrighted movie. That is obviously infringement. I am talking about generated nude images.

              If the pencil drawing is not harming anybody, is the photorealistic but completely hand-done painting somehow more harmful? Does it become even more harmful if you use AI to help with the painting?

              If the pencil drawing is legal, and the AI generated deep fake is illegal, I am asking where exactly the line is. Because there is a whole spectrum between the two, so at what point does it become illegal?

              • Rekorse@sh.itjust.works · ↑1 · 2 months ago

                It becomes harmful once you start selling it for profit based on its similarity to a real person.

                Just so you know, that does happen quite a lot on a small scale. Copyright law tends to be applied once a business pattern is established around the problematic content.

                Some people get away with selling it at craft fairs and such, and no one ever hears about it.

      • BrianTheeBiscuiteer@lemmy.world · ↑13 ↓2 · 3 months ago

        I also would’ve expected nudity to be less taboo there. Would someone have been just as likely to be arrested for sharing fully clothed still shots? That would actually make a lot more sense: distribution of copyrighted, non-promotional material.

        • SirEDCaLot@lemmy.today · ↑14 ↓3 · 3 months ago

          I think the issue isn’t nudity but sexualization-- i.e. a nude scene in the context of a film is fine, but chopping the nude scene out of the film basically turns the actress into a porn star, and that’s not fine. The same attitude is why the actress called it molestation. A different attitude as a society, I guess.

          • stravanasu@lemmy.ca · ↑5 · 3 months ago

            It seems to me these scenes are introduced in films to sexualize them. More often than not they don’t add anything to the story. But blood & sex get more viewers. So I find the whole thing hypocritical.

            It brings to mind an episode of the hilarious series “Coupling”, where Jeff says that the actress in the film “The Piano” (?) was naked for the whole film. His friends say she wasn’t; it was only one scene in the film. And Jeff replies, “it depends on how you watch it” 🤣

            • SirEDCaLot@lemmy.today · ↑2 ↓1 · 2 months ago

              I agree it’s hypocritical, but for different reasons.

              I think a nude/sex scene can be important to the plot and add a lot to the story, in some situations. Yeah, it’s often thrown in as eye candy to get more viewers, but sometimes it counts for a lot. Look at Season 1 of Game of Thrones, for example: there are a couple of sex scenes with Dany and Khal Drogo, and IMHO they do a lot more to further the story than to show T&A. In the first one Dany is basically being raped, but as the season goes on you see her start to fall in love with Drogo and it becomes more like making love. It’s hard to get the same effect without sex scenes.
              The same goes any time you have two people in bed: crappy, unrealistic TV sex where the girl never takes her shirt off, and then a cut to half a second later where they’re both wrapped tightly but conveniently in sheets, can break suspension of disbelief.
              So I can sympathize with an actor who agrees to artistic nude scenes or sex scenes because they’re important to the plot, but then has that specific 20 seconds of video taken out of context and circulated on porn sites.

              At the same time, an actor doesn’t get to dictate how the audience experiences the film. Just as you say about ‘The Piano’, it depends on how you watch it. It’s not illegal to buy the film, fast-forward to the nude scenes, and stop watching when they’re done. So to think you get any sort of control over that is hypocritical; it’s like ordering a reader to read the entire book and not share passages with a friend.

              • stravanasu@lemmy.ca · ↑2 · 2 months ago

                Personally I disagree on the value of sex/nude scenes – but it’s a subjective matter, of course. Your final argument is absolutely fair and logical, and very general too. Extremely well put – I subscribe 110% to it!

    • XeroxCool@lemmy.world · ↑25 · 3 months ago

      Given the country in question, the actress’ nationality, and the writer’s name, this may be a simple translation misalignment. “Molest” doesn’t carry the same weight in every language/region; it is more akin to “bother” in Spanish, for example.

      • BigDanishGuy@sh.itjust.works · ↑6 · edited · 3 months ago

        I’m pretty sure she means molested as in abused or misused.

        On a side note, as to the molesting aspect, I can’t see how this is any different from 14-year-old me having a couple of VHS tapes where the boob parts were more worn than the rest of the tape. Or going frame by frame on the Taxi 2 DVD in the scene where Petra does kung fu in a skirt. Or The Golden Child, where the mythical woman behind the curtain is revealed. Or the shower/locker room scene in RoboCop. Or the bath scene in Coming to America. Or the Playboy Mansion in Beverly Hills Cop 2… Anyway, I digress.

        If you appear naked in a movie, expect teenage boys and creepy men to jerk off to you. Not how a perfect world would be, I know. In a perfect world everyone would respect that the actress was only nude as an artistic choice and that it wasn’t meant to be spank bank material.

        Copyright, on the other hand -- I thought that was just a civil law matter. I wonder why the guy was arrested.

        Edit: Oh shit, good thing that I don’t sail the seven seas, and definitely never have. Two years ago a Danish guy got 30 days’ probation for doing less than some of my friends did in high school and university.

        • BigDanishGuy@sh.itjust.works · ↑3 · 3 months ago

          I’ve got another addition to my comment, but it does warrant more attention than an edit.

          So I’ve found a Danish-language interview on this topic from earlier this year, i.e. before the man in this story was arrested.

          From https://soundvenue.com/film/2023/05/nej-danske-skuespillere-er-ikke-selv-uden-om-det-naar-deres-sexscener-bliver-delt-paa-reddit-522555

          Generationsforskelle til trods lægger 57-årige Andrea Vagn Jensen og 23-årige Malaika Berenth Mosendane vægt på, at samtykket ryger, når klippene bliver taget ud af kontekst. Det samme fortæller Angela Bundalovic, der for nylig var aktuel i ‘Copenhagen Cowboy’, til DR:

          »Jeg oplever det som en krænkelse, at nogen bruger et klip eller billede af mig i en kontekst, jeg ikke er gået med til. For mig hersker der ingen tvivl om at der bliver begået noget forkert mod skuespillernes frihed og materialets ophavsretshavere«.

          Translated by yours truly:

          Generational differences aside, 57-year-old Andrea Vagn Jensen and 23-year-old Malaika Berenth Mosendane emphasize that consent is lost when the clips are taken out of context. The same goes for Angela Bundalovic, who recently appeared in ‘Copenhagen Cowboy’ and tells DR [Danish national radio and TV]:

          »I experience it as a violation when someone uses a clip or picture of me in a context that I haven’t agreed to. In my mind there’s no doubt that a wrong is being committed against the actors’ freedom as well as against the copyright holders of the material.«

          The police even have a statement on the matter, which I suggest you run through your preferred translation service: https://politi.dk/national-enhed-for-saerlig-kriminalitet/nyhedsliste/39-aarig-anholdt-for-at-dele-noegenscener-fra-film/2024/09/03

      • ABCDE@lemmy.world · ↑1 · 3 months ago

        This isn’t Spanish (I speak it); according to the dictionary, it’s also the same word in Danish.

    • interdimensionalmeme@lemmy.ml · ↑3 ↓1 · 3 months ago

      It is copyright infringement. As a copyright abolitionist, I don’t care.

      Also, the outrage here rests on the belief that it is wrong to show your naked body to a camera and then publish it.

      And now my warning to people who believe the naked body is shameful and that those who have shown their body have done something wrong: you are a monster; leave my planet before next month, or else …

    • conciselyverbose@sh.itjust.works · ↑74 · 3 months ago

      Yeah, an arrest for something that would be generally understood to be fair use is a lot.

      I can see the case for “that’s not fair use”. I’m not necessarily convinced either way. But an arrest?

      • protist@mander.xyz · ↑35 · 3 months ago

        Looks like this is happening in Denmark, which has different laws than the US’s “fair use.”

        • conciselyverbose@sh.itjust.works · ↑21 ↓4 · 3 months ago

          I get that, but the general understanding of fair use is relatively homogeneous. I’m not saying they shouldn’t be able to take it down, just that an arrest for it when most people’s first guess would be that it’s legal seems harsh.

          • NeoNachtwaechter@lemmy.world · ↑7 · edited · 3 months ago

            I get that, but the general understanding of fair use is relatively homogeneous.

            No, not at all. Only Anglo-American culture has that term. Greek/Roman-influenced cultures think quite differently about copyright topics in general. African and Asian ones, I don’t know.

          • General_Effort@lemmy.world · ↑6 ↓3 · 3 months ago

            Not at all. The US preserves something of the Enlightenment tradition of freely sharing information, which is vital for the advancement of science, technology, and culture. Free speech and a free press mean that you can say and print what you like (press at the time literally meant the printing press, not the media). Limitations in the form of copyrights or patents are only allowed where they help those goals.

            Continental European copyright preserves a monarchical, aristocratic tradition. It’s rooted in ideas of personal privilege and honor. For example, it’s illegal to deface an artwork even when you own it because it’s an attack on the honor of the artist. The term “royalties” comes from the fact that it was a privilege granted by royalty.

            It’s revealing that Europe has basically the same patent system as the US. You can’t do without technology, even if you are an authoritarian ruler. What would your armies do? But copyright is just about culture, usually. You don’t want that to be a needless source of instability. You want a clique of cronies to be in charge of that. That’s what you see in Europe.

        • Gork@lemm.ee · ↑8 · 3 months ago

          Has Denmark ever arrested anyone before for copyright infringement? In most other places this is a civil, not a criminal thing.

    • BrianTheeBiscuiteer@lemmy.world · ↑19 ↓1 · 3 months ago

      A porn company could make the same out-of-context argument just to slap an extra charge onto the arrest. Those 2 minutes of badly acted exposition were a critical reason why she’s taking cocks in 3 holes. It’s really a lovely scene you turned into pure filth!

    • 101@reddthat.com (OP) · ↑20 ↓9 · edited · 3 months ago

      I am very shocked by the whole thing. This is a very bad precedent.

      The man literally did something that the website itself allows by default (by allowing the subreddit). The one at fault here is Reddit, not the man himself. They are the ones who should deal with the consequences.

      • wizardbeard@lemmy.dbzer0.com · ↑10 ↓3 · 3 months ago

        Ehhhhh… that’s a pretty shit take. Any site that allows uploads of files “allows” uploads of CSAM? See how that breaks down immediately?

  • argh_another_username@lemmy.ca · ↑54 · 3 months ago

    So, is r/celebnsfw closing? Because that’s exactly what it is. The guy was probably the only human in a sea of bots posting there.

    • burgersc12@mander.xyz · ↑10 · 3 months ago

      It’s because he’s from Denmark and made his own version of r/watchitfortheplot. Apparently that’s illegal in Denmark. But it’s already online; it just takes a quick search to find the scenes in question.

  • conciselyverbose@sh.itjust.works · ↑60 ↓11 · 3 months ago

    Just FYI, these weren’t clips of porn. They were clips from actual movies with nude scenes.

    Still not entirely sure how I feel about it, but I do agree it’s not the same thing.

    • DarkThoughts@fedia.io · ↑58 ↓1 · 3 months ago

      Assuming he did not upload the whole movie or demand money for those scenes, I don’t see how that’s a good case for the copyright holder. Movie snippets are used all the time, everywhere, including on YouTube, without this being much of an issue. The most glaring issue there would be the auto-detection, which, again, is more to prevent actual piracy from being shared.

      Edit: Also, why is he getting arrested instead of getting a letter?!

      • conciselyverbose@sh.itjust.works · ↑8 ↓25 · 3 months ago

        Part of the (US) definition of fair use is the impact of the use on the original party. Killing their viewership with a review is still fair use because it’s balanced with the public’s right to a review, but I think there’s a legitimate argument that turning their movie into nothing but a sex object, especially systematically like that, does harm that’s not protected.

    • jacksilver@lemmy.world · ↑23 ↓1 · 3 months ago

      Why do you not know how to feel about it?

      To me it’s just a clip collection; you could have a collection of all death scenes or car crashes. They’re all just clips from videos people agreed to make for public consumption.

      • conciselyverbose@sh.itjust.works · ↑6 ↓18 · edited · 3 months ago

        Because in basically any other scenario it’s obvious sexual harassment, and behavior like that is a big part of the reason a lot of actresses aren’t comfortable doing a role where they’re nude to begin with.

        It’s not porn stars who signed up for that kind of content being made into a highlight reel. It’s an actress who agreed to do a specific scene as part of a movie being treated as if she were doing porn.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · ↑2 ↓2 · 3 months ago

      Just FYI, this wasn’t clips of porn. This was actual movies with nude scenes.

      Why does that matter? To a horny enough 14-year-old there is no difference.

  • General_Effort@lemmy.world · ↑34 ↓4 · 3 months ago

    Rights Alliance also called on Reddit to take the matter seriously. While many of the problematic clips were already removed at that point, the group urged Reddit to implement upload filters to prevent future trouble.

    Nothing sinister here, folks. Just defending helpless women against those evil techbros.

    • Makhno@lemmy.world · ↑8 ↓2 · 3 months ago

      Nothing sinister here, folks. Just defending helpless women against those evil techbros.

      These are paid actresses. Nothing helpless about it. Clips of film and TV are uploaded all the time. How is this any different?

    • Asafum@feddit.nl · ↑9 ↓2 · 3 months ago

      Money. You aren’t paying for the video that the scene is from; you’re just watching the scene.

      It’s almost always about money… Absolutely absurd