Brought to you by the Department of Erasing History.

  • KairuByte@lemmy.dbzer0.com · 6 months ago

    That’s a ridiculous amount of effort to go through to slow down a scraper for one site, especially when that site could just be… turned off.

    • Saik0A · 6 months ago

      If you own the domain, you can disable the crawler on it and remove previous scrapes.
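
      (A minimal sketch rather than a definitive recipe: the Wayback Machine has historically honored site-owner exclusions declared in robots.txt under the ia_archiver user agent, hiding previously captured pages as well; whether it still honors this today is an assumption here, and the archive also accepts direct removal requests.)

          # Hypothetical robots.txt served from the domain root.
          # Targets only the Internet Archive's historical crawler
          # user agent; other crawlers are unaffected.
          User-agent: ia_archiver
          Disallow: /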

      • KairuByte@lemmy.dbzer0.com · 6 months ago

        Then it still makes no sense: if you can’t take down the content, you very likely can’t edit it either. I can’t think of a situation where you:

        1. Need content to not be scraped
        2. Need time to remove/edit that content
        3. Have access to do the above
        4. Don’t have access to pull the content immediately
        5. Have control of a large enough botnet to take down the Internet Archive
        6. Don’t have a big enough botnet to take down the aforementioned content
        • Saik0A · 6 months ago

          Well, that’s my point… It doesn’t make sense, because you can just make the takedown request after the fact.

          You have to be stupidly paranoid and obscenely stupid to believe that a DDoS is the correct answer if this is the case.

        • r3df0x ✡️✝☪️@7.62x54r.ru · 6 months ago

          On an individual level, having a massive archive of everything you’ve ever posted isn’t always a good thing, especially when mentally ill people will quote mine a single post and then try to misuse it.