• Diabolo96@lemmy.dbzer0.com
    10 months ago

    As consolation, at least an AI wouldn’t crack a joke with its buddies about a strike after bombing a whole neighborhood, a school, and a hospital.

    Notice: I didn’t read the article. My internet is slower than a snail going backwards up a ramp, so I couldn’t.

    • joelfromaus@aussie.zone
      10 months ago

      Turns out they used existing pilot data to train the AI, so it’ll still bomb a supermarket and make a “fire sale” joke.

      P.S. I also didn’t read the article.

    • Anony Moose@lemmy.ca
      10 months ago

      Reminds me of the very campy movie Stealth, where an Air Force AI plane/drone goes rogue. I hope they have lightning-strike protection on these things!

  • khalic@beehaw.org
    10 months ago

    I see the arms industry is the latest to ride the AI frenzy to sell a few more products.

  • Send_me_nude_girls@feddit.de
    10 months ago

    I’m pro-AI; I see a lot of potential in it. I actively use AI daily by now, be it the YT algorithm, AI image generation, or chatbots for a quick search. But fully automated AI with access to weapons should never be a thing. It’s like giving a toddler a sharp knife: you never know who gets stabbed. Sure, humans err too, but humans are also more patient and will stop if they aren’t sure. It’s better to have a pilot not bomb the house than to accidentally bomb a playground full of kids because the AI had a hiccup.