• NigelFrobisher@aussie.zone · 8 hours ago

    At a beach restaurant the other night I kept hearing a loud American voice cut across all conversation, going on and on about “AI” and how it would get into all human “workflows” (new buzzword?). His confidence and loudness were matched only by his obvious lack of understanding of how LLMs actually work.

      • ameancow@lemmy.world · edited · 31 seconds ago

        I would also add “hopeful delusionals” and “unhinged cultists” to that list of labels.

        Seriously, we have people right now making plans for what they’re going to do with their lives once Artificial Super Intelligence emerges and turns everything into some kind of post-scarcity, Star Trek world where literally everyone is wealthy and nobody has to work. They think this is only a few years away. It’s not a tiny number of people either, and they exist on a broad spectrum.

        Our species is so desperate for help from beyond, a savior that will change the current status quo. We’ve been making fantasies and stories to indulge this desire for millennia, and this is just the latest incarnation.

        No company on Earth is going to develop any kind of machine or tool that will destabilize the economic markets of our capitalist world. A LOT has to change before anyone will even dream of upending centuries of wealth-building.

      • AItoothbrush@lemmy.zip · 2 hours ago

        AI itself too, I guess. Also, I have to point this out every time, but my username was chosen way before all this shit blew up in our faces. I’ve used this one on every platform for years.

    • Chaotic Entropy@feddit.uk · 6 hours ago

      Some people can only hear “AI means I can pay people less/get rid of them entirely” and stop listening.

      • anon_8675309@lemmy.world · 4 hours ago

        AI means C-level jobs should be on the block as well. The board can make decisions based on its output.

        • Knock_Knock_Lemmy_In@lemmy.world · 2 hours ago

          The whole ex-McKinsey management layer is at risk. Whole teams of people who were dedicated to producing pretty slides with “action titles” for managers higher up the chain to consume and regurgitate are now having their lunch eaten by AI.

    • Blackmist@feddit.uk · 6 hours ago

      I’ve noticed that the people most vocal about wanting to use AI get very coy when you ask them what it should actually do.

    • Zement@feddit.nl · edited · 7 hours ago

      I really like the idea of an LLM being narrowly configured to filter and summarize data that comes in an irregular/organic form.

      You would have to run it multiple times in parallel with different models and slightly different configurations to reduce hallucinations (similar to sensor redundancy in industrial safety levels), but still… that alone is a game changer in “parsing the real world”. The catch is that the energy needed to do this right (>= 3x) gets cut by stripping out the safety and redundancy, because the hallucinations only become apparent somewhere down the line, and only sometimes.
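      A rough sketch of what that “run it >= 3x and compare” redundancy could look like in Python; `call_model`, the model names, and the agreement threshold are all placeholders for whatever LLM client and tuning you’d actually use, not any particular vendor’s API:

      ```python
      # Sketch of the redundancy idea above: send the same summarization prompt
      # to several models/configs in parallel and only pass the result downstream
      # when the answers broadly agree. call_model, the model ids and the 0.8
      # threshold are hypothetical placeholders.
      from concurrent.futures import ThreadPoolExecutor
      from difflib import SequenceMatcher
      from itertools import combinations

      CONFIGS = [
          {"model": "model-a", "temperature": 0.0},  # hypothetical model ids
          {"model": "model-b", "temperature": 0.2},
          {"model": "model-c", "temperature": 0.0},
      ]

      def call_model(config: dict, prompt: str) -> str:
          """Placeholder: call whatever LLM backend you use and return its text."""
          raise NotImplementedError("wire up your own client here")

      def redundant_summary(text: str, min_agreement: float = 0.8) -> str | None:
          prompt = f"Summarize the following input:\n\n{text}"
          with ThreadPoolExecutor() as pool:
              outputs = list(pool.map(lambda cfg: call_model(cfg, prompt), CONFIGS))

          # Crude cross-check: if any pair of summaries diverges too much, treat
          # the batch as a possible hallucination and hand it to a human instead.
          for a, b in combinations(outputs, 2):
              if SequenceMatcher(None, a, b).ratio() < min_agreement:
                  return None  # disagreement -> don't trust the result

          return outputs[0]
      ```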

      They poison their own well because they jump directly to the enshittification stage.

      So when people talk about embedding it into workflows… hi… here I am! =D

      • AA5B@lemmy.world · 3 hours ago

        A buddy of mine has been doing this for months. As a manager, his first use case was summarizing his team members’ statuses into a single team status. Arguably, hallucinations aren’t critical there.