The echoes of Y2K resonate in today’s AI landscape as executives flock to embrace the promise of cost reduction through outsourcing to language models.

However, history is poised to repeat itself, with a similar outcome of chaos and disillusionment: the misguided belief that language models can replace the human workforce will yield results that are as hilarious as they are unfortunate.

  • Barx [none/use name]@hexbear.net · 17 points · 5 months ago

    These models perform quite poorly when it comes to actually increasing employee productivity, so attempts to incorporate them into businesses fall into two camps:

    • The incompetent bazinga business tyrants who are chasing a false promise and either aren’t savvy enough to recognize when they’re actually hurting productivity, or don’t care because they see more value in marketing themselves as “using AI” than in actually doing better because of it.

    • The competent MBAs and other finance ghouls who know it sucks, but also know it’s useful for disciplining labor and changing the composition of their workforce to be more “replaceable”, i.e. proletarianized (even if they don’t know that word).

    We all know examples of the former. Business owners and middle managers are often incompetent, very full of themselves, and prone to making very bazinga decisions.

    But the latter is the trend. They’re who the big businesses are listening to, the big monopolies that really control production. Those monopolies don’t actually care that much if per-employee productivity goes down a little right now, so long as they can quickly make it up through a larger, cheaper labor pool and reduced turnover, hiring, and HR costs. But to get that pool you have to change the composition of your workforce: fire the more expensive people (e.g. a senior dev), use the savings to hire, say, a junior dev, and pretend that’s just as good.

    This will and does lead to catastrophe, of course, not just for the economy overall but even at the individual business level. The stuff they’re allowing these LLMs to do is creating huge liabilities, like “AIs” that make up company policies and refunds, or “AIs” that produce bad code that gets poorly reviewed and makes its way to production because of the “new process” (having fewer developers).

    • CantaloupeAss [comrade/them]@hexbear.net · 6 points · 5 months ago

      Unfortunately, so many companies are so totally useless and bazinga-ed to their very core that an autogenerating tool really can spit out drivel similar to whatever stupid shit they were going to pay someone to write/do.

  • GaveUp [she/her]@hexbear.net · 16 points · 5 months ago

    It’s fucking absurd: their last earnings report showed a 12 billion profit at a 50% profit margin…

    For comparison, other monopolies like Microsoft, Meta, Google, and Apple run only about 25-30% margins.

    All these GPUs for shitty genAI crap that does almost nothing but make people unemployed and homeless