• archomrade [he/him]@midwest.social
      2 hours ago

      The energy expenditure for GPT models is basically a per-token calculation. Having one generate a list of 3-4 token responses would barely be a blip compared to having it read and respond to entire articles.
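
      To make the per-token point concrete, here's a minimal back-of-envelope sketch in Python. The per-token energy figure and the token counts are illustrative assumptions, not measurements for any particular model; the only point is that cost scales with the number of tokens handled.

      ```python
      # Back-of-envelope sketch of the per-token scaling argument.
      # ENERGY_PER_TOKEN_WH is a hypothetical placeholder, not a measured
      # value for any specific model.

      ENERGY_PER_TOKEN_WH = 0.001  # assumed watt-hours per token processed or generated

      def query_energy_wh(prompt_tokens: int, output_tokens: int) -> float:
          """Rough energy estimate: cost scales with total tokens handled."""
          return (prompt_tokens + output_tokens) * ENERGY_PER_TOKEN_WH

      # Generating a short list of 3-4 token items from a small prompt:
      short_list = query_energy_wh(prompt_tokens=50, output_tokens=40)

      # Reading and summarizing a full article (thousands of tokens in, hundreds out):
      full_article = query_energy_wh(prompt_tokens=4000, output_tokens=500)

      print(f"short list: {short_list:.3f} Wh, full article: {full_article:.3f} Wh")
      print(f"ratio: {full_article / short_list:.0f}x")
      ```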

      There might even be a case that certain tasks are more energy efficient with a GPT model than with multiple Google searches for the same result, especially once you count all the backend activity Google tacks on for tracking users and serving ads. Complaining about someone using a GPT model for something like generating a list of words is a little like a climate activist yelling at someone for driving their car to the grocery store while standing across the street from a coal-burning power plant.
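
      To put the search comparison in rough numbers, a sketch under loudly stated assumptions: the ~0.3 Wh-per-search and ~3 Wh-per-query figures below are widely circulated ballpark estimates, not measurements, so treat the output as illustrative only.

      ```python
      # Very rough comparison using commonly cited but uncertain figures.
      # Both constants are assumptions for illustration, not data.

      WH_PER_SEARCH = 0.3     # assumed energy per web search
      WH_PER_GPT_QUERY = 3.0  # assumed energy per chatbot query

      def searches_to_match(gpt_queries: int = 1) -> float:
          """How many searches add up to the same energy as N GPT queries."""
          return gpt_queries * WH_PER_GPT_QUERY / WH_PER_SEARCH

      # At these numbers, one GPT query costs about as much as ~10 searches,
      # so a task that would take many searches can come out roughly even.
      print(searches_to_match(1))
      ```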