• Stolen_Stolen_Valor [any]@hexbear.net
    4 days ago

    The “AI” is effectively just autocomplete powered by the internet. It could probably be powered by your 2001 flip phone. The whole thing is smoke and mirrors, hype, and snake oil bought by people who don’t understand what’s happening or people only concerned with line go up.

    • It could probably be powered by your 2001 flip phone

      LLMs are, at bottom, logistic-regression-style models with billions of parameters, and they require massive context windows and training sets. It is hard to build a more computationally expensive system than an LLM for that reason. I have a fairly nice new laptop, and it can barely run deepseek-r1:14b (the 14-billion-parameter model, which is not technically the same model as deepseek-r1:671b; it’s a fine-tune of qwen-2.5:14b that uses the deepseek chain-of-thought reasoning). It runs the 7b model fine, however. There isn’t a single piece of consumer-grade hardware capable of running the full 671b model.
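
      For a rough sense of scale, here’s a minimal back-of-envelope sketch (Python; the bytes-per-parameter figures and the “weights only” assumption are simplifications, and real usage is higher once you add the KV cache and runtime overhead) of how much memory it takes just to hold the weights of these models at different quantization levels:

      ```python
      # Back-of-envelope memory needed just to hold model weights.
      # Ignores KV cache, activations, and runtime overhead, so real
      # requirements are higher than these numbers.
      BYTES_PER_PARAM = {
          "fp16": 2.0,  # 16-bit weights
          "q8": 1.0,    # 8-bit quantization
          "q4": 0.5,    # 4-bit quantization, common for local inference
      }

      def weight_memory_gb(n_params: float, precision: str) -> float:
          """Approximate gigabytes required to load the weights alone."""
          return n_params * BYTES_PER_PARAM[precision] / 1e9

      for name, params in [("7b", 7e9), ("14b", 14e9), ("671b", 671e9)]:
          row = ", ".join(
              f"{p}: ~{weight_memory_gb(params, p):,.0f} GB"
              for p in ("fp16", "q8", "q4")
          )
          print(f"{name:>5} -> {row}")
      ```

      Even quantized down to 4 bits, the full 671b model’s weights alone come out to roughly 300+ GB, far beyond any consumer GPU or laptop, while the 7b and 14b figures land in the range an ordinary machine can (just barely) handle.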