• TheDoctor [they/them]@hexbear.net · 5 months ago

    I hope for everyone’s sake that he knows he’s lying. There will indeed be further optimizations in AI computation energy efficiency. There will eventually be ASICs for training models that use non-standard, reduced-precision floating-point formats optimized for LLM training, and those will be more energy efficient. But the idea that LLMs, or any near-future iteration on them, will be the catalyst for those optimizations is nonsense.
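
    As an illustration (bfloat16 is my example here, not something named above), one widely used non-standard format keeps float32’s 8 exponent bits but truncates the mantissa from 23 bits to 7, trading precision for cheaper arithmetic and memory traffic. A minimal Python sketch of that truncation:

        import struct

        def to_bfloat16_bits(x: float) -> int:
            # Truncate an IEEE-754 float32 to a bfloat16-style value:
            # keep the sign bit and all 8 exponent bits, drop the low 16 mantissa bits.
            bits32 = struct.unpack(">I", struct.pack(">f", x))[0]
            return bits32 >> 16

        def from_bfloat16_bits(bits16: int) -> float:
            # Re-expand the truncated 16 bits into a float32 for inspection.
            return struct.unpack(">f", struct.pack(">I", bits16 << 16))[0]

        print(from_bfloat16_bits(to_bfloat16_bits(3.14159265)))  # ~3.140625: same range, less precision

    Keeping the full exponent preserves dynamic range, which matters more for training gradients than mantissa precision does; that trade-off is what such formats bank on.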

      • TheDoctor [they/them]@hexbear.net · 5 months ago

        Even more delusional is the maths claim. There are entire, genuinely intensive branches of computer science dedicated to manipulating mathematical symbols and solving advanced maths, such as computer algebra and automated theorem proving. They don’t fall under the umbrella of machine learning, and no amount of GPU cores will change that.
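
        For instance (SymPy is my example here, not something the comment names), computer algebra systems do exact symbolic manipulation with classical algorithms, no machine learning or GPU cores involved:

            import sympy as sp

            # Classical symbolic computation: exact results, no training data, no GPUs.
            x = sp.symbols("x")

            # Evaluate the Gaussian integral exactly rather than numerically.
            print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi)

            # Solve a quadratic symbolically, returning exact roots.
            print(sp.solve(x**2 - 2*x - 3, x))  # [-1, 3]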

      • Krem [he/him]@hexbear.net · 5 months ago

        oh cool, he wants a Culture Mind to organize human society. too bad our “ai” is just machine learning with no actual intelligence.