• FooBarrington@lemmy.world · 9 months ago

    What are you trying to imply? That training Transformer models has to be a continuous process? You know it’s pretty easy to stop and continue training, right?

    I don’t know why people keep commenting in spaces they’ve never worked in.
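
    To be concrete, here’s a rough sketch of what “stop and continue” looks like with standard PyTorch checkpointing (the toy model, file name, and step counter are made up purely for illustration):

    ```python
    import torch
    from torch import nn, optim

    # Toy stand-in for a Transformer; a real training run just has more layers and data.
    model = nn.TransformerEncoderLayer(d_model=64, nhead=4)
    optimizer = optim.AdamW(model.parameters(), lr=1e-4)

    # Stop: persist everything needed to resume later.
    torch.save(
        {"model": model.state_dict(), "optimizer": optimizer.state_dict(), "step": 1000},
        "checkpoint.pt",
    )

    # Continue: restore the state and pick up training at the same step.
    ckpt = torch.load("checkpoint.pt")
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    step = ckpt["step"]
    ```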

    • guacupado@lemmy.world · 9 months ago

      No datacenter is shutting off a leg, hall, row, or rack because “We have enough data, guys.” Maybe in your university server room where CS majors are interning. These things run 24/7/365 with UU tracking specifically to keep them up.

      • FooBarrington@lemmy.world · 9 months ago

        What are you talking about? Who said anything close to “we have enough data, guys”?

        Are you ok? You came in with a snippy, completely wrong comment, and now you’re following it up with something entirely unrelated.