• skozzii@lemmy.ca (+59/-1) · 2 months ago

    It should be criminal that so many laptops they've sold over the past few years have 8GB of RAM.

    All of those people will be looking to upgrade within a year, which is the plan…

    • ArchRecord@lemm.ee (+25) · 2 months ago

      Apple just can’t resist taking ridiculous margins off their customers.

      For instance, on a Mac Pro you have to pay an extra $800 to go from 64GB to 128GB of memory. For that same $800, you could get about 384GB of RAM in 64GB sticks from a different vendor.
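
      Rough per-gigabyte math, as a quick Python sketch (the $800 figure is Apple's upgrade price from above; the ~$133-per-64GB-stick price is my own ballpark assumption):

      ```python
      # Apple's upgrade: $800 buys 64GB more (64GB -> 128GB)
      apple_upgrade_cost = 800
      apple_extra_gb = 128 - 64
      apple_per_gb = apple_upgrade_cost / apple_extra_gb        # $12.50/GB

      # Same $800 spent on third-party 64GB sticks (assumed ~$133/stick)
      stick_price, stick_size_gb = 133, 64
      sticks = apple_upgrade_cost // stick_price                # 6 sticks
      third_party_gb = sticks * stick_size_gb                   # 384GB
      third_party_per_gb = apple_upgrade_cost / third_party_gb  # ~$2.08/GB

      print(f"Apple: ${apple_per_gb:.2f}/GB vs third party: ~${third_party_per_gb:.2f}/GB")
      ```

      That works out to roughly a 6x markup per gigabyte.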

    • DJDarren@thelemmy.club (+4/-1) · 2 months ago

      I know it’s an unpopular opinion in these parts, but honestly, 8GB is often fine for the people who are buying entry-level machines.

      I use a 2014 Mac mini for work, which has 8GB of soldered RAM. For sure I’d bump it to 16GB if I could, but I honestly can’t say I have any issues with it. It’s running Sonoma via OCLP like a champ.

      But yeah, what Apple charges for RAM is downright criminal.

    • zante@lemmy.wtf (+2/-19) · edited · 2 months ago

      What’s the difference between you shilling RAM today and them shilling upgrades tomorrow?

      • SendMePhotos@lemmy.world (+6) · 2 months ago

        Today’s software is more demanding than the hardware it ships on; you have to increase the hardware to keep a good user experience.

        The upgrades they’re shilling are neither hardware nor software upgrades. They’re just bolting on Large Language Models (LLMs), which only generate output from statistical patterns in language and sentence structure. This is why many of the AI chatbots out there are extremely unreliable.
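
        To make that concrete, here’s a toy sketch of the core mechanism (plain Python, nothing to do with Apple’s actual models): the model just picks a statistically likely next word, with no notion of whether the result is true:

        ```python
        import random

        # Toy "language model": bigram counts learned from a tiny corpus.
        # Real LLMs are vastly bigger, but the principle is the same:
        # sample a statistically likely continuation, truth not required.
        corpus = "the mac has ram . the mac has storage . the mac is fast".split()
        bigrams = {}
        for a, b in zip(corpus, corpus[1:]):
            bigrams.setdefault(a, []).append(b)

        word, out = "the", ["the"]
        for _ in range(6):
            word = random.choice(bigrams.get(word, ["."]))  # next word by frequency
            out.append(word)
        print(" ".join(out))  # e.g. "the mac has ram . the mac"
        ```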

        The integration of AI into user devices is, simply, a waste of storage, hardware, and processing power, and it bogs down a system that’s already running at capacity because of those hardware limitations (i.e., not increasing the standard RAM amount).

        • felsiq@lemmy.zip (+4) · 2 months ago

          8GB of RAM is definitely not ideal for the computer lifetimes people expect from Macs, even for a very basic user, but it’s not unforgivably low, except for the fact that modern Macs use an SoC design and can’t upgrade the “RAM” (I know it’s not conventional RAM; still gonna call it that). That aside, assuming “they” is still Apple, the majority of what you said is… not correct. I’m gonna try to reply to each point without being a dick, but I’m sorry in advance in case it comes off that way or if it goes way too long. In order:

          The average Mac owner’s use cases are nowhere near too demanding for the hardware - ever since they stopped trying to cram inefficient Intel cores into a tiny chassis with the world’s shittiest cooling (2020), Macs have been significantly more powerful than the average user needs in the short term. Someone who’s only trying to run some Safari/Firefox tabs, iMessage, a music client, and maybe a document or spreadsheet editor at most isn’t gonna be held back by the hardware of today at all - shit, 2020’s original base M1 MacBook Air with no fans would still be chugging along just fine today with that workload. On the off chance a user like that does max out their RAM (Chrome with a million tabs, or if they’ve got a lot open and try the new Apple Intelligence stuff), modern SSDs are fast enough that bumping a program into swap space doesn’t make the UI take a year like on HDDs. Should there still be more than 8GB of RAM on a computer (theoretically) designed to last 8+ years? Ideally, yes, but it’s really not the dealbreaker (again, for the average use case) that people make it out to be - it’s not gonna suddenly turn a new Mac into a steaming pile of shit on year 3 or something.
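
          (If you wanna see whether swap is actually being hit on your own Mac, macOS exposes it via sysctl; a quick Python wrapper, just as a sketch:)

          ```python
          import subprocess

          # macOS reports swap usage through sysctl; typical output looks like:
          # "vm.swapusage: total = 2048.00M  used = 1024.50M  free = 1023.50M  (encrypted)"
          out = subprocess.run(
              ["sysctl", "vm.swapusage"], capture_output=True, text=True, check=True
          ).stdout.strip()
          print(out)  # if "used" stays near zero, 8GB is coping fine with your workload
          ```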

          About upgrades, I’m not really sure how to address this - what upgrades are just adding LLMs? Whether you’re talking computers or phones, I can’t remember an upgrade cycle for either in ages that hasn’t brought double-digit performance increases. Software-wise, none of the upcoming updates are “just” AI stuff - iOS 18 adds a bunch of cool shit, and while I don’t follow or care about Mac software, I’m sure a lot of that made its way over there too. This part is a little pedantic (please don’t take it as me being an asshole lol, zero hostility I promise), but I also wanna note it’s not just LLMs - they’ve got multimodal models for images and video too.

          Your last point is subjective, so I won’t try to claim your opinion (other than the bit about modern hardware running at capacity) is wrong. I do wanna offer a counterpoint tho, because while I agree that AI is overhyped and a lot of what companies are bragging about is mostly fluff (fucking genmoji???), there are some tangible ways it’s gonna improve the user experience. A more flexible Siri is probably gonna be the most-used one, since needing to be perfectly explicit and clear about what you want Siri to do is probably its biggest problem rn. An LLM backend will let it look past a badly phrased request or a stutter to the actual meaning of what you were trying to say, which is gonna make telling someone you’re about to get there while driving so much less painful (toy sketch of the idea below).
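
          Toy contrast of the idea (this uses plain string similarity as a stand-in for an LLM, purely to show why exact phrasing matters today):

          ```python
          import difflib

          # Rigid matcher (old-Siri style): only an exact phrase works.
          COMMANDS = {"send a message that i'm almost there": "intent: send_eta_message"}

          def rigid(utterance: str) -> str:
              return COMMANDS.get(utterance.lower(), "Sorry, I didn't get that.")

          def fuzzy(utterance: str) -> str:
              # Accept the closest known command if it's similar enough.
              match = difflib.get_close_matches(utterance.lower(), list(COMMANDS), n=1, cutoff=0.5)
              return COMMANDS[match[0]] if match else "Sorry, I didn't get that."

          stutter = "uh send a message that im almost- almost there"
          print(rigid(stutter))  # Sorry, I didn't get that.
          print(fuzzy(stutter))  # intent: send_eta_message
          ```
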
          The one I’m personally most excited about is one of the multimodal AI capabilities - fuck the image generation/editing (fun but overhyped imo), semantic media searches (searching photos for “mom and dad in front of that one waterfall”) are such a game changer, and the idea that I can have that without sending my photos and contacts to some external server is so wild to me.
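
          (For the curious, the trick behind those semantic searches is embeddings: an on-device model maps each photo and the text query into the same vector space, then it’s just nearest-vector ranking. A toy sketch with made-up numbers, nothing like Apple’s actual models:)

          ```python
          import numpy as np

          # Made-up embeddings; in reality a CLIP-style model produces these vectors.
          photos = {
              "IMG_001 (beach)":     np.array([0.9, 0.1, 0.0]),
              "IMG_002 (waterfall)": np.array([0.1, 0.9, 0.2]),
              "IMG_003 (birthday)":  np.array([0.0, 0.2, 0.9]),
          }
          query = np.array([0.2, 0.95, 0.1])  # "mom and dad in front of that one waterfall"

          def cosine(a, b):
              return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

          # Rank photos by similarity to the query; nothing leaves the device.
          for name, emb in sorted(photos.items(), key=lambda kv: cosine(query, kv[1]), reverse=True):
              print(f"{cosine(query, emb):.2f}  {name}")
          ```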

          Anyway, not trying to argue that 8GB of RAM is a good design choice or force you to like AI, but I’m pretty into CPU/GPU/SoC advances and couldn’t just let them be slandered.