Damn, the things used to be these thin little, well, cards. Nowadays they are reaching the size of entire consoles and can more accurately be called graphics bricks. Is the tech so stagnant that they won’t be getting smaller again in the future?

The high end ones are so huge, power hungry, and fucking expensive that I’m starting to think they might as well just come with an integrated CPU and system RAM (in addition to the VRAM) on the same board.

What is the general industry expectation of what GPUs are going to be like in the mid-term future, maybe 20 to 30 years from now? I expect that if AI continues to grow in scope and ubiquity, an unprecedented amount of effort and funding is going to be thrown at R&D for these PC components that were once primarily relegated to being toys for gamers.

  • unperson [he/him]@hexbear.net
    9 months ago

Yes, it was the cheapest graphics card that could decode 1080p H.264 video in real time (and the acceleration worked in the Flash player). The 8500 GT could also do it, but it was never popular. It made a huge difference when YouTube became a thing.

    • zed_proclaimer [he/him]@hexbear.net
      9 months ago

      According to cosecantphi below, who opened up a cheap low-end office computer from 2008, it had integrated graphics, not a dedicated graphics card.

      I find it extremely hard to believe that schools and libraries and whatnot were building PC towers with dedicated graphics cards like the 9500 GT — wait, no em-dash — that was an exclusively gamer/performance-nerd thing to do.

      • unperson [he/him]@hexbear.net
        9 months ago

        Of course you had to have something to drive the VGA outputs. Usually this meant a VIA, SiS, or Unichrome chip on the motherboard. Those chips often had no 3D acceleration at all, and a max resolution of 1280x1024. You were lucky to have shaders instead of fixed-function pipelines in 2008-era integrated graphics, and hardware-accelerated video decoding was unheard of. The best integrated GPUs were collaborations with nVidia that basically bundled a GPU with the mainboard, but those mainboards were expensive.

        Windows Vista did not run well at all on these integrated chips, but nobody liked Windows Vista so it didn't matter. After Windows 7 was released, Intel started bundling their "HD Graphics" on CPUs and the on-die integrated GPU trend got started. The card in the picture belongs to the interim period when the software demanded pixel shaders and high-resolution video but the hardware couldn't deliver.

        Those integrated chips left a lot of work for the CPU to do: if you try to browse hexbear on them, you can see the repainting going from top to bottom as you scroll. You can't play 720p video and do anything else with the computer at the same time, because the CPU is pegged. But if you put the 9500 GT in them, suddenly you can use the computer as an HTPC. It was not an expensive card, just 60-80 USD, and it was a logical upgrade to a tower PC you already had: it made the machine more responsive and let it play HD video.