• Hackworth@lemmy.world · 2 months ago

    My money’s on analog optical computing replacing GPUs as the hardware used for inference / generation. Analog computing in general is promising, and I look forward to seeing how the chips fall re: training.

  • Anivia · 2 months ago

    I can’t tell if this article is satire or not

  • Em Adespoton@lemmy.ca · 2 months ago

    Both the article and the pushback are kind of silly here — the dGPU’s heyday was over a decade ago, back when “serious gamers” had a custom built PC on their desk and upgraded their GPU every two years at a minimum.

    Back in 2008, gaming on a laptop started to become a possibility, and dGPUs were part of that story — but for the most part, good luck swapping out your GPU for a newer model; it generally wasn’t so easy to do on a laptop.

    THAT was the beginning of the end for dGPUs.

    By 2015, I had a laptop with both an iGPU and a dGPU. eGPUs were just appearing on the market as a way around the lack of upgradeability, but these were niche, and not required for most computing tasks, including gaming.

    At the same time, console hardware began to converge with desktop hardware, so gaming houses, who had for over 20 years driven the dGPU market, fell into a slower demand cadence that matched the console hardware. GPUs stagnated.

    And then came cryptomining, a totally new driver of GPUs. And it almost destroyed the market, gobbling up the hardware so that none was available for any other compute task.

    Computer designers responded by doubling down on the iGPU, making them good enough for almost all tasks you’d use a personal computer for.

    Then came AI. It too was a new driver for GPUs and, like crypto, it sucked some of the oxygen out of the PC market… which responded by adding integrated NPUs to handle ML tasks.

    So yeah; GPUs are now for the cloud services market and niche developers; everyone else can get their hands on a “good enough” SoC with enough CPU, GPU and NPU compute to do what they need, and the ability to offload to a remote server cluster for weightier jobs.

      • jacksilver@lemmy.world · 2 months ago

        Yeah, dGPUs have been for niche applications for decades. I didn’t read the article, but the parent comment is vastly overestimating iGPU capabilities.

      • Em Adespoton@lemmy.ca · 2 months ago

        Show me non-niche software that needs more than a modern iGPU can provide. Your 3080 Ti can blast two screens of 4K video at 120 fps HDR… and so can my iGPU.

          • yonder@sh.itjust.works · 2 months ago

            Literally Minecraft and Fortnite are much more enjoyable with a dGPU. That seems very not niche to me.

          • beliquititious@lemmy.blahaj.zone · 2 months ago

            Yeah, but an Xbox’s GPU can play games too. You don’t need a 3080 Ti to play any game on the market. Dedicated GPUs are almost entirely a luxury upgrade given the power of today’s iGPUs.

            • unmagical@lemmy.ml · 2 months ago

              My 3080 Ti significantly outperforms an Xbox. While you can game on a console, you can game better on a PC with a dGPU. An iGPU will get the job done, but a dGPU today continues to outperform it and give you a better experience. I can play across three 2K displays at 165 Hz or step up to native 4K, and I can smooth framerates with ray tracing on at a non-upscaled resolution.

              • beliquititious@lemmy.blahaj.zone · 2 months ago

                Cool? That level of performance is incredibly niche and not required to play any game. Chasing maximum performance is one way to play, but not even a majority of PC gamers have that kind of hardware, and no one needs it (for gaming, at least). The only area where you need power just to participate is VR, and standalone headsets run on phone hardware.

                The performance differences between an Xbox, a laptop with a good iGPU, and a $3k gaming rig don’t matter if your top priority is having fun playing a game rather than tinkering with specs and hardware.

    • AwkwardLookMonkeyPuppet@lemmy.world · 2 months ago

      over a decade ago, back when “serious gamers” had a custom built PC on their desk and upgraded their GPU every two years at a minimum.

      Hey man, I’m old, but I still have a custom-built gaming PC on my desk and a fairly recent graphics card (3070 Ti), although I’d say I only update my card maybe once every 3-5 years, depending on necessity.

  • pantherina · 2 months ago

    I bought an NVidia Titan X for LLMs, and I have the feeling it was an absolute mistake of a purchase.

    The power consumption must be absurd; modern chips are certainly vastly better.

    It was about €1000 cheaper, I think. But maybe I’m wrong about that.