• HowManyNimons@lemmy.world
    link
    fedilink
    arrow-up
    3
    ·
    6 months ago

    You’ve clearly never lived with a cat. Your metaphor is crushed by the Kitty Expansion Theory: No piece of furniture is large enough for a cat and any other additional being.

  • teft@lemmy.world

    Just like the human eye can only register 60fps and no more, your computer can only register 4GB of RAM and no more. Anything more than that is just marketing.

    Fucking /S since you clowns can’t tell.

    • MonkderDritte@feddit.de

      Joke’s on you, because I looked into this once. I no longer remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 13–14 ms (the color-sensitive cones are far slower). Psycho-optical effects can push that number up to around 100 fps on LCD displays. It also seems you can train yourself, through certain computer tasks, to follow movements with your eyes, which makes you far more sensitive to flickering.
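
Frame rate and refresh period are just reciprocals of each other, so the figures above can be sanity-checked in a couple of lines (the ~13 ms / ~70 fps numbers are the comment's rough estimates, not measured constants):

```python
# Convert between a refresh period in milliseconds and an
# equivalent frame rate. Purely arithmetic, no eye science here.
def period_ms_to_fps(period_ms: float) -> float:
    return 1000.0 / period_ms

def fps_to_period_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(period_ms_to_fps(13), 1))   # 13 ms per refresh ≈ 76.9 fps
print(round(fps_to_period_ms(70), 1))   # 70 fps ≈ 14.3 ms per frame
```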

      • iopq@lemmy.world

        It’s not about training; eye tracking is just that much more sensitive to pixels jumping.

        You can immediately see choppy movement when you look around in a first-person game. Or, if it’s an RTS, you can see the trail behind your mouse.

        I can still see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images at certain parts of the screen.

        Just give me a 480 Hz OLED with black frame insertion already, FFS.

        • MonkderDritte@feddit.de

          Well, I don’t follow movements with my eyes (I jump straight to the target), I see no difference between 30 and 60 FPS, and I run Ark Survival comfortably on my iGPU at 20 FPS. And I’m still pretty good in shooters.

          Yeah, it’s bad that our current tech stack doesn’t let us update only the parts of the image where something actually changes.

      • SorryQuick@lemmy.ca

        According to this study, the eye can perceive differences at up to 500 fps. While that was a specific test scenario, it’s one that could plausibly occur in a video game, so I guess we can go to around 500 Hz monitors before higher refresh rates become unnecessary.

    • TheRedSpade@lemmy.world

      This is only true if you’re still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
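
That “16 million TB” is just the size of a full 64-bit address space; a quick back-of-the-envelope check (note that current x86-64 CPUs actually expose fewer physical address bits, so real-world limits are lower):

```python
# Theoretical address-space limits, nothing hardware-specific.
ADDR_64 = 2 ** 64            # bytes addressable with 64 bits
ADDR_32 = 2 ** 32            # bytes addressable with 32 bits

print(ADDR_64 // 2 ** 40)    # 16777216 TiB, i.e. ~16 million TB
print(ADDR_32 // 2 ** 30)    # 4 GiB: the classic 32-bit ceiling
```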

  • BCsven@lemmy.ca

    My 2010 ARM board with 256MB of RAM runs openmediavault and minidlna for music streaming. Still lots of RAM left.

      • refalo@programming.dev

        Horrible take IMO. Firefox is using 12GB for me right now, but you have no idea how many or what kind of tabs either of us has, which makes all the difference, to the point that your comment has no value whatsoever.

  • umbraroze@lemmy.world

    About 10 years ago I was like “FINE, clearly 512MB of memory isn’t enough to avoid swapping hell, I’ll get 1 GB of extra memory.” …and that was that!

    These days I’m like “4 GB on a single board computer? Oh that’s fine. You may need that much to run a browser. And who’s going to run a browser regularly on a SBC? …oh I’ve done it a lot of times and it’s… fine.”

    The thing I learned is that you can run a whole bunch of SHIT HOT server software on a system with less than a gigabyte of memory. The moment you run a web browser? FUCK ALL THAT.

    And that’s basically what I found out long ago. I had a laptop that had like 32 megs of memory. Could be a perfectly productive person with that. Emacs. Darcs. SSH over a weird USB Wi-Fi dongle. But running a web browser? Can’t do Firefox. Opera kinda worked. Wouldn’t work nowadays, no. But Emacs probably still would.

  • xia@lemmy.sdf.org

    Just wait till all the browser tabs sit down and need to swap to the floor.

    • rickyrigatoni@lemm.ee

      I genuinely can’t imagine having more than 7 tabs open. I can barely keep track of that many. How do you do it, wizened minstrel of the woods?

      • Captain Janeway@lemmy.world

        For me it’s a pattern of “Ctrl+T” to open a new tab, then searching “my interesting query”. After that, I use “Ctrl+Tab” or “Ctrl+Shift+Tab” to navigate between tabs. Rinse and repeat until I get tired.

        I don’t like searching in my current tab because I don’t want to lose the info I have.

  • NutWrench@lemmy.world

    If that picture was of a Windows installation, Windows would be a Sumo Wrestler instead of a kitten.

  • TechNerdWizard42@lemmy.world

    Current 4 year old laptop with 128GB of ECC RAM is wonderful and is used all the time with simulations, LLMs, ML modelling, and the real heavy lifter, Google Chrome.

  • Kevin@lemmy.worldM

    I was running out of RAM on my 16GB system for years (just doing normal work tasks), so I finally upgraded to a new laptop with 64GB of RAM. Now I never run out of memory.

  • GenderNeutralBro@lemmy.sdf.org

    Much like a cat can stretch out and somehow occupy an entire queen-sized bed, Linux will happily cache your file system as long as there is available memory.

    • MonkderDritte@feddit.de

      Note for the “unused RAM is wasted RAM” people, in the description of earlyoom:

      Why is “available” memory checked as opposed to “free” memory? On a healthy Linux system, “free” memory is supposed to be close to zero, because Linux uses all available physical memory to cache disk access. These caches can be dropped any time the memory is needed for something else.

      So yeah, there’s a difference.
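
The free-vs-available distinction is visible directly in /proc/meminfo; a minimal sketch (Linux-only; MemFree and MemAvailable are standard procfs fields):

```python
# Read Linux memory stats and compare "free" with "available".
def read_meminfo(path="/proc/meminfo"):
    fields = {}
    with open(path) as f:
        for line in f:
            key, rest = line.split(":", 1)
            fields[key.strip()] = int(rest.split()[0])  # values in kB
    return fields

info = read_meminfo()
print("MemFree      (kB):", info["MemFree"])
print("MemAvailable (kB):", info["MemAvailable"])
# On a busy system MemAvailable is typically much larger than MemFree,
# because reclaimable page cache counts as available but not as free.
```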