The extra space is for two Electron apps of your choice.
Let’s start with one and see how it goes.
discord and microsoft teams 😍
You picked two of the crappiest apps ever.
That’s the point
Teams in browser is okay
You’ve clearly never lived with a cat. Your metaphor is crushed by the Kitty Expansion Theory: no piece of furniture is large enough for a cat plus any additional being.
Caching be like
Caching do indeed be like.
The Kitty Expansion Theory is incomplete: any piece of furniture is large enough for both a cat and an additional being, provided the additional being was there first.
Just install Chrome or Firefox. Problem solved.
weak. compile them
Yup I max out 32GB building librewolf from source
compile in tmpfs
I compile them in swap and swap is of course Google Drive
and a vm or 2
Just like the human eye can only register 60fps and no more, your computer can only register 4gb of RAM and no more. Anything more than that is just marketing.
Fucking /S since you clowns can’t tell.
Human eye can’t see more than 1080p anyway, so what’s the point
It doesn’t matter honestly, everyone knows humans can’t see screens at all
It honestly doesn’t matter, reality only exists in your imagination anyway.
Joke’s on you, because I looked into this once. I no longer remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 13 ms (the color-sensitive cones are far slower). Psycho-optical effects can drive that number up to around 100 fps on LCD displays. It also looks like certain computer tasks can train you to follow movements with your eyes, making you far more sensitive to flickering.
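(Quick sanity check on that figure, assuming the ~13 ms refresh is right: 1 s ÷ 0.013 s ≈ 77 refreshes per second, which is where the “about 70 fps” lands.)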
Does that refresh take place across the entire eye simultaneously or is each rod and/or cone doing its own thing?
There’s a neuron layer trimming data down to squeeze it through the optical nerve, so… no clue.
Are your eyeballs progressive scan or interlaced, son?
It’s not about training; eye tracking is just that much more sensitive to pixels jumping.
You can immediately see choppy movement when you look around in a 1st person view game. Or if it’s an RTS you can see the trail behind your mouse anyway
I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images at certain parts of the screen
Just give me a 480 FPS OLED with black frame insertion already, FFS
Well, I don’t follow movements with my eyes (I jump to the target), see no difference between 30 and 60 FPS, and run Ark Survival comfortably on my iGPU at 20 FPS. And I’m still pretty good at shooters.
Yeah, it’s a shame our current tech stack doesn’t allow just updating the image where the change actually happens.
According to this study, the eye can see a difference as high as 500 fps. While that’s a specific scenario, it’s one that could plausibly happen in a video game, so I guess we can go to around 500 Hz monitors before it becomes overkill or unnecessary.
This is only true if you’re still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
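(Checking the arithmetic: a full 64-bit address space is 2^64 bytes = 2^24 TiB ≈ 16.8 million TiB, hence the “16 million TB” figure. Real chips implement fewer physical address bits than that, but it’s still far beyond any installed RAM.)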
Sorry I forgot to put my giant /s.
My 2010 ARM board with 256 MB of RAM runs openmediavault and minidlna for music streaming. Still lots of RAM left.
The other 28GB is for running chrome
One of the reasons I use Firefox.
Horrible take IMO. Firefox is using 12 GB for me right now, but you have no idea how many or what kind of tabs either of us has open, which makes all the difference, to the point that your comment has no value whatsoever.
4GB of RAM: load a model into llama.cpp
Explodes
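For anyone wondering why it explodes: the weights alone cost roughly parameter count × bytes per weight, before you add the KV cache and runtime overhead. A back-of-envelope sketch (the quantization factors here are approximations, not exact llama.cpp numbers):

```python
# Rough sketch: estimate RAM needed just for a model's weights.
# Quantization factors are approximate; real GGUF files add
# metadata, KV cache, and runtime overhead on top.

BYTES_PER_WEIGHT = {
    "f16": 2.0,   # 16-bit floats
    "q8_0": 1.0,  # ~8 bits per weight
    "q4_0": 0.5,  # ~4 bits per weight
}

def weight_ram_gib(params_billions: float, quant: str) -> float:
    """Approximate GiB of RAM for the weights alone."""
    total_bytes = params_billions * 1e9 * BYTES_PER_WEIGHT[quant]
    return total_bytes / 1024**3

for quant in ("f16", "q8_0", "q4_0"):
    print(f"7B model at {quant}: ~{weight_ram_gib(7, quant):.1f} GiB")
# f16 ≈ 13.0, q8_0 ≈ 6.5, q4_0 ≈ 3.3 -- even a 4-bit 7B model
# barely fits in 4 GB before any overhead. Hence: explodes.
```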
Apple be like: our 4gb is like 16gb from others
That’s right. Price-wise.
About 10 years ago I was like “FINE, clearly 512MB of memory isn’t enough to avoid swapping hell, I’ll get 1 GB of extra memory.” …and that was that!
These days I’m like “4 GB on a single-board computer? Oh, that’s fine. You may need that much to run a browser. And who’s going to run a browser regularly on an SBC? …oh, I’ve done it a lot of times and it’s… fine.”
The thing I learned is that you can run a whole bunch of SHIT HOT server software on a system with less than a gigabyte of memory. The moment you run a web browser? FUCK ALL THAT.
And that’s basically what I found out long ago. I had a laptop that had like 32 megs of memory. Could be a perfectly productive person with that. Emacs. Darcs. SSH over a weird USB Wi-Fi dongle. But running a web browser? Can’t do Firefox. Opera kinda worked. Wouldn’t work nowadays, no. But Emacs probably still would.
Someone clearly doesn’t play Cities: Skylines with mods
Just wait till all the browser tabs sit down and need to swap to the floor.
I genuinely can’t imagine having more than 7 tabs open. I can barely keep track of that many. How do you do it, wizened minstrel of the woods?
For me it’s a pattern of “Ctrl+T” to open a new tab, and then I search “my interesting query”. After that, I use “Ctrl+Tab” or “Ctrl+Shift+Tab” to navigate between tabs. Rinse and repeat until I get tired.
I don’t like searching in my current tab because I don’t want to lose the info I have.
If that picture was of a Windows installation, Windows would be a Sumo Wrestler instead of a kitten.
My current 4-year-old laptop with 128 GB of ECC RAM is wonderful and gets used all the time for simulations, LLMs, ML modelling, and the real heavy lifter: Google Chrome.
More is more.
I was running out of RAM on my 16GB system for years (just doing normal work tasks), so I finally upgraded to a new laptop with 64GB of RAM. Now I never run out of memory.
lol, you wish.
Much like a cat can stretch out and somehow occupy an entire queen-sized bed, Linux will happily cache your file system as long as there is available memory.
Note for the “unused RAM is wasted RAM” people, in the description of earlyoom:
Why is “available” memory checked as opposed to “free” memory? On a healthy Linux system, “free” memory is supposed to be close to zero, because Linux uses all available physical memory to cache disk access. These caches can be dropped any time the memory is needed for something else.
So yeah, there’s a difference.
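You can see that difference on your own machine by comparing the MemFree and MemAvailable fields the kernel exposes in /proc/meminfo; a quick sketch (Linux-only):

```python
# Compare "free" vs "available" RAM on Linux by parsing
# /proc/meminfo. MemFree is truly idle memory; MemAvailable
# also counts cache the kernel can reclaim on demand.

def meminfo_kib() -> dict:
    """Parse /proc/meminfo into {field: value in kB}."""
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            fields[key] = int(rest.split()[0])  # values are in kB
    return fields

info = meminfo_kib()
print(f"free:      {info['MemFree'] / 1024**2:.1f} GiB")
print(f"available: {info['MemAvailable'] / 1024**2:.1f} GiB")
# On a long-running box, "free" hovers near zero while
# "available" stays large: the gap is mostly page cache
# doing its cat-on-the-bed thing.
```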