My Tandy Sensation didn’t need more than 4 MB of RAM.
If your Linux is not using 99% of RAM, then it’s misconfigured.
Got it. Removing RAM modules now.
Thanks for giving me a genuine smile :)
Just like the human eye can only register 60 fps and no more, your computer can only register 4 GB of RAM and no more. Anything more than that is just marketing.
This is only true if you’re still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
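That 16-million-TB figure is just a 64-bit address space converted to TiB; a quick back-of-the-envelope sketch (real CPUs wire up fewer physical address bits, so the practical ceiling is lower, but still absurdly high):

```python
# How much memory a full 64-bit address space can cover.
addressable_bytes = 2**64
print(f"{addressable_bytes / 2**40:,.0f} TiB")  # 16,777,216 TiB, i.e. ~16 million TB
```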
Sorry I forgot to put my giant /s.
Joke’s on you, because I looked into this once. I don’t remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 14 ms per frame (the color-sensitive cones are far slower). But psycho-optical effects can drive that number up to 100 fps on LCD displays. And it looks like you can train yourself, through certain computer tasks, to follow movements with your eye, which makes you far more sensitive to flickering.
According to this study, the eye can see a difference at up to 500 fps. While that’s a specific scenario, it’s one that could plausibly happen in a video game, so I guess we can go to around 500 Hz monitors before it becomes unnecessary.
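For context, the frame time is just 1000 ms divided by the refresh rate; here’s a quick sketch covering the rates thrown around in this thread:

```python
# Frame time in milliseconds at the refresh rates discussed above.
for hz in (30, 60, 70, 100, 280, 500):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
```

Going from 60 Hz to 500 Hz shrinks each frame from ~16.7 ms to 2 ms, which is why the returns diminish so quickly.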
It’s not about training; eye tracking is just that much more sensitive to pixels jumping.
You can immediately see choppy movement when you look around in a first-person game. Or, if it’s an RTS, you can see the trail behind your mouse anyway.
I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images in certain parts of the screen.
Just give me a 480 FPS OLED with black frame insertion already, FFS
Well, I don’t follow movements with my eyes (I jump straight to the target), and I see no difference between 30 and 60 FPS; I run Ark Survival comfortably on my iGPU at 20 FPS. And I’m still pretty good at shooters.
Yeah, it’s a shame our current tech stack doesn’t let us update only the parts of the image where something actually changes.
Does that refresh take place across the entire eye simultaneously or is each rod and/or cone doing its own thing?
Are your eyeballs progressive scan or interlaced, son?
Human eye can’t see more than 1080p anyway, so what’s the point
Human eye can’t see more than 8-bit colors anyway, so what’s the point
It doesn’t matter honestly, everyone knows humans can’t see screens at all
Their vision is based on movement.
It honestly doesn’t matter, reality only exists in your imagination anyway.
That’s not sarcasm, it’s misinformation. Not surprising that people downvoted you even though it was just a joke.
I don’t think that somebody actually read that computers can’t register more than 4 GiB of RAM and then thought…
That’s totally true, because u/teft said it is
It certainly used to be true, in the era of 32-bit computers.
That’s what makes it a joke. Does anyone here unironically think the human eye can only see 60 fps, or that more than 4 gigs of RAM is just marketing?
Just install Chrome or Firefox. Problem solved.
and a VM or 2
weak. compile them
Yup, I max out 32 GB building LibreWolf from source.
compile in tmpfs
I compile them in swap, and the swap is, of course, Google Drive.
Nature finds a way. My Fedora 39 box with 32GB rests at like 4-ish and goes to 8-ish with a browser open.
This is my server and about 28GB sits unused. Just in case I might want to run a new VM or something… 🤣
Just put a big archive in your Nextcloud with the default config; your server will be wheezing in no time.
Does it unpack the archive in-memory? In the newest stable version?
It scans for viruses inside the archive, which takes longer than the 5-minute interval before it spawns a new maintenance task, which then scans inside the archive while the previous task is still scanning…
Lol
That’s why I have 16 GB on my main PC. The highest I’ve ever seen it go was 8 GB, while playing Alan Wake 2.
I go over 16 gigs regularly browsing the internet.
You must have a ton of tabs open.
windows™️
Always see my system chilling at 5 or 6 GB.
This just means you’re future proofed
You’ve clearly never lived with a cat. Your metaphor is crushed by the Kitty Expansion Theory: no piece of furniture is large enough for a cat and any additional being.
Caching be like
Caching do indeed be like.
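That’s the whole joke behind “Linux is using 99% of RAM”: the kernel fills otherwise-idle memory with page cache and hands it back the moment anything else asks. A minimal Linux-only sketch that reads /proc/meminfo to show the gap between “free” and “available”:

```python
# Linux-only: "free" RAM vs RAM that's actually available.
# The gap is mostly page cache, reclaimed by the kernel on demand.
fields = {}
with open("/proc/meminfo") as f:
    for line in f:
        key, rest = line.split(":")
        fields[key] = int(rest.split()[0])  # values are reported in kB

free_gib = fields["MemFree"] / 2**20
avail_gib = fields["MemAvailable"] / 2**20
print(f"free:        {free_gib:.1f} GiB")
print(f"available:   {avail_gib:.1f} GiB")
print(f"reclaimable: {avail_gib - free_gib:.1f} GiB")
```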
The Kitty Expansion Theory is incomplete: any piece of furniture is large enough for both a cat and an additional being, provided the additional being was there first.
My cat would just stretch out perpendicular to the length of my bed, leaving me just enough space to decide which of the two remaining sides of the bed to sit on.
Exactly. That kitty encompasses and rules over aaaalllll that couch. Surfaces and interior volume (as soon as he discovers it). No room for anybody else. Just ask him.
Android Studio: *big fat cat in the middle of the sofa*
Work gave me a 16 GB laptop for Android development.
It took up to 20 minutes to incrementally compile.
They eventually bumped me up to 32 GB when I complained enough that my swap file was 20 GB.
Suddenly, incremental compiles took <2 minutes.
I was running out of RAM on my 16GB system for years (just doing normal work tasks), so I finally upgraded to a new laptop with 64GB of RAM. Now I never run out of memory.
lol, you wish.
I actually downgraded my laptop from 16 to 8 GB of DDR3L and didn’t spot a difference.
Just running a local LLaMA takes 32 GB of RAM.
depends on quantization
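Rough rule of thumb, ignoring KV cache and runtime overhead (so real usage runs higher): weight memory ≈ parameter count × bits per weight ÷ 8. A quick sketch with common model sizes:

```python
# Approximate weight memory for a local LLM at different quantizations.
# Ignores KV cache, activations, and runtime overhead.
def weight_gib(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 2**30

for params in (7, 13, 70):
    for bits in (16, 8, 4):
        print(f"{params:>2}B @ {bits:>2}-bit: ~{weight_gib(params, bits):.1f} GiB")
```

So a 7B model quantized to 4-bit squeezes into a few GiB, while the same weights at 16-bit want ~13 GiB before you’ve loaded a single prompt.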