It scans for viruses inside the archive, which takes longer than the 5-minute interval before a new maintenance task spawns, so the new task starts scanning inside the archive while the previous task is still scanning...
Just like the human eye can only register 60fps and no more, your computer can only register 4gb of RAM and no more. Anything more than that is just marketing.
This is only true if you're still using a 32-bit CPU.
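For the curious, the 4 GB figure falls straight out of the pointer width; a quick sanity check:

```python
# A 32-bit CPU has 32-bit pointers, so it can index at most
# 2**32 distinct byte addresses, i.e. 4 GiB of address space.
addressable = 2**32
print(addressable // 1024**3, "GiB")  # 4 GiB
```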
Bank switching to "fake" access to more address space was a big thing in the '80s... so it's technically possible to reach more memory than the address bus can see by dividing it into portions and swapping them into view.
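A minimal sketch of the idea, with made-up numbers (a 16-bit bus reaching 256 KiB of RAM; the window base and bank size are arbitrary assumptions, not any real machine):

```python
# Hypothetical bank-switching model: a CPU with 16-bit addresses
# (64 KiB visible) reaches 256 KiB of physical RAM by swapping
# 16 KiB "banks" into a fixed window at 0x8000-0xBFFF.

BANK_SIZE = 16 * 1024        # 16 KiB per bank (assumed)
WINDOW_BASE = 0x8000         # banked window location (assumed)

physical_ram = bytearray(256 * 1024)  # more RAM than the bus can see
current_bank = 0                      # the "bank select" register

def select_bank(n):
    global current_bank
    current_bank = n

def read(addr):
    # Addresses inside the window are redirected into the selected bank.
    if WINDOW_BASE <= addr < WINDOW_BASE + BANK_SIZE:
        return physical_ram[current_bank * BANK_SIZE + (addr - WINDOW_BASE)]
    return physical_ram[addr]  # everything else maps 1:1

def write(addr, value):
    if WINDOW_BASE <= addr < WINDOW_BASE + BANK_SIZE:
        physical_ram[current_bank * BANK_SIZE + (addr - WINDOW_BASE)] = value
    else:
        physical_ram[addr] = value

# The same 16-bit address hits two different physical bytes:
select_bank(2)
write(0x8000, 0xAA)
select_bank(7)
write(0x8000, 0xBB)
select_bank(2)
print(hex(read(0x8000)))  # 0xaa
```

Real hardware did the redirection in the memory controller instead of software, but the mechanism (a bank register steering a fixed window) is the same.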
Joke's on you, because I looked into this once. I don't remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photopigment, but it worked out to about 70 fps, so roughly 14 ms per frame (the color-sensitive cones are far slower). Psycho-optical effects can push that number up to around 100 fps on LCD displays, though. And it seems you can train yourself, with certain computer tasks, to follow movements with your eyes, making you far more sensitive to flickering.
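(Frame time and frame rate are just reciprocals, so 70 fps works out to about 14 ms rather than 13:)

```python
# t_ms = 1000 / fps: the frame time in milliseconds at a given rate.
print(round(1000 / 70, 1), "ms")  # 14.3 ms
```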
It's not about training, eye tracking is just that much more sensitive to pixels jumping
You can immediately see choppy movement when you look around in a first-person game. And in an RTS you can see the trail behind your mouse cursor anyway.
I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images at certain parts of the screen
Just give me a 480 FPS OLED with black frame insertion already, FFS
Well, I don't follow movements with my eyes (I jump straight to the target), and I see no difference between 30 and 60 FPS; I run Ark Survival comfortably on my iGPU at 20 FPS. And I'm still pretty good at shooters.
Yeah, it's a shame our current tech stack doesn't let us update only the parts of the image that actually change.
According to this study, the eye can perceive differences as high as 500 fps. While that's a specific scenario, it's one that could plausibly occur in a video game, so I guess we can go to around 500 Hz monitors before it becomes overkill or unnecessary.
Yeah, but when it comes to RAM and Storage, the other golden rule is that the longer you delay your upgrade the cheaper it will be (assuming you'll even need it) or the more you can get for the same money.
It's great that the system is so efficient. But things do come up. I once worked with an LSP server that was so hungry I had to upgrade from 32 GB to 64 GB to stop the OOM crashes. (To be fair, I only ran out of memory when running the LSP server and the compiler at the same time, but hey, I have work to do!) Now that I'm working in a different area, though, I'm just way over-RAMed.
I was running out of RAM on my 16GB system for years (just doing normal work tasks), so I finally upgraded to a new laptop with 64GB of RAM. Now I never run out of memory.