TheNamelessOne_PL: I don't mean to come across as being snarky or patronizing or anything like that, but I am just quite a bit baffled.
How are people still even using Windows 7? Quite easily. I have two rigs: one running W10, the other dual-booting W7 & Linux. Despite being "old", if you don't own any DX12-exclusive games (and I don't) there is literally no functional difference at all for a gaming rig across my entire collection of 2,000+ games. Nor has there been any difference in security. 99% of the 70,000 PC games using the DirectX 5-11, OpenGL or Vulkan APIs work flawlessly, which just goes to show how little the "core" of Windows has changed, and how much W8, W10, W11, etc, are about endless unwanted UI makeovers rather than radical under-the-hood changes. I have noticed zero practical difference in gaming on the W10 vs W7 computers, even today in 2022. The only games that need DX12 exclusively, i.e., it's DX12 or nothing with no alternative Vulkan / DX11 / OpenGL renderer, tend to be the same AAAs I don't buy anyway because of DRM.
TheNamelessOne_PL: Using Windows 7 is basically begging for hackers or malware to take over your system. Windows 7 has not received security updates in, like, 3 years. It's a massive, gaping security hole just waiting to be taken advantage of by any half-decent script-kiddie.
This is factually incorrect. W7 ESU (Extended Security Updates) are available until January 2023, and there's a simple tweak that lets anyone download them, just as zeffy's earlier 'wufuc' utility unlocked Microsoft's artificial blocking of updates on 7th-gen Intel / Ryzen CPUs. Whilst I wouldn't use W7 today as some outward-facing financial server, "security" issues (most of which rely on fringe attack vectors) are laughably overrated for a simple, mostly offline gaming rig. "Hacking" in the real world does not involve hackers performing 100m individual attacks against 100m individual consumer PCs sitting behind a firewalled NAT + dynamic IP, one by one; they target online corporate databases containing millions of records (which still get breached despite running fully patched Windows Server), then rapidly "flip" the database on the dark web for Bitcoin. That is real-life hacking.
Shadowstalker16: Likelihood of malware infection depends a lot more on the user than the OS. Someone visiting shady sites, clicking on shady links and attachments, and running suspicious .exes is still very likely to run into malware issues no matter what OS they use.
^ This. Someone who disables a lot of unwanted services on W7 + runs a whitelist-only firewall is already more practically secure than the average W10 user sitting there with Remote Desktop Configuration, Remote Desktop Services, Remote Registry, Secondary Logon and Windows Remote Management all enabled + "everything gets to talk through the firewall without question by default" firewall settings (a rough sketch of that lock-down is below). 2FA / SCA for banking has done far more to stop account hijacks than Windows Updates ever did (the same updates that deleted users' data once, twice, three years in a row...).
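For anyone who wants to see what that lock-down actually looks like, here's a minimal sketch, not a definitive guide: it assumes you're on W7 or W10 with Python installed, that you run it from an elevated (admin) prompt, and that you'll add your own per-app allow rules afterwards. The service short names (SessionEnv, TermService, RemoteRegistry, seclogon, WinRM) are just the standard Windows names for the services listed above, and the last command flips the stock firewall to block-by-default in both directions.

# Sketch: stop & disable the remote-access services above, then set a
# default-deny firewall policy. Run from an elevated (admin) prompt.
import subprocess

SERVICES = {
    "SessionEnv":     "Remote Desktop Configuration",
    "TermService":    "Remote Desktop Services",
    "RemoteRegistry": "Remote Registry",
    "seclogon":       "Secondary Logon",
    "WinRM":          "Windows Remote Management",
}

for name, label in SERVICES.items():
    print("Disabling %s (%s)..." % (label, name))
    subprocess.run(["sc", "stop", name])                          # stop it if running
    subprocess.run(["sc", "config", name, "start=", "disabled"])  # block auto-start

# Default-deny inbound AND outbound; whitelist individual programs afterwards.
subprocess.run(["netsh", "advfirewall", "set", "allprofiles",
                "firewallpolicy", "blockinbound,blockoutbound"])

After that, a couple of "netsh advfirewall firewall add rule name=... dir=out program=... action=allow" lines for your browser / GOG Galaxy / whatever is all the average gaming rig really needs talking to the internet.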
As for the train-wreck that is W11, compulsory TPM is far more about gradually introducing Remote Attestation (hardware DRM, already visible in some new anti-cheat software that locks games to TPM chips) than about "protecting" Windows users. The same goes for "Smart App Control", a shiny new "security" feature that "uses AI and Microsoft's cloud knowledge base to check every app that runs, blocking anything unsigned, unfamiliar, or known to be malicious". Sounds great for morons who open "Free Money.exe" e-mail attachments, until you realise you've just added an OS-level DRM remote kill-switch that can mass-block thousands of unsigned DRM-free older games, game mods, source ports, etc, at the flip of a single switch, far more thoroughly than Denuvo / SecuROM ever could. This stuff is far more about "bait & switching" hardware DRM under the guise of 'security' than it is about 'protecting' anyone from real-world threats.