thegreyshadow: While my paltry Intel UHD 620 onboard GPU can play some very good games with great results, there are others that make it struggle (such as The Witcher 2, a game that is already 10 years old!).
Making games that are unplayable (even at the lowest settings) on systems without discrete GPUs is counterproductive for game companies.
First of all, GPU production has been suffering chronic shortages and availability problems ever since the global cryptocurrency explosion.
Secondly, there is an absurdly large market of PCs with onboard GPUs.
Excluding such a large computer base from your latest game simply imposes an unreasonable limit on your potential audience.
I can accept that games have Crysis-like, ray-tracing-level GPU demands at their highest settings tiers. But developers should also ensure that their games are playable at the lowest settings on onboard GPUs such as Intel UHD chipsets.
What do you think?
Not everybody in the entire world needs or wants a GPU, or the extra cost that comes with it. Many people just need an office machine or an Internet machine; they don't NEED all this extra iGPU or discrete-GPU stuff. Having that hardware on board costs money for the maker of the iGPU, and that cost of course gets passed on to the consumer.
Intel is not going to provide a great iGPU that developers can take advantage of. Intel is really in the CPU business, not the GPU business, and they've never had great iGPUs.
AMD might be the crew to look at, as they bought out ATI years ago and they also make CPUs. So if anyone is going to make a solid iGPU, it would be them.
NVidia might also be someone to look at, if their deal with ARM goes through, as they could then corner both the CPU and GPU markets.
But even then, most iGPUs are simply NOT in the league of what current discrete PC GPUs are doing. Discrete GPUs are WAY ahead of iGPUs again, especially now with DLSS and RTX support. We've hit that next-level moment again now that the new consoles are out too (i.e. the PS5 and Xbox Series X).
So with those consoles here, expect minimum PC requirements to soon get very similar to what the PS5 and XSX offer.
So if people really want a good GPU, yep, they're basically going to have to pony up for PCs (desktops or laptops) that include one, and/or pony up for the desktop card itself... provided you can actually find one.
For those buying laptops: you're going to have to buy a gaming laptop with an RTX 2000- or 3000-series card to play some current PC games properly, especially with the new consoles here and all.
Shadowstalker16: In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine it depends on the skill of the people coding it and on their publisher's willingness when it comes to what and where they invest their time. If the design philosophy is just to vomit out unoptimized open worlds, I'd imagine quantity would take priority over (optimization) quality.
.Keys: It's interesting. I've seen people here say that OP's example (Far Cry 2) is an old game (2008) and that it should run well on any onboard GPU nowadays. (Not saying you did that.)
But Metal Gear Solid V: The Phantom Pain is a 2015 game and runs on integrated graphics at the lowest resolution at 30 to 50 fps, depending on your CPU.
FoxEngine is really well optimized, and if they did it, why can't others? (Rhetorical question.)
And what you said is completely true. (Yes, I'm looking at you, Ubisoft.)
You also have to remember: the Far Cry series' old Crytek-based engine wasn't built for multi-core CPUs either. When Far Cry 1 was built on Crytek's engine, that engine was aimed at single-core PCs and expected CPUs to keep raising their GHz speeds, not their core counts. Far Cry 5 still isn't the best-running game, and it's still on that same engine; however much they improve or tweak it, it isn't really built for multi-core CPUs.
While many of the Ubisoft games and Far Cry games are heavy on the CPU, they're also heavy on the GPU. Even FC5 still isn't really built for multi-core CPUs.
Would we really expect Crytek or Ubisoft to go back in and improve performance in an old engine and a big open-world game with AI, explosions, combat, and action everywhere?
Something like the FoxEngine, a much more modern engine, was probably built from scratch for multi-core CPUs. So it's going to straight-up run better and utilize CPUs better too.
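To make the single-core vs. multi-core point concrete, here's a minimal C++ sketch. It isn't taken from CryEngine, FoxEngine, or any real engine; the Entity struct and update() workload are made up purely for illustration. It contrasts an update loop that assumes one fast core with one that splits the same work across all available cores:

```cpp
// Illustrative sketch only; not real engine code. Entity and update()
// are hypothetical stand-ins for per-frame work (AI, physics, etc.).
#include <algorithm>
#include <chrono>
#include <cmath>
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f; };

// Some per-entity work; the body is just busywork standing in for game logic.
void update(Entity& e) {
    for (int i = 0; i < 1000; ++i) e.x = std::sin(e.x + i);
}

// Single-core style: one thread walks every entity.
// Faster clocks help; extra cores sit idle.
void updateSerial(std::vector<Entity>& entities) {
    for (auto& e : entities) update(e);
}

// Multi-core style: split the same work into one chunk per hardware core.
void updateParallel(std::vector<Entity>& entities) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (entities.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(entities.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&entities, begin, end] {
            for (std::size_t i = begin; i < end; ++i) update(entities[i]);
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Entity> entities(20000);
    auto t0 = std::chrono::steady_clock::now();
    updateSerial(entities);
    auto t1 = std::chrono::steady_clock::now();
    updateParallel(entities);
    auto t2 = std::chrono::steady_clock::now();
    using ms = std::chrono::duration<double, std::milli>;
    std::cout << "serial:   " << ms(t1 - t0).count() << " ms\n"
              << "parallel: " << ms(t2 - t1).count() << " ms\n";
}
```

The serial loop only gets faster if clock speeds rise, while the parallel version scales with core count. That's roughly why an engine designed from scratch for multi-core CPUs makes so much better use of modern processors.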