Posted March 08, 2021
While my paltry Intel UHD 620 onboard GPU handles some very good games with great results, there are others that make it struggle (such as The Witcher 2, a game that's already 10 years old!).
Making games that are unplayable (even on the lowest settings) on systems without discrete GPUs is counterproductive for game companies.
First of all, GPU production is experiencing (and has experienced since the global cryptocurrency explosion) chronic shortages and problems in availability.
Secondly, there is an absurdly large market of PCs with onboard GPUs.
Excluding such a large installed base from your latest game simply imposes an unreasonable limit on your potential audience.
I can accept that games have Crysis-like, ray-tracing levels of GPU consumption at their highest tiers. But developers should also ensure that their games are playable on the lowest settings with onboard GPUs such as Intel UHD chipsets.
What do you think?
Post edited March 08, 2021 by thegreyshadow