Posted February 03, 2020
tfishell: Interesting, thanks. Can you explain how a game would be made specifically not work with Intel HDs? Why exclude on purpose?
From what I've gathered and witnessed over the years, there are two main schools of reasoning: 1) the old "oh, those things are useless, get a REAL GPU!" elitism, where a dev, dev team, producer etc. may simply not want to support them and has coded in something to prevent the game from running (Fallout 3 requires a custom exe in many cases because the game detects an Intel GPU and simply won't run).
This somewhat continues today with the "not supported" lines on system specs. Some indie devs are getting pretty bad for it (I've even seen some belittle people on Steam discussions and spew elitist gobshitery). Some come round though when they realise that at lower resolutions you don't need 500 different effects running, and things have come a long way in the last 10-15 years.
2) Sponsorship. I don't think it's as common now, but back when games on disc were the norm, distribution was often sponsored by a GPU manufacturer. With a good deal of games having certain companies' logos flash up at the start and "Intel GPUs not supported" on the back of the box, in an age of increasing DRM it would make corporate sense.
If I were at liberty to talk openly about it, the best I could supply is anecdotal evidence from friends who worked at some big companies at the time, but for me it's enough.
I might be rambling a bit there. Hopefully you get me though.
Post edited February 03, 2020 by Sachys