Abishia: that's why i suggest a 1080P or 4K thats the industry standard (1440P) aint standard and it's crap.
They are all standard PC resolutions with standardised names:-
1080p = FHD, 1440p = QHD, 2160p = UHD. Just because consoles lack the flexibility to use 1440p doesn't mean it isn't a "standard" PC resolution, no different to the fact consoles also can't use VGA, SVGA, XGA, SXGA, etc, yet those are "industry standard" PC resolutions too. 2560x1440 is simply to 1280x720 what 3840x2160 is to 1920x1080 (an exact 2x2 multiple), as the quick check below shows.
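A trivial check of that ratio, in case anyone doubts it (my own sketch, nothing official):

```python
# My own arithmetic check: 1440p relates to 720p exactly as 2160p
# relates to 1080p -- a clean 2x multiple on each axis.
hd, fhd = (1280, 720), (1920, 1080)
qhd, uhd = (2560, 1440), (3840, 2160)

assert (qhd[0] / hd[0], qhd[1] / hd[1]) == (2.0, 2.0)    # 1440p = 2x2 of 720p
assert (uhd[0] / fhd[0], uhd[1] / fhd[1]) == (2.0, 2.0)  # 2160p = 2x2 of 1080p
```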
Abishia: also consoles are made for 1080P or 4K any game ported to PC is optimized for this resolution.
Incorrect. Consoles may operate at 1080p or 4k, but that does not mean every console game ported to PC is automatically "optimized" for 4k. See the earlier screenshots demonstrating bad UI scaling. Speaking of which, your own "Look, here's Dragon Age at 4k" completely misses the point: it's the UI / HUD that doesn't scale, so the UI-less cutscene screenshot you posted is irrelevant to the issue being discussed. As the PCGW article on the game states in its 4k section, "UI does not scale with resolution and is extremely small in 4K" (ie, you can "run" it at 4k, but it's not very enjoyable to play a game whose buttons, etc, are so tiny you can hardly see them to click on). That's a common problem with many older games forced to run at 4k, especially on smaller 28" 4k monitors. And upscaling 1080p -> 2160p doesn't always work in practice despite the theoretically perfect maths behind it (see the sketch below).
Abishia: also steam 67% use 1080P or 4K
- 15.4% use *lower* than 1080p (eg, 720p or laptop screens such as 1366x768)
- 67.12% use 1080p
- 9.19% use 1440p
- 2.37% use 4k
- 2.19% use Ultrawide (usually 2560x1080 or 3440x1440)
Trying to lump the 4k figures in with the 1080p figures and pass them off as being the same thing is just silly, as a little arithmetic shows:
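(My own arithmetic, survey numbers as quoted above.)

```python
# Survey shares as quoted above (percent of Steam users).
share_1080p, share_4k, share_1440p = 67.12, 2.37, 9.19

print(share_1080p + share_4k)   # 69.49 -- so the "67%" figure is 1080p alone, not 1080p + 4k
print(share_1440p / share_4k)   # ~3.88 -- nearly four times as many 1440p users as 4k users
```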
Abishia: and yea 4K was very common in 2007 already.
So first it was 2002, then 2006, now 2007, and still none of those are true. Not even close. I get the impression you're a very young person who hasn't a clue what people actually owned in the 2000s: you bought yourself a shiny new 4k monitor recently and are so wrapped up in the e-peen of "and now everyone must use what I own" that you simply can't tell fact from fantasy...
Reality = It took until 2013-2014 before 4k TVs started to become affordable for most people (prior to that, the only consumer 4k displays were projectors and plasma TVs), and 4k PC monitors followed those. As Brian said, go and actually read a 2002-, 2006- or 2007-era PC magazine or web review to see what monitors were on sale back then: none were 4k at all. Nowhere is this more obvious than in old GPU reviews from the 2007 era and their complete lack of 4k benchmarking.
Here's what 2007-era GPUs looked like. That's right, 2007-era GPUs had 256MB (0.25GB) to 512MB (0.5GB) of VRAM, which is clearly nowhere near "4k" capable.
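Some quick framebuffer maths (my own back-of-the-envelope figures, not from any review) shows why those VRAM sizes rule 4k out:

```python
# A single 32-bit colour buffer at 4k:
width, height, bytes_per_pixel = 3840, 2160, 4

buffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"{buffer_mb:.1f} MB per buffer")    # ~31.6 MB

# Front + back colour buffers plus a depth buffer, before a single texture is loaded:
print(f"{buffer_mb * 3:.1f} MB of VRAM")   # ~94.9 MB -- over a third of a 256 MB card
```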
Likewise, the typical tested resolutions of that era (1600x1200, 1920x1200 and 2560x1600) reflect the period when we moved from 4:3 to 16:10 and then rapidly on to 16:9. 4k PC gaming was still several years away, and even by 2015, when games were starting to be benchmarked at 4k, the top-end GPU of that year crawled along at 30fps at that resolution. 4k PC gaming in actual practice (ie, finally having GPUs powerful enough) has only been a thing for the past few years, and it's still under 3% of gamers.
"Everyone back in 2002-2007 had 4k" = you really haven't a clue about PC gaming in the 2000's...
Edit: One obvious thing your highly misleading "memory" overlooks - back in 2007 most monitor connections were still DVI + D-sub. 4k @ 60Hz requires DisplayPort 1.2 (only introduced in 2010) or HDMI 2.0 (introduced in September 2013), while 60Hz dual-link DVI maxes out at 2560x1600. So in whatever parallel universe your "everyone was using 4k monitors in 2002-2007" fantasy came from, your monitors must have all been connected via Magic Beans...