StingingVelvet: I mean some people still watch SD cable stations or DVDs, in an age where 1080p is widely giving way to 4K video. Some people are fine listening to music on YouTube and others demand high quality FLAC files. Everything is relative. I don't think that means the differences aren't obvious, though; I think it more means you don't care or prioritize it, which is fine. We all have different priorities.
It is that to some extent, but I also just don't notice much of a difference myself...and I have seen both side by side to compare.
(There is a bit of a difference, but for me it's not worth the price tag or really needed, imo.)

Also, all that aside, there is still a limit to what the brain/eye can perceive (which varies a bit between people but is usually within the same "ballpark")....any difference a person claims to see above that would very likely be confirmation bias/etc. (and right now we are already close to diminishing returns on cost versus improvement with some tech).

StingingVelvet: I'm telling you I love how 140fps feels, but it's frustrating because so many games are locked to 60fps, making for a back and forth that makes 60fps feel even worse.
Not to intentionally offend, but you remind me a bit of all those "better graphics make the game more than gameplay does" people back in the 90s/2000s.....as long as the game is fun you should be able to put all that aside and have fun regardless.

Heck, I play text adventures after playing high resolution games and don't go "omg this game is jarring or not as good because the graphics suck or are non-existent" or similar. :)
Post edited December 12, 2019 by GameRager
GameRager: Also, all that aside, there is still a limit to what the brain/eye can perceive (which varies a bit between people but is usually within the same "ballpark")....any difference a person claims to see above that would very likely be confirmation bias/etc. (and right now we are already close to diminishing returns on cost versus improvement with some tech).
I don't mean this in an angry, aggressive way, but... you need glasses if you think 1080 to 4k is anywhere near this kind of imperceptible difference.

GameRager: Not to intentionally offend, but you remind me a bit of all those "better graphics make the game more than gameplay does" people back in the 90s/2000s.....as long as the game is fun you should be able to put all that aside and have fun regardless.

Heck, I play text adventures after playing high resolution games and don't go "omg this game is jarring or not as good because the graphics suck or are non-existent" or similar. :)
You and others keep taking this track, but it's erroneous. Expressing disappointment in a steak you ordered medium rare coming out well done does not imply you can never eat hamburger, or a ham sandwich. I own 500 games on GOG and my favorites mostly come from the 1995-2003 or so time period. Obviously I am not a "high-end gaming only" guy. Framing me that way to belittle my points is just tiresome.

This is very simple, and being misconstrued. I'm saying if you buy a 144hz monitor you might fall in love with 140fps gaming and then be disappointed by how many games you can't enjoy it with. It's that simple, and has zero to do with game quality itself.
StingingVelvet: I don't mean this in an angry, aggressive way, but... you need glasses if you think 1080 to 4k is anywhere near this kind of imperceptible difference.
Compared to 1080p and 1440p it's not that big a difference, imo....and you keep discounting/pushing aside the point I keep making about one's brain not noticing much difference once we hit certain limits (in the future).

I have no doubt some will still claim (this is not to discount your own claims here, btw) that 32K or 64K looks much, much better than 16K/etc., because they want to see a difference....even though their brains wouldn't be able to notice much of a difference past a certain point.

StingingVelvet: You and others keep taking this track, but it's erroneous. Expressing disappointment in a steak you ordered medium rare coming out well done does not imply you can never eat hamburger, or a ham sandwich. I own 500 games on GOG and my favorites mostly come from the 1995-2003 or so time period. Obviously I am not a "high-end gaming only" guy. Framing me that way to belittle my points is just tiresome.
I wasn't trying to.....note that I said "you remind me of", not "you are"....by that I meant I saw a partial resemblance to such types of people in some of your posts about 144Hz over 60Hz.

Also I never said/meant to imply one couldn't enjoy both.....I myself enjoy many levels of quality/frame rates/etc.

StingingVelvet: This is very simple, and being misconstrued. I'm saying if you buy a 144hz monitor you might fall in love with 140fps gaming and then be disappointed by how many games you can't enjoy it with. It's that simple, and has zero to do with game quality itself.
Well, it has to do a bit with graphical quality....you are asserting that some people might love 144Hz so much that lower rates would pale in comparison (in some cases) due to the difference in quality.

=========================================
All the above aside, I enjoy games regardless of quality level (with some exceptions, and keeping in mind I try to run them at the best settings I can to avoid artifacting/etc.), and while this sort of thing (4K, 144Hz) might be nice, I don't notice enough of a difference from 1080p to justify the price tag....nice for those who can afford it (and especially for whom it would be a major upgrade), though.
Post edited December 12, 2019 by GameRager
GameRager: Also, all that aside, there is still a limit to what the brain/eye can perceive (which varies a bit between people but is usually within the same "ballpark")....any difference a person claims to see above that would very likely be confirmation bias/etc. (and right now we are already close to diminishing returns on cost versus improvement with some tech).
StingingVelvet: I don't mean this in an angry, aggressive way, but... you need glasses if you think 1080 to 4k is anywhere near this kind of imperceptible difference.

GameRager: Not to intentionally offend, but you remind me a bit of all those "better graphics make the game more than gameplay does" people back in the 90s/2000s.....as long as the game is fun you should be able to put all that aside and have fun regardless.

Heck, I play text adventures after playing high resolution games and don't go "omg this game is jarring or not as good because the graphics suck or are non-existent" or similar. :)
StingingVelvet: You and others keep taking this track, but it's erroneous. Expressing disappointment in a steak you ordered medium rare coming out well done does not imply you can never eat hamburger, or a ham sandwich. I own 500 games on GOG and my favorites mostly come from the 1995-2003 or so time period. Obviously I am not a "high-end gaming only" guy. Framing me that way to belittle my points is just tiresome.

This is very simple, and being misconstrued. I'm saying if you buy a 144hz monitor you might fall in love with 140fps gaming and then be disappointed by how many games you can't enjoy it with. It's that simple, and has zero to do with game quality itself.
Ah, I think this clarified things a bit. If I read you correctly, you're suggesting that if you play a game or games at a higher frame rate and get used to the benefits that provides, then playing some games at 60fps afterwards can be a noticeably lesser experience.

Linus Tech Tips recently did a video with Shroud and a few others where they tested gaming on 60/144/240Hz displays and actually measured the benefits scientifically; it was quite interesting. Those used to higher framerate displays were quick to express how gaming at 60fps was throwing off their reaction time etc., and it showed up in the testing measurements as well.

This was entirely with competitive FPS shooters, mind you, where it would matter the most and which were probably the best type of game to run the test with. A lot of games, such as RTS titles, wouldn't really show much of a difference, however. It really depends on the type of games someone plays, and whether they tend to be the kind where monitor refresh rate gives a big advantage, such as competitive FPS. In that case it would most likely be a horrible experience to go back to 60fps.

I feel similarly about going back to 1920x1200 or lower resolutions when I've been used to 2560x1600 for the last 6+ years. You get used to something, and when it's downgraded you notice it right away and don't want the downgrade. It's not like that with all games though; in some games it doesn't really matter much at all, but in some it is horrible, especially if the resolution the game is using is not an integer divisor of the native panel resolution.
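To make the "integer divisor" point concrete, here is a tiny sketch (a hypothetical helper, not from any particular game or driver) that checks whether a render resolution divides evenly into a native panel resolution, which is when non-native scaling tends to stay sharp instead of blurry:

    #include <iostream>

    // True if the native panel resolution is an exact integer multiple of the
    // render resolution on both axes, so each rendered pixel maps cleanly onto
    // an NxN block of physical pixels.
    bool isIntegerScale(int nativeW, int nativeH, int renderW, int renderH) {
        if (renderW <= 0 || renderH <= 0) return false;
        return nativeW % renderW == 0 &&
               nativeH % renderH == 0 &&
               nativeW / renderW == nativeH / renderH;  // same factor on both axes
    }

    int main() {
        // On a 2560x1600 native panel: 1280x800 scales cleanly (2x), 1920x1200 does not.
        std::cout << isIntegerScale(2560, 1600, 1280, 800)  << "\n";  // prints 1
        std::cout << isIntegerScale(2560, 1600, 1920, 1200) << "\n";  // prints 0
    }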

I've more or less sidestepped the high-fps side of things in gaming due to my preference for putting my money toward higher resolution, 16:10 aspect ratio, color reproduction and other features. As far as I know, no company currently makes monitors designed with BOTH professional features AND gaming in mind; it's usually one or the other. I have had a bit of exposure though, as my laptop display is a 17.3" 120Hz panel, and I noticed the high frame rate right away in any FPS/TPP games that actually support 120fps properly. While I definitely notice the difference easily, I'm still comfortable playing games at 60fps if I can enable other features/eye candy. If I was into competitive FPS I'd probably go with 120 though.
skeletonbow: Ah, I think this clarified things a bit. If I read you correctly, you're suggesting that if you play a game or games at a higher frame rate and get used to the benefits that provides, then playing some games at 60fps afterwards can be a noticeably lesser experience.
Yes. The thread was spawned because I decided to replay KotOR 2 and there's a bug at higher than 60fps framerates that makes your character unmovable after combat. It's like the 10th game since I got my monitor 6 months ago where I've had to limit my framerate to 60 to play it, and it's very frustrating because I'm very in love with 140fps now.

skeletonbow: This was entirely with competitive FPS shooters, mind you, where it would matter the most and which were probably the best type of game to run the test with. A lot of games, such as RTS titles, wouldn't really show much of a difference, however. It really depends on the type of games someone plays, and whether they tend to be the kind where monitor refresh rate gives a big advantage, such as competitive FPS. In that case it would most likely be a horrible experience to go back to 60fps.
I don't play competitive games, so for me it's more about the "feel" of the game. Outer Worlds is a good example. I went back and forth between ultra settings with 60fps and medium settings with 120fps or so. The difference was immediately noticeable to me. Not just in responsiveness, but also less motion blur or ghosting, and less choppy turning or swift camera pans. LCD motion usually isn't the best, and 140fps helps with it a lot. So I chose to use medium settings to get 100+fps, because it was worth it to me.
GameRager: I cannot say as to 144Hz, but I HAVE seen 4K and 1080p demos side by side in stores and(to me) it's not that much of an improvement.
And on what size of screen did you see both of them? Because resolution is meaningless without that "little detail".
(distance is important too, but not that relevant in the computer screen context)
Post edited December 12, 2019 by teceem
teceem: And on what size of screen did you see both of them? Because resolution is meaningless without that "little detail".

(distance is important too, but not that relevant in the computer screen context)
Side-by-side displays of the proper size and at the proper distance for best effect, and I still don't notice a ton of improvement....yes, some slight improvement, but (as I said above) not enough to justify the price tag, and imo less important than things like good story/gameplay/characters/etc.

(And all the above said/aside, the main point I made above still stands: i.e. past a certain future threshold [which varies slightly from person to person] people won't be able to notice the changes, and it will mostly be confirmation bias driving any supposed "improvements" they see....yet companies will keep making bigger and better improvements the human mind/eye won't likely be able to perceive.

Well, unless we augment our senses/brains, perhaps...but that is far off, I think.)
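As a back-of-the-envelope way of reasoning about where that perceptual threshold might sit, here is a rough sketch assuming the commonly cited ~60 pixels per degree figure for 20/20 vision (the monitor width and viewing distance below are just example values, and individual eyesight varies):

    #include <cmath>
    #include <cstdio>

    // Pixels per degree of visual angle for a screen of the given physical width
    // and horizontal resolution, viewed from the given distance (width and
    // distance in the same units).
    double pixelsPerDegree(double screenWidth, double horizontalPixels, double distance) {
        const double pi = 3.14159265358979323846;
        double fovDegrees = 2.0 * std::atan(screenWidth / (2.0 * distance)) * 180.0 / pi;
        return horizontalPixels / fovDegrees;
    }

    int main() {
        // Example: a 27" 16:9 monitor is roughly 23.5" wide, viewed from about 24".
        double width = 23.5, distance = 24.0;
        std::printf("1080p: %.0f pixels/degree\n", pixelsPerDegree(width, 1920, distance)); // ~37
        std::printf("4K:    %.0f pixels/degree\n", pixelsPerDegree(width, 3840, distance)); // ~74
        // Values at or above ~60 pixels/degree are near the typical 20/20 acuity
        // limit, which is roughly where extra resolution stops being visible.
    }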
Post edited December 13, 2019 by GameRager
Go to Nvidia Control Panel and set framerate to application controlled.
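If a game has no built-in cap and a driver-level limit isn't available or doesn't behave, the usual game-side alternative is a limiter in the main loop. A minimal sketch of the idea (hypothetical and not tied to any engine; the 60 fps target and iteration count are placeholders):

    #include <chrono>
    #include <thread>

    // Minimal frame limiter: sleep out the remainder of each frame so the loop
    // never runs faster than the target rate.
    int main() {
        using clock = std::chrono::steady_clock;
        const auto frameBudget = std::chrono::microseconds(1000000 / 60);  // 60 fps cap

        auto nextFrame = clock::now();
        for (int frame = 0; frame < 600; ++frame) {    // stand-in for the real game loop
            // updateAndRender();                      // hypothetical per-frame work
            nextFrame += frameBudget;
            std::this_thread::sleep_until(nextFrame);  // wait out the rest of the frame
        }
    }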
teceem: And on what size of screen did you see both of them? Because resolution is meaningless without that "little detail".

(distance is important too, but not that relevant in the computer screen context)
GameRager: Side-by-side displays of the proper size and at the proper distance for best effect, and I still don't notice a ton of improvement....yes, some slight improvement, but (as I said above) not enough to justify the price tag, and imo less important than things like good story/gameplay/characters/etc.

(And all the above said/aside, the main point I made above still stands: i.e. past a certain future threshold [which varies slightly from person to person] people won't be able to notice the changes, and it will mostly be confirmation bias driving any supposed "improvements" they see....yet companies will keep making bigger and better improvements the human mind/eye won't likely be able to perceive.

Well, unless we augment our senses/brains, perhaps...but that is far off, I think.)
From 60 to 120 it's quite noticeable, both visibly and, more importantly, in reaction time. Above that it continues to give improved performance with regard to reaction time, but it is a law of diminishing returns. I agree that the price premium for a higher frame rate is not worth it for my own gaming and usage, and for many others out there also, as the price-to-benefit ratio has to make sense to the person making the decision for their own use case. There's no right or wrong, just what someone values more: keeping more money in their wallet, or minimizing reaction time/latency in games where it matters and is noticeable. I notice it at 120 compared to 60, but for my current gaming habits it is "cool" and doesn't matter to me enough that I'd spend money on it on purpose. In my case, the laptop I chose for other reasons just happened to also have 120. :)

As for whether or not someone will notice it, that is more a factor of what game they are playing, how they're playing it, and whether it matters for that type of game and gameplay or not. It's primarily a massive benefit for reducing input latency and reaction time in first person shooter games, in particular competitive ones. If someone isn't playing FPS games they can still potentially notice the smoother animation, but it won't necessarily have major benefits like it does in FPS games. It won't turn someone from a horrible player into a great player, but it will improve their reaction time significantly and increase their kill rate.
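To put rough numbers on the diminishing returns (simple arithmetic; these are only the display's frame intervals, not full input-to-photon latency, which also depends on the game, driver and panel):

    #include <cstdio>

    int main() {
        // Time between frames at common refresh rates, in milliseconds.
        const int rates[] = {60, 120, 144, 240};
        for (int hz : rates) {
            std::printf("%3d Hz -> %.1f ms per frame\n", hz, 1000.0 / hz);
        }
        // 60 -> 16.7 ms, 120 -> 8.3 ms, 144 -> 6.9 ms, 240 -> 4.2 ms:
        // each step up saves less absolute time, hence the diminishing returns.
    }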

Linus Tech Tips has done 2 videos on the topic in recent times, here is the latter more scientific one, testing with both expert players like Shroud, and with potato players like Linus himself and some other tech guy I forget who are both mediocre. The results they got from all of the tests are quite interesting.

https://www.youtube.com/watch?v=OX31kZbAXsA

and here is his previous video on the topic which wasn't quite as in depth, but still quite interesting.

https://www.youtube.com/watch?v=tV8P6T5tTYs


The benefits are quite clear for all players whether they "can see it or not" really, it's all about reaction time and even potato players get a massive boost in reaction time and seriously improved kill rate. It all comes down in the end again to "Is the benefit worth the price?" and there is no universal answer, it's an individual decision about whether the absolute benefit it provides is worth the price premium.

For me the answer is no for my own gaming, but yes for others who are going to benefit from it and care enough to spend the money. I'd much rather have higher resolution, greater pixel density, better colour reproduction, 16:10 aspect ratio and other characteristics in a monitor that matter more to me than > 60fps or Freesync/Gsync etc. But that's only something that had to be decided over the last few years... in a number of years time the decision will be made for all of us more likely than not by vendors making all monitors capable of high frame rates so the question becomes more moot. :)

The more important question might be, whether game developers are going to start patching their games to work properly at higher frame rates. Sadly, the answer is probably no.

And we haven't even got to throwing proper HDR into the mix yet on top of all of that... :)
skeletonbow: The more important question might be, whether game developers are going to start patching their games to work properly at higher frame rates. Sadly, the answer is probably no.
One of the key problems is that internal game logic is almost always tied to a given maximum refresh rate. Games that have simple mechanics behind the scenes (like twitch shooters) can process 144 cycles a second with ease, but when you start throwing complex physics models and massive open world actor updates into the mix, having such a high logic refresh rate can cause your system requirements to go sky high, locking out a lot of low-end and mid-level PC users. Basically, on most devices you're wasting a lot of CPU cycles on useless game logic updates that only people with high-end hardware can benefit from.

You can fix this by having half-refreshes for certain aspects like the physics model (updating every second cycle), but that usually needs to be planned in from the outset.
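A minimal sketch of how that decoupling usually looks (a generic fixed-timestep pattern rather than any particular engine's code; the hook names are made up): rendering runs as fast as the display allows, game logic ticks at a fixed 60Hz via an accumulator, and the physics step runs on every second logic tick, i.e. the "half refresh" mentioned above.

    #include <chrono>

    // Hypothetical per-system hooks (stubs so the sketch compiles).
    void updateGameLogic(double /*dt*/) {}   // AI, actor updates, scripting
    void updatePhysics(double /*dt*/) {}     // the expensive physics solve
    void renderFrame(double /*blend*/) {}    // drawing, as often as the display allows

    int main() {
        using clock = std::chrono::steady_clock;
        const double logicDt = 1.0 / 60.0;   // fixed 60 Hz logic tick
        double accumulator = 0.0;
        long tick = 0;
        auto previous = clock::now();

        for (int frame = 0; frame < 1000; ++frame) {  // stand-in for "while the game runs"
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // Run however many fixed logic ticks the elapsed time calls for,
            // independent of how fast frames are actually being rendered.
            while (accumulator >= logicDt) {
                updateGameLogic(logicDt);
                if (tick % 2 == 0)
                    updatePhysics(2.0 * logicDt);  // "half refresh": physics at 30 Hz
                ++tick;
                accumulator -= logicDt;
            }

            // Render at whatever rate the display allows (60, 144, 240 fps, ...),
            // optionally blending between the last two logic states for smooth motion.
            renderFrame(accumulator / logicDt);
        }
    }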

This, incidentally, is why many console ports locked at 30fps are much better performers on consoles compared to more scalable games - they have their internal logic also locked at 30fps. Sometimes you'll get PC ports that support higher frame rates, but end up introducing bugs in the process (e.g. Spyro Reignited Trilogy and the infamous jumping bug, which caused the game to be unplayable on 144Hz monitors and a little more difficult with 60Hz refresh).
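For what it's worth, the classic way such bugs creep in is logic written in per-frame units instead of per-second units. A simplified, hypothetical illustration (not the actual code of any of the games mentioned): position is integrated with delta time but gravity is applied once per frame without it, so the same jump reaches a lower peak at 144 fps than at the 60 fps it was tuned for.

    #include <cstdio>

    // Frame-rate-dependent jump: position uses delta time, gravity does not.
    double buggyJumpPeak(int fps) {
        double dt = 1.0 / fps;
        double velocity = 12.0;               // metres per second, upward
        double gravityPerFrame = 9.8 / 60.0;  // tuned for 60 fps, applied per frame
        double height = 0.0, peak = 0.0;
        while (velocity > 0.0 || height > 0.0) {
            height += velocity * dt;          // correct: scaled by frame time
            velocity -= gravityPerFrame;      // bug: NOT scaled by frame time
            if (height > peak) peak = height;
            if (height < 0.0) break;
        }
        return peak;
    }

    int main() {
        std::printf("Peak at  60 fps: %.2f m\n", buggyJumpPeak(60));   // the intended height
        std::printf("Peak at 144 fps: %.2f m\n", buggyJumpPeak(144));  // noticeably lower
    }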
Post edited December 13, 2019 by _ChaosFox_
I guess the "problem" is two-fold:

1. Many (mainly older) games can't be played at 144 fps; either they are locked to a max of 60 fps (or even lower) or something breaks in the game physics above 60 fps.

That is life I guess. I recall when I was playing System Shock 2, bumping the resolution to 1920x1080 made the user-interface and all the text in the game too small to read (for me anyway, on a smaller laptop screen). Also with many games (probably also SS2), if you use newer 16:9 resolutions, the image gets stretched.

My solution? I played the game at 800x600, and dealt with it. Hey it is an older game, so that is a more genuine experience anyway, playing the game as it was played back then.

2. Newer games can't reach 144 fps with 4K resolutions and full details.

What can I say? Buy a faster graphics card, sucka!
skeletonbow: From 60 to 120 it's quite noticeable, both visibly and, more importantly, in reaction time. Above that it continues to give improved performance with regard to reaction time, but it is a law of diminishing returns. I agree that the price premium for a higher frame rate is not worth it for my own gaming and usage, and for many others out there also, as the price-to-benefit ratio has to make sense to the person making the decision for their own use case. There's no right or wrong, just what someone values more: keeping more money in their wallet, or minimizing reaction time/latency in games where it matters and is noticeable.
These are all good points, and what I was also trying to illustrate before.

Agreed 100% :)

skeletonbow: As for whether or not someone will notice it, that is more a factor of what game they are playing, how they're playing it, and whether it matters for that type of game and gameplay or not. It's primarily a massive benefit for reducing input latency and reaction time in first person shooter games, in particular competitive ones. If someone isn't playing FPS games they can still potentially notice the smoother animation, but it won't necessarily have major benefits like it does in FPS games. It won't turn someone from a horrible player into a great player, but it will improve their reaction time significantly and increase their kill rate.
For non-competitive/MP games one can make up for that by lowering the difficulty/changing some enemy spawn rates (in some games), getting more used to where and when to hide/shoot, or just plain cheating.

In competitive gaming one can also train to know when/where to wait and shoot to make up somewhat for lower monitor refresh rates (though likely not as well as if they had such hardware).

skeletonbow: Linus Tech Tips has done 2 videos on the topic in recent times, here is the latter more scientific one, testing with both expert players like Shroud, and with potato players like Linus himself and some other tech guy I forget who are both mediocre. The results they got from all of the tests are quite interesting.
An aside: I find it a bit funny that the guy who owns the channel isn't as good as others he features on his channel.

skeletonbow: The benefits are quite clear for all players whether they "can see it or not" really, it's all about reaction time and even potato players get a massive boost in reaction time and seriously improved kill rate.
Yes, but if it improved so much that their brains/eyes couldn't track it, then that extra responsiveness wouldn't help them as much (if the tech got that far, that is).

skeletonbow: For me the answer is no for my own gaming, but yes for others who are going to benefit from it and care enough to spend the money. I'd much rather have higher resolution, greater pixel density, better colour reproduction, 16:10 aspect ratio and other characteristics in a monitor that matter more to me than > 60fps or Freesync/Gsync etc.
That's what I look for for the most part as well. :)

skeletonbow: But that's only something that had to be decided over the last few years... in a number of years time the decision will be made for all of us more likely than not by vendors making all monitors capable of high frame rates so the question becomes more moot. :)
Hmm, I wonder if we'll see a GoM(Good Old Monitors) in the future? ;)
=====================================================

timppu: Also with many games (probably also SS2), if you use newer 16:9 resolutions, the image gets stretched.
The SS2Tool-patched version supports those resolutions and looks pretty nice to boot....as evidenced by the screenshots in my let's plays (if you want to check whether my claims pass muster). :)

timppu: What can I say? Buy a faster graphics card, sucka!
Newer graphics cards with better specs allow better games to be made/sold, and better games help sell more new video cards....it's a lucrative, mutually reinforcing marketplace.
Post edited December 13, 2019 by GameRager
skeletonbow: Linus Tech Tips has done 2 videos on the topic in recent times, here is the latter more scientific one, testing with both expert players like Shroud, and with potato players like Linus himself and some other tech guy I forget who are both mediocre. The results they got from all of the tests are quite interesting.
GameRager: An aside: I find it a bit funny that the guy who owns the channel isn't as good as others he features on his channel.
Yeah, he's a bit of a doofus at times. :)

skeletonbow: The benefits are quite clear for all players whether they "can see it or not" really, it's all about reaction time and even potato players get a massive boost in reaction time and seriously improved kill rate.
GameRager: Yes, but if it improved so much that their brains/eyes couldn't track it, then that extra responsiveness wouldn't help them as much (if the tech got that far, that is).
Sure, as we improve things we reach a law of diminishing returns, but additional improvements to things like resolution, pixel density, refresh rate and bit depth (bits per channel) do all continue to provide benefits; they just get smaller over time, and we'll eventually reach a point where the additional expense won't be worth it for longer and longer periods of time. With regards to frame rate it's not clear where that point is, but 240Hz seems to be close to the point where going much higher gives no tangible further gain. Resolution won't top out for a while to come yet because it is constrained by pixel density as well, so it depends on the physical size of the display and the density. 8K displays, for example, make a lot more sense if they're 85" than if they're 13". :)
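To attach numbers to the pixel-density point (a rough sketch that assumes 16:9 panels; the sizes are just examples):

    #include <cmath>
    #include <cstdio>

    // Pixels per inch for a 16:9 panel with the given resolution and diagonal size.
    double ppi(int horizontalPixels, int verticalPixels, double diagonalInches) {
        return std::sqrt(double(horizontalPixels) * horizontalPixels +
                         double(verticalPixels) * verticalPixels) / diagonalInches;
    }

    int main() {
        std::printf("8K at 85\": %.0f ppi\n", ppi(7680, 4320, 85.0));  // ~104 ppi
        std::printf("8K at 13\": %.0f ppi\n", ppi(7680, 4320, 13.0));  // ~678 ppi
        std::printf("4K at 27\": %.0f ppi\n", ppi(3840, 2160, 27.0));  // ~163 ppi
        // A 13" 8K panel would be far denser than anyone could resolve at normal
        // viewing distances, which is why 8K mainly makes sense on very large screens.
    }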

We've got HDR with 10 bits per channel, but it's not really being used much anywhere yet. That will catch on more in the future, but I doubt it'll be a big thing for a number of years. I don't think we'll see bits per channel increase much beyond this; it has taken a long time just to make the move from 8bpc to 10bpc.
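For concreteness on what the step from 8 to 10 bits per channel buys (simple arithmetic, nothing vendor-specific):

    #include <cstdio>

    int main() {
        // Distinct levels per colour channel and total colours for 8 vs 10 bits per channel.
        const int depths[] = {8, 10};
        for (int bits : depths) {
            long long levels  = 1LL << bits;               // 2^bits steps per channel
            long long colours = levels * levels * levels;  // three channels (R, G, B)
            std::printf("%2d bpc: %4lld levels per channel, %lld colours\n", bits, levels, colours);
        }
        // 8 bpc -> 256 levels (~16.8 million colours); 10 bpc -> 1024 levels (~1.07 billion),
        // which is what gives HDR gradients room to avoid visible banding.
    }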

So within a decade or two we'll likely have hit the limits of what is feasible to do with these parameters and will improve in other areas instead, I guess.


skeletonbow: But that's only something that had to be decided over the last few years... in a number of years time the decision will be made for all of us more likely than not by vendors making all monitors capable of high frame rates so the question becomes more moot. :)
GameRager: Hmm, I wonder if we'll see a GoM(Good Old Monitors) in the future? ;)
I've got a bunch of CRTs in my basement if anyone's looking for one... :)
skeletonbow: Yeah, he's a bit of a doofus at times. :)
*YT channel name unclear, burned down half of house*

skeletonbow: So within a decade or two we'll likely have hit the limits of what is feasible to do with these parameters and will improve in other areas instead, I guess.
Yes, like holo displays and mayhaps true VR if possible o.0

skeletonbow: I've got a bunch of CRTs in my basement if anyone's looking for one... :)
Well they won't be doing much looking if they stare at em for too long. ;)
timppu: What can I say? Buy a faster graphics card, sucka!
After the new consoles come out everyone will be doing that, but unless you get a 3080ti I'd bet the same issue persists. On max settings anyway. I've been turning things down to get 100+fps and it's usually alright. There are some really demanding games I couldn't get that high no matter what I did though, like Control.

Either way, at least it makes sense that new games won't run at super high framerates. They're designed to run at 30fps on current consoles usually, so even 60fps is using a lot of the PC's added power. It's the older games that break over 60fps that I find more disappointing.

P.S. 99% certain there's a proper widescreen mode for SS2 in the modern patches or mods or whatever.