Everyone raves about high framerate gaming, and they're right. When running a game at 144fps with no issues, it's an incredibly smooth-feeling experience. However, I'm finding that at least half the time, if not more, high framerates cause issues that ruin that feeling.

First off, tons of old games break when run above 60fps. I'm making this thread because Star Wars hype had me reinstall KotOR 2 today, and even though it has a 144Hz setting, it still breaks the game, messing with the physics and stopping you from being able to move after combat. The solution on every forum? Limit fps to 60. This kind of thing happens all the time, even with relatively recent games like Fallout 4. Games are so designed around the 60fps standard (or even 30fps on consoles) that 144fps just causes endless problems.

Secondly, once you get used to 144fps it's hard to go back to 60... which sucks, because not only do a ton of games force you back to 60fps, but newer games are also impossible to run at 144fps, even with a crazy expensive video card. I got The Outer Worlds to run at 120ish, but only on medium settings, when I could have been playing on ultra at a locked 60. How far down will I have to turn the settings for Cyberpunk? Probably insanely low, or I'll just deal with 60fps.

Anyway, this is just a thread to vent about how I wish I hadn't gotten a 144Hz monitor. I only get the true, smooth 144fps experience on VERY rare occasions, and before getting that experience I never knew what I was missing.
Post edited December 09, 2019 by StingingVelvet
Speaking of which, I will never understand the recent trend of getting 144+Hz screens for laptops. Let's not even talk about 4k screens on them silly things. The hardware power to drive that just isn't there. All it means is that the thing needs more power, creates more heat, and will end up sounding like a goddamn jet engine ready for take-off. That's why I made a point of getting a 60Hz screen in the lappie I just bought over the Black Friday weekend.

So yeah, this whole sentiment of "more = better" is just stupid.
Post edited December 09, 2019 by Mr.Mumbles
I see 144Hz as a bonus. Newer games seem to be able to make use of it more often. Other than that, isn't limiting FPS a simple affair? For example, AMD drivers have a built-in slider where you can set the desired FPS per game profile. I've never had to use it, since I've only recently got a 144Hz monitor, but I know it's there.

Now that you mention it... I've been playing Druidstone at 144Hz, and I've been getting plenty of crashes, to the point that in longer missions, I ended up saving after every turn. I assumed the game was broken, but I'll have to test if it's stable at 60FPS.
It is nice to be able just to compare components by looking at which number is bigger.
I have a 144Hz monitor with G-Sync. I have set the monitor to limit it to 90 and haven't had any issues with it on anything I've played. Am very happy with it.

One thing that does really wind me up, though: it's an ultrawide, and most games, even though these have been around for donkey's years, still don't properly support them. Either messed-up FOV or clipped/shrunk. It's really just not good enough. Darksiders Genesis is the latest title: if you're running 3440x1440 it's all zoomed in (scaled to the lower edge), which ruins the FOV. It's not like this is new tech!
StingingVelvet: Everyone raves about high framerate gaming, and they're right. When running a game at 144fps with no issues, it's an incredibly smooth-feeling experience. However, I'm finding that at least half the time, if not more, high framerates cause issues that ruin that feeling.
When they work well they're good, but physics tied to 60fps is an issue. Sudden fps drops from 144Hz to 60Hz are also more jarring than 75Hz to 60Hz, and those short, sharp drops are something even Freesync / G-Sync don't always cope well with. "Expectation drift" is also a real thing, i.e. if you get used to 144Hz to the point where you can't go back to 60Hz, gaming gets exponentially more expensive from then on (top-end CPU & GPU needed for heavier games, etc). I went back and forth for a while between 144Hz and 75Hz, and since I play entirely single-player games, I don't regret picking the latter. Freesync goes a long way for perceived smoothness at lower fps, and it's now supported on nVidia cards too.
nightcraw1er.488: One thing that does really wind me up, though: it's an ultrawide, and most games, even though these have been around for donkey's years, still don't properly support them. Either messed-up FOV or clipped/shrunk. It's really just not good enough. Darksiders Genesis is the latest title: if you're running 3440x1440 it's all zoomed in (scaled to the lower edge), which ruins the FOV. It's not like this is new tech!
What engine does it use? Believe it or not, ultrawide support in Unreal Engine 4 is far worse than in UE1-3 due to the way they've changed the default FOV lock, i.e. UE4 devs have to manually set a different default if they want ultrawide to actually have extra width rather than zoom and crop.
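Roughly, this is the kind of override a dev has to add. Just a sketch from memory, assuming the engine's ULocalPlayer::AspectRatioAxisConstraint setting; the helper name here is made up, and most projects would set the equivalent in their config instead:

```cpp
// Sketch only: assumes UE4's ULocalPlayer::AspectRatioAxisConstraint property.
// EnableHorPlusUltrawide is a made-up helper a dev could call once a local
// player exists, e.g. from a custom PlayerController's BeginPlay().
#include "Engine/LocalPlayer.h"
#include "GameFramework/PlayerController.h"

void EnableHorPlusUltrawide(APlayerController* PC)
{
    if (PC == nullptr)
    {
        return;
    }
    if (ULocalPlayer* LocalPlayer = PC->GetLocalPlayer())
    {
        // Hold the vertical FOV constant so a 21:9 viewport gains extra width,
        // instead of holding the horizontal FOV and cropping the top and bottom.
        LocalPlayer->AspectRatioAxisConstraint = AspectRatio_MaintainYFOV;
    }
}
```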
Post edited December 09, 2019 by AB2012
StingingVelvet: Everyone raves about high framerate gaming, and they're right. When running a game at 144fps with no issues, it's an incredibly smooth-feeling experience. However, I'm finding that at least half the time, if not more, high framerates cause issues that ruin that feeling.

First off, tons of old games break when run above 60fps. I'm making this thread because Star Wars hype had me reinstall KotOR 2 today, and even though it has a 144Hz setting, it still breaks the game, messing with the physics and stopping you from being able to move after combat. The solution on every forum? Limit fps to 60. This kind of thing happens all the time, even with relatively recent games like Fallout 4. Games are so designed around the 60fps standard (or even 30fps on consoles) that 144fps just causes endless problems.

Secondly, once you get used to 144fps it's hard to go back to 60... which sucks, because not only do a ton of games force you back to 60fps, but newer games are also impossible to run at 144fps, even with a crazy expensive video card. I got The Outer Worlds to run at 120ish, but only on medium settings, when I could have been playing on ultra at a locked 60. How far down will I have to turn the settings for Cyberpunk? Probably insanely low, or I'll just deal with 60fps.

Anyway, this is just a thread to vent about how I wish I hadn't gotten a 144Hz monitor. I only get the true, smooth 144fps experience on VERY rare occasions, and before getting that experience I never knew what I was missing.
Interesting, because I'm thinking about upgrading from 1080p/75Hz to a 144Hz/1440p combo, but maybe I should go for a lower-refresh monitor.
StingingVelvet: Everyone raves about high framerate gaming, and they're right. When running a game at 144fps with no issues, it's an incredibly smooth-feeling experience. However, I'm finding that at least half the time, if not more, high framerates cause issues that ruin that feeling.
AB2012: When they work well they're good, but physics tied to 60fps is an issue. Sudden fps drops from 144Hz to 60Hz are also more jarring than 75Hz to 60Hz, and those short, sharp drops are something even Freesync / G-Sync don't always cope well with. "Expectation drift" is also a real thing, i.e. if you get used to 144Hz to the point where you can't go back to 60Hz, gaming gets exponentially more expensive from then on (top-end CPU & GPU needed for heavier games, etc). I went back and forth for a while between 144Hz and 75Hz, and since I play entirely single-player games, I don't regret picking the latter. Freesync goes a long way for perceived smoothness at lower fps, and it's now supported on nVidia cards too.
nightcraw1er.488: One thing that does really wind me up, though: it's an ultrawide, and most games, even though these have been around for donkey's years, still don't properly support them. Either messed-up FOV or clipped/shrunk. It's really just not good enough. Darksiders Genesis is the latest title: if you're running 3440x1440 it's all zoomed in (scaled to the lower edge), which ruins the FOV. It's not like this is new tech!
AB2012: What engine does it use? Believe it or not, ultrawide support in Unreal Engine 4 is far worse than in UE1-3 due to the way they've changed the default FOV lock, i.e. UE4 devs have to manually set a different default if they want ultrawide to actually have extra width rather than zoom and crop.
Yep, seems to be Unreal 4. So it's either a half-broken Unreal engine or telemetry-bloatware Unity engine backends.
StingingVelvet: Everyone raves about high framerate gaming, and they're right. When running a game at 144fps with no issues, it's an incredibly smooth-feeling experience. However, I'm finding that at least half the time, if not more, high framerates cause issues that ruin that feeling.

First off, tons of old games break when run above 60fps. I'm making this thread because Star Wars hype had me reinstall KotOR 2 today, and even though it has a 144Hz setting, it still breaks the game, messing with the physics and stopping you from being able to move after combat. The solution on every forum? Limit fps to 60. This kind of thing happens all the time, even with relatively recent games like Fallout 4. Games are so designed around the 60fps standard (or even 30fps on consoles) that 144fps just causes endless problems.

Secondly, once you get used to 144fps it's hard to go back to 60... which sucks, because not only do a ton of games force you back to 60fps, but newer games are also impossible to run at 144fps, even with a crazy expensive video card. I got The Outer Worlds to run at 120ish, but only on medium settings, when I could have been playing on ultra at a locked 60. How far down will I have to turn the settings for Cyberpunk? Probably insanely low, or I'll just deal with 60fps.

Anyway, this is just a thread to vent about how I wish I hadn't gotten a 144Hz monitor. I only get the true, smooth 144fps experience on VERY rare occasions, and before getting that experience I never knew what I was missing.
That's precisely the reason I went with 60Hz instead of 144 for my personal use: playing older and simpler games, mostly single-player, doesn't require a high refresh rate. Many 144Hz monitors look like crap when used at 60Hz.
Freesync helps a lot, but it doesn't always work on older games.
With that said, most games support 75Hz and work better at it than at 60.
theslitherydeee: It is nice to be able just to compare components by looking at which number is bigger.
Pretty much this.
Every freaking big shop lists the price and some arbitrary meta number to justify the pricing.

If anyone wants a new monitor, please go to a shop and look at them in person, because the numbers (except maybe resolution) mean absolutely nothing, including response times, contrast, brightness, etc...

Btw, Digital Foundry recently did a video talking about the benefits of older CRT monitors :D (there are some nasty downsides as well).
Post edited December 09, 2019 by Dark_art_
Oh my, I am sooo behind the curve. 60fps feels like shit if you are used to 144fps? I presume 1920x1080 also seems like a blocky mess if you are used to 3840x2160.

I am happily playing games like Team Fortress 2 at 1280x720 resolution, running at either 60 or even 30fps. If I get 60fps at 1920x1080, I am more than happy and couldn't really ask for more.

And so it goes...
Mr.Mumbles: Speaking of which, I will never understand the recent trend of getting 144+Hz screens for laptops. Let's not even talk about 4k screens on them silly things. The hardware power to drive that just isn't there.
The main benefit of 4k on laptops (and arguably desktops, to some extent) is productivity. It doesn't take a lot of hardware to drive 4k in that context, and it makes all the difference.
I didn't know about this issue, especially for relatively new titles.

I don't understand why 3D games even consider the framerate for physics O_o

Btw, isn't it possible to simply limit it per game in the GPU panel?
Post edited December 09, 2019 by phaolo
I have a 24-inch, 2560x1440, 165Hz, G-Sync monitor and it is incredible. For any problematic games, you can pick from a lot of refresh rate limits in the Nvidia control panel, ranging from 24 to 165. The good thing about G-Sync is that it makes the monitor behave like it natively has the refresh rate you picked. So it's an easy solution for any game that goes bonkers above 60Hz, without having to change anything in the game's config files or use an external limiter.

I've had FEWER issues with old games and an indescribably better experience with new games ever since I got it. And playing fast-paced games like UT 2004 at 165Hz is like playing a completely different game.

Not sure about how it is with AMD, but I've had a great experience with Nvidia and G-Sync.

Also, don't get a high refresh rate monitor if you don't have enough horses in your stable to throw into the furnace.
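For what it's worth, an fps cap is conceptually just a loop like this. A toy sketch with made-up update()/render() stand-ins, not how the driver-level or in-game limiters are actually implemented:

```cpp
#include <chrono>
#include <thread>

// Toy frame cap: never let the loop run faster than cap_fps.
int main() {
    using clock = std::chrono::steady_clock;
    const double cap_fps = 60.0;
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_fps));

    auto frame_start = clock::now();
    for (int frame = 0; frame < 600; ++frame) { // stand-in for "while the game runs"
        // update(dt); render();  // the real per-frame work would go here

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget) {
            std::this_thread::sleep_for(frame_budget - elapsed); // give back the spare time
        }
        frame_start = clock::now();
    }
    return 0;
}
```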
I'm also one for old games, so this thread has been very helpful in educating me on the limited benefits of modern monitor trends.

I think I'll hold onto my U2711 for another few years - it seems the only benefit would be power efficiency.
phaolo: I didn't know about this issue, especially for relatively new titles.

I don't understand why 3D games even consider the framerate for physics O_o

Btw, isn't it possible to simply limit it per game in the GPU panel?
Because at one time the framerate was a constant (much like how game timing was once based on CPU cycles), so the frame itself gets used as the timing mechanism for measuring things like movement and acceleration through space. I learned a lot about this when fiddling with framerates in Dark Souls and dealing with bugs from From Software in DS 2.
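To put it in code terms, the difference is roughly this. A made-up sketch, not code from any actual game, with illustrative names only:

```cpp
#include <cstdio>

// (1) Frame-locked update: assumes every frame lasts exactly 1/60 s.
//     Run it at 144 fps and everything moves 2.4x too fast; physics,
//     timers and animation all drift the same way.
void update_frame_locked(float& position, float velocity) {
    position += velocity * (1.0f / 60.0f);
}

// (2) Delta-time update: scales by the real time the last frame took,
//     so the result is the same at 30, 60, 75 or 144 fps.
void update_delta_time(float& position, float velocity, float dt_seconds) {
    position += velocity * dt_seconds;
}

int main() {
    float pos_locked = 0.0f, pos_dt = 0.0f;
    const float velocity = 1.0f;            // 1 unit per second
    const float frame_time = 1.0f / 144.0f; // pretend we're running at 144 fps

    // Simulate one second of game time at 144 fps.
    for (int frame = 0; frame < 144; ++frame) {
        update_frame_locked(pos_locked, velocity);
        update_delta_time(pos_dt, velocity, frame_time);
    }

    // frame-locked ends up at ~2.40 units, delta-time at ~1.00 unit.
    std::printf("frame-locked: %.2f, delta-time: %.2f\n", pos_locked, pos_dt);
    return 0;
}
```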
Post edited December 09, 2019 by paladin181