For many years I have used 60hz monitors at 1080p and been more than happy. A few months ago I purchased a 60-144hz 1080p monitor and since then I have been so used to higher framerates I find it hard going back to 60hz.

I know many hate vsync but I love playing all my games with my Titan X at a butter smooth vsynced 100fps or 120fps (game dependent). It looks soooo much nicer.
Recently, I was tempted to jump to 1440p but if this means dropping to 60hz/60fps I am not sure.

So my question is -
Is this resolution jump really worth the smoothness sacrifice I am enjoying with my 100+ framerates?

Thanks in advance.
TheHoff: For many years I have used 60hz monitors at 1080p and been more than happy. A few months ago I purchased a 60-144hz 1080p monitor and since then I have been so used to higher framerates I find it hard going back to 60hz.

I know many hate vsync but I love playing all my games with my Titan X at a butter smooth vsynced 100fps or 120fps (game dependent). It looks soooo much nicer.
Recently, I was tempted to jump to 1440p but if this means dropping to 60hz/60fps I am not sure.

So my question is -
Is this resolution jump really worth the smoothness sacrifice I am enjoying with my 100+ framerates?

Thanks in advance.
That's really a personal thing to decide and different people will give completely opposite answers. You really need to try it for yourself and determine which you prefer.

With LCD displays I've never used or seen anything other than 60Hz so I have no frame of reference beyond that, however I very much notice the benefits of higher resolution in every game that I can get to play at as high a resolution as possible, preferably the native resolution of my display (2560x1600). With some games there are problems due to the HUD not scaling, or the mouse pointer becoming too small and hard to find, or fonts too small, and so I need to use mods or tweak an INI file or something, or else lower the resolution back down, but the overwhelming majority of games I play look best to me at the highest possible resolution I can squeeze out. There is a tradeoff between that and game FPS though; there's a minimum frame rate I won't go below, and that varies from game to game also. My normal minimum is about 40FPS, and if the frame rate drops below that, I'll lower graphics options or resolution to try to bring the FPS back up. In some games I can get by with lower FPS, especially if I enable motion blur for example. But this is all talking about FPS below 60 due to GPU/CPU constraints rather than pushing things in the other direction.

For going higher than 60Hz on LCD I sadly have no frame of reference. With CRT displays I could personally see the difference in frame rate visually up to around 90Hz or so, and after that it was not so noticeable to me, but CRTs and LCDs are very different in terms of how the refresh rate interacts with the human eye, persistence of vision, etc. Personally I think I'd go for the native resolution of my display, and as long as the FPS is 40-60+ I'd probably be happy, but then I haven't seen games on a 120Hz display either. I would imagine the extra FPS greatly reduces input-to-eye lag as well, which could be handy in some games.

In the end, someone else's preferences are not necessarily going to match what you prefer, so you're going to have to test it out yourself. Thankfully that takes all of 30 seconds or so. :)
I'm fully in the framerate/vsync camp. I despise tearing. I think it's a testament to how fucked and disreputable the games "industry" is that it can release games with "optimization" problems and shitty framerates. It's developers not having any fucking restraint, is what it is. Publishers playing them off against each other is very real, but I was beginning to take a dimmer view of developers even before that article came out the other day. It's a farce that games come out broken. All games should at least be halfway stable on day 1, to the point that bugs are a rarity, and they should all manage 60 fps. How good they look should be a matter of how good they look at 60 fps. When other industries get up to this kind of shit there are lawsuits and watergate scandals and money lost. Games? Business as usual.

But I'm getting way off topic. I don't like the "higher-res, lower framerate" crap, and like you I find myself jarred by the noise it creates when you have to decide between "smooth and second-rate" (nVidia always has such a positive impact on videogames!) and a higher resolution that unlocks the full potential!

Anyway, your question is incredibly subjective and light on actual substantive data. It's the kind of question that gets you attacked for being too vague. The type of games you play also matters. The best thing for you to do is hit up an electronics store where they have models on display, ideally a good model that can do it all, so you can compare high resolution against high framerate for yourself.
johnnygoging: <snip>
The type of games you play also matters.
<snip>
Indeed, that's another important thing I forgot to mention. Framerate in The Witcher Adventure Game could be 10FPS and it wouldn't really even be noticed, whereas 10FPS in The Witcher 3 would be brutal. Similar for point-n-click adventure games - framerate really won't matter since they're mostly a static screen, although due to that they can probably push 32423423 FPS anyway. :)
I've never had a monitor that goes above 1920x1080 at 60Hz.
Thanks guys.

The smoothness I am talking about is when I am playing first person shooters, when you pan and strafe. Even driving games when you are cornering. At 60fps you see the blurring even when it is vsynced.
At 100hz vsynced, this blur is hugely reduced. I have a top end BenQ gaming monitor which is designed to reduce motion/panning blur, but at 60hz it is still there on mine and everyone else's LCD/LED monitor.
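
To put rough numbers on it (a back-of-the-envelope sketch with an assumed pan speed, not a measurement): on a sample-and-hold LCD each frame stays on screen for the whole refresh period while your eye keeps tracking the motion, so the image smears across roughly pan speed times hold time.

```python
# Rough sketch: per-frame smear on a sample-and-hold display while panning.
# The pan speed below is an assumption for illustration, not measured data.
pan_speed_px_per_s = 1200  # assumed horizontal pan speed during a fast turn

for refresh_hz in (60, 100, 144):
    hold_time_s = 1.0 / refresh_hz          # how long each frame is held
    smear_px = pan_speed_px_per_s * hold_time_s
    print(f"{refresh_hz:>3} Hz: ~{smear_px:.1f} px of smear per frame")
```

At that pan speed it works out to about 20 pixels of smear per frame at 60Hz versus about 12 at 100Hz and roughly 8 at 144Hz.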

Basically, I know what I am going to see "Hz" wise; I just wanted to know whether the jump from 1080p to 1440p is good enough that I can accept the drop from 100+hz to 60hz.
Post edited February 20, 2016 by TheHoff
TheHoff: A few months ago I purchased a 60-144hz 1080p monitor

Titan X

So my question is -
Is this resolution jump really worth the smoothness sacrifice I am enjoying with my 100+ framerates?
You should be using a multi monitor setup in that case since there is no single monitor solution.
johnnygoging: <snip>
The type of games you play also matters.
<snip>
skeletonbow: Indeed, that's another important thing I forgot to mention. Framerate in The Witcher Adventure Game could be 10FPS and it wouldn't really even be noticed, whereas 10FPS in The Witcher 3 would be brutal. Similar for point-n-click adventure games - framerate really won't matter since they're mostly a static screen, although due to that they can probably push 32423423 FPS anyway. :)
Also, I believe that some games (mostly ancient turn-based games) don't even constantly update the screen. For example, in a game like Nethack, if you don't enter any commands, and you don't resize the terminal window, the game does not update the screen; it just waits for input forever. In other words, the game is running at 0 FPS most of the time, but is still perfectly playable.

I think you might see this sort of situation with early Wizardry games (1-5, though 4 has some real time events), and with Might and Magic 1-2. It's only once you get into games that have constant animations (like MM3 and the Bard's Tale series) that you start having to update the screen regularly.
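
A minimal sketch of that kind of input-driven loop (my own illustration using Python's curses module, not code from any of those games): the screen is only redrawn in response to a keypress, so while the game waits for input it is effectively running at 0 FPS.

```python
# Sketch of an event-driven "0 FPS while idle" loop, NetHack-style:
# the screen is redrawn only when the player presses a key.
import curses

def event_driven(stdscr):
    turn = 0
    stdscr.addstr(0, 0, f"turn {turn} - press any key, q to quit")
    stdscr.refresh()
    while True:
        key = stdscr.getch()   # blocks until input; nothing is redrawn while waiting
        if key == ord('q'):
            break
        turn += 1
        stdscr.erase()
        stdscr.addstr(0, 0, f"turn {turn} - press any key, q to quit")
        stdscr.refresh()       # the only place the screen gets updated

if __name__ == "__main__":
    curses.wrapper(event_driven)
```

A game with constant animations can't do this; it has to redraw on a timer whether or not the player does anything, which is where a steady framerate starts to matter.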
Only in the last couple of years have I gotten used to somewhat higher resolutions, by which I mean usually 1280x720. More often I get annoyed when the resolution is too high, mostly because of the demands of rendering it, not to mention some games have no idea how to scale fonts up, so they get smaller and smaller and tinier and tinier until they are unreadable. I enjoy easily readable fonts that aren't too big or small.

Framerate... the higher the better, but honestly I think 60 is probably enough. True, I'm not quite sure what the difference above 60 is, but there's a distinct difference when I play a game and then re-watch a recording at a significantly lower framerate than what I played at, and I'm blown away (Super Hexagon is a perfect example).

All in all, I played games at the really, really low resolutions, NES or 320x200 resolutions, and if the gameplay is stable and enjoyable the resolution doesn't really matter (although scanlines look better than not having them; this is for the 16-bit consoles). But the higher the graphical push, the more resolution you sort of need; I wouldn't dream of playing a 3D game at less than 1024x768, but sprite games can go as low as they need to.

AA, supersampling and other effects take a back seat. Often I enjoy playing games without taxing my CPU or graphics card (the last thing I need is loud fans because they are heating up a lot). Sure, I'll enable them when I need to, but if I can do without them I will try. Vsync when appropriate; tearing is especially jarring and breaks all immersion when half the screen isn't in sync with what just happened, or it tears in 2-4 locations.

Finally there's bit depth. Most games/systems run at 32-bit natively all the time these days, but it wasn't that long ago that games ran 8-bit palettes, which made rendering, calculating and blitting the screens to the monitors and graphics cards lightning fast, although resolution was usually capped at something like 800x600. The heavy push for 3D in recent years has done away with 8-bit palettes, but I never EVER had issues while playing with 256 colors. (Well, there was once... but that had to do with the game crashing because something like 3,000 enemies all got struck with Fist of the Heavens at once, which ate all my RAM and took 15 minutes before the game fully crashed.)
My opinion is that everyone sees this differently, so it's completely up to your own preferences whether you prefer resolution over a high frame rate.
TheHoff: ...
First world problems.

Here in the harsh north, I sometimes need to lower the resolution in some newer games to e.g. 1280x720 (as well as disabling some graphics options like edge antialiasing etc.), just to ensure a vsynced 30 fps. I am not sure if I am getting even that in e.g. The Witcher 3.

But I don't complain, I love whatever I get! I am just happy even with a measly 30 fps, even if it means I can't eat anything for the rest of the day.

So don't come telling me you had harsh times in your youth, I had it worse. Much Worse! When I was a kid, we'd just stare at a blank computer screen (0 fps), and loved it!
Post edited February 20, 2016 by timppu
I prefer higher fps. Gameplay smoothness comes first IMO.
It is really nice to play at a higher resolution and all, but if that means a lower framerate, try to tweak the game settings to make sure the framerate doesn't take a plunge.

Also, Titan X... YOU SMUG! :p
I have a couple of old 1600x900 Acer monitors. When I am working, I sometimes use a dual monitor setup, because it's useful. Not for gaming.

I play at the native resolution and I use vsync in games where my card tends to run away with things. For instance, Deus Ex will sometimes spike into the 1000s of fps, and that's just generating heat.
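
The idea behind capping it is simple (a rough sketch of my own, not how Deus Ex or the driver actually implements it): instead of rendering frames as fast as the card can and turning the surplus into heat, the loop sleeps off whatever is left of each frame's time budget, which is roughly what vsync does for you in hardware.

```python
# Sketch of a software frame cap: render, then sleep until the next frame
# deadline instead of immediately rendering another frame nobody will see.
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for the actual rendering work

def capped_loop(duration_s=2.0):
    frames = 0
    start = time.perf_counter()
    next_deadline = start + FRAME_TIME
    while time.perf_counter() - start < duration_s:
        render_frame()
        frames += 1
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # rest for whatever is left of this frame's budget
        next_deadline += FRAME_TIME
    print(f"{frames / duration_s:.1f} fps average")

if __name__ == "__main__":
    capped_loop()
```

Either way the card only does 60 frames' worth of work per second instead of a thousand.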

I don't really want to get used to higher resolutions and frame rates over 60, because I can't afford to cater to those desires. So, remaining happy with what I've got is paramount.

When I buy a new monitor it will likely be another 1600x900.
I have a 1440x900 display. I prefer rendering fewer pixels in order to extract as much detail and as many frames as I can. I don't notice frame rate drops like some people do. I don't think I play the games that bother people with frame issues (I rarely play FPS games), so that may be a part of it. And I'm used to the blur, although my TV is far worse than my monitor.

Anyway, if your current monitor is good and you're looking for ways to spend money, I would recommend donating some money to a local charity. You'll feel better than you would with lower frames and more resolution.
I never understood why anyone would take a higher resolution with poorer performance. It's a little backwards. But that is just me. >_<