I have a few questions about monitors that I was hoping someone here at GOG might be able to help me with. I had a recent experience with Fallout 4 that made me realize just how important a setting refresh rate is, and that has brought up more questions that I don't know the answers to. Here goes:

Assuming I have the proper cables (we'll make that assumption even if in practice I don't yet) and I have a monitor that says its refresh rate is 144Hz, when I go into the Nvidia control panel (or whatever control panel one uses), are the only options between 60Hz and 144Hz that show up likely to be... just 60Hz and 144Hz? In other words, if I thought my graphics card was capable of a steady 100 FPS, wouldn't it be better to have the refresh rate set at 100Hz? (100 frames per second and 100 refreshes per second would be like a "virtual" adaptive sync, wouldn't it?)

After all, that's what G-Sync and FreeSync do... match refresh rate to FPS. They do it on the fly, however, which results in better matches. But if you have an Nvidia card and a monitor with FreeSync, wouldn't it be advantageous to set your refresh rate at whatever you think your card is capable of running 90+% of the time? And if that refresh rate is LESS than what the monitor is advertised at... will it likely show up as an option, or are the only options usually 60Hz and whatever it's advertised at, with nothing in between? My current monitor (Crossover) has an advertised rate of 75Hz, and there is nothing in between that shows up... I either have to have it on 60 or 75. No 65. Or 70. But maybe that's just because there isn't really any "normal" rate between 60 and 75... I just don't know.

Also, how about a 21:9 monitor (3440x1440)? If you wanted to set it at a lower resolution to increase frame rates... is it going to be another case of nothing in between 1920x1080 and 3440x1440? Again, my current monitor is advertised at 2560x1440... and there is nothing showing up as an option between 2560x1440 and 1920x1080. But again, that may just be because there is no "normal" resolution between those two... I just don't know.

Basically what I'm saying is... if I want to aim for a monitor that works at 2560x1080 or 2560x1440 at 75Hz or 100Hz, do I have to limit my choices to exactly that... or is a monitor that is advertised as higher on either or both of those okay too, since I'd have the option to choose the lower settings in the Nvidia control panel????

Sorry for the length... as usual I don't know how to ask briefly. Sorry.
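For what it's worth, the list of resolution/refresh combinations a control panel offers is basically just the set of display modes the monitor and driver report (plus any custom modes you add yourself), and you can dump that list directly. Below is a minimal sketch, assuming Windows and the third-party pywin32 package (pip install pywin32); it enumerates the modes the primary display exposes, which answers the "will 100Hz even show up?" question for a given monitor.

```python
# Minimal sketch: list every display mode the primary monitor/driver exposes.
# Assumes Windows and the third-party pywin32 package.
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display, i = mode index
    except win32api.error:
        break                                       # ran out of modes
    if dm is None:
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

# Whatever prints here is what the driver will offer as "options".
# If, say, 2560x1440 @ 100 Hz isn't in this list, it won't appear in the
# control panel either (short of defining a custom resolution yourself).
for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz} Hz")
```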
Acer Predator 27" is cheaper now, just watch for light bleed. I had to return 3 monitors before I got a good one.

As much as I dislike Nvidia and their proprietary G-Sync, it is working really well.
Post edited March 10, 2018 by NovumZ
OldFatGuy: If you wanted to set it at a lower resolution
Don't. Not unless you play old DOS games that look pixelated anyway.
NovumZ: Acer Predator 27" is cheaper now, just watch for light bleed. I had to return 3 monitors before I got a good one.

As much as I dislike Nvidia and their proprietary G-Sync, it is working really well.
I returned like 8 of them and still had to settle for 1 dead pixel. The rest had more than 1, and one of them had a small hole in the top middle that was leaking light, making it pretty distracting. G-Sync is great. Acer's QC is fucking awful.
I have a 144Hz 1440p G-Sync monitor. Here are my options in the Nvidia control panel. Not sure if this is what you're asking for.

Edit: If you get a G-Sync monitor, the best thing about it is you can forget about refresh rate for the most part. Unless it's a really old game and you're getting higher than 144 fps, in which case you might still get screen tearing. In that case, use RTSS to cap your framerate at something like 140. You can set it up to only cap certain games/apps, and you can choose the fps cap for each, I believe.

You'll still feel big dips when they happen, like going from 100 fps to 70 fps for example, but only when the dip happens. I always turn off V-Sync because the mouse sometimes becomes unusable in some games, either in the menus or throughout the entire game. Also, I believe V-Sync introduces latency, which sucks for multiplayer. That's one of the reasons I got a G-Sync monitor, so I didn't have to choose between screen tearing and shitty mouse control / latency in some games.
Post edited March 10, 2018 by user deleted
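The reason a cap slightly under the panel's maximum works comes down to frame times: as long as each frame takes a little longer than the panel's shortest refresh interval, the variable-refresh range never tops out. A quick back-of-the-envelope check (plain Python, using only the numbers mentioned in the posts above):

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds."""
    return 1000.0 / fps

# A 144 Hz panel can refresh at most every ~6.94 ms.
print(f"144 Hz refresh interval: {frame_time_ms(144):.2f} ms")

# Capping the game at 140 fps means frames arrive every ~7.14 ms,
# always inside the variable-refresh window, so the top of the range
# is never exceeded and no tearing / V-Sync fallback kicks in.
print(f"140 fps frame time:      {frame_time_ms(140):.2f} ms")

# The "virtual adaptive sync" idea from the opening post: a fixed 100 Hz mode
# and 100 fps line up (10 ms each) only as long as the card truly holds
# 100 fps every single frame; any dip breaks the match, which is exactly
# what G-Sync/FreeSync avoid by adjusting on the fly.
print(f"100 Hz / 100 fps:        {frame_time_ms(100):.2f} ms")
```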
Point_Man: I have a 144Hz 1440p G-Sync monitor. Here are my options in the Nvidia control panel. Not sure if this is what you're asking for.

Edit: If you get a G-Sync monitor, the best thing about it is you can forget about refresh rate for the most part. Unless it's a really old game and you're getting higher than 144 fps, in which case you might still get screen tearing. In that case, use RTSS to cap your framerate at something like 140. You can set it up to only cap certain games/apps, and you can choose the fps cap for each, I believe.

You'll still feel big dips when they happen, like going from 100 fps to 70 fps for example, but only when the dip happens. I always turn off V-Sync because the mouse sometimes becomes unusable in some games, either in the menus or throughout the entire game. Also, I believe V-Sync introduces latency, which sucks for multiplayer. That's one of the reasons I got a G-Sync monitor, so I didn't have to choose between screen tearing and shitty mouse control / latency in some games.
YES!!! Thank you, that's exactly what I was wondering. It does appear you will have options "below" the advertised refresh rate and still above the standard 60. That's what I was hoping... it doesn't limit my options. I didn't see too many options between the native 2560x1440 and the standard HD 1920x1080, but maybe there just aren't that many between those two resolutions.

See... I'm thinking if I saw a 3440x1440 at 144Hz (ultrawide monitor), my graphics card (on this laptop) won't be able to crank that out at 144 FPS (for most new games anyway). But I do think/hope that it would do better than 60 FPS. Thus if I purchased such a monitor, I would either want to 1) drop the refresh rate to something I think my card could match (say 100) OR 2) drop the resolution so that my card could get up to 144 FPS to match the 144Hz. If I could do neither, that rules out such a monitor... which I don't want to do, because I may find a good deal on one, and it would be good for my new rig (if I ever build the damn thing) as that should be a beast.
Post edited March 10, 2018 by OldFatGuy
Point_Man: I have a 144Hz 1440p G-Sync monitor. Here are my options in the Nvidia control panel. Not sure if this is what you're asking for.

Edit: If you get a G-Sync monitor, the best thing about it is you can forget about refresh rate for the most part. Unless it's a really old game and you're getting higher than 144 fps, in which case you might still get screen tearing. In that case, use RTSS to cap your framerate at something like 140. You can set it up to only cap certain games/apps, and you can choose the fps cap for each, I believe.

You'll still feel big dips when they happen, like going from 100 fps to 70 fps for example, but only when the dip happens. I always turn off V-Sync because the mouse sometimes becomes unusable in some games, either in the menus or throughout the entire game. Also, I believe V-Sync introduces latency, which sucks for multiplayer. That's one of the reasons I got a G-Sync monitor, so I didn't have to choose between screen tearing and shitty mouse control / latency in some games.
OldFatGuy: YES!!! Thank you, that's exactly what I was wondering. It does appear you will have options "below" the advertised refresh rate and still above the standard 60. That's what I was hoping... it doesn't limit my options. I didn't see too many options between the native 2560x1440 and the standard HD 1920x1080, but maybe there just aren't that many between those two resolutions.

See... I'm thinking if I saw a 3440x1440 at 144Hz (ultrawide monitor), my graphics card (on this laptop) won't be able to crank that out at 144 FPS (for most new games anyway). But I do think/hope that it would do better than 60 FPS. Thus if I purchased such a monitor, I would either want to 1) drop the refresh rate to something I think my card could match (say 100) OR 2) drop the resolution so that my card could get up to 144 FPS to match the 144Hz. If I could do neither, that rules out such a monitor... which I don't want to do, because I may find a good deal on one, and it would be good for my new rig (if I ever build the damn thing) as that should be a beast.
Does it have G-Sync? Either way, why not keep the monitor on 144Hz? It doesn't mean the computer must run the game at 144 fps all the time; it just means it can show up to 144 fps. My GPU is a little old at this point for the latest games, so I'm not usually running the latest games at max settings at 144 fps. For instance, I've been playing Battlefield 1 and my fps fluctuates between 65 and 144. The G-Sync makes it hard to notice fluctuations unless it's a big sudden dip, like 110 to 70 for example.

Edit: Also, you can set a custom resolution in the Nvidia control panel, but if it's not a standard resolution I'm not sure what type of problems could occur. I believe you can even use it to set a resolution higher than your current display, like 4K for your 1440p monitor. So you could run a game at 4K on your 1440p monitor and it should downscale it. That could have some benefits, like fewer jaggies and other things I might not be aware of. I had 1440p running on my old 1080p monitor for a while because I wanted to see if some old games were gonna give me issues at 1440p or not before I got my 1440p monitor.
Post edited March 10, 2018 by user deleted
The options that show up for you are what your monitor supports. That's why there aren't in-between settings.

To avoid tearing without a G-Sync monitor, just turn on V-Sync. That's your best bet, not manually configuring refresh rates.

Since you don't have G-Sync, set it to 60 or 75Hz and leave it be.

If you do drop resolution, which I have to do sometimes, try it out and test if you like what you see. Some resolutions look better on certain screens, and a pass or two of antialiasing might smooth it out well enough. But it will always look much better at the screen's native resolution if you can pull at least a decent frame rate for your eyes (I'm usually fine at 20-30 fps, but my eyes are a cheap date).
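One reason some non-native resolutions look better than others is the scale factor involved: the panel has a fixed pixel grid, so every rendered pixel has to be stretched to cover some number of physical pixels, and non-integer ratios force interpolation, which looks softer. A small illustration (plain Python; the 2560x1440 native resolution is just the example monitor from this thread, and integer ratios only stay sharp if the GPU/monitor scaler actually does integer scaling):

```python
from fractions import Fraction

def scale_factor(native: int, rendered: int) -> Fraction:
    """How many physical pixels each rendered pixel must cover along one axis."""
    return Fraction(native, rendered)

native_w, native_h = 2560, 1440   # example: a 1440p panel

for w, h in [(2560, 1440), (1920, 1080), (1280, 720)]:
    sx, sy = scale_factor(native_w, w), scale_factor(native_h, h)
    if sx == sy == 1:
        note = "native (sharpest)"
    elif sx == sy and sx.denominator == 1:
        note = "integer scale (can be scaled without interpolation)"
    else:
        note = "non-integer scale (interpolated, looks softer)"
    print(f"{w}x{h} on {native_w}x{native_h}: {float(sx):.2f}x -> {note}")
```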
If you don't have a specific reason to do otherwise (i.e. for compatibility with an old game; it happened to me with Oni), you should ALWAYS set the highest refresh rate in the graphics driver control panel.

As for variable refresh rate, if you purchase a FreeSync monitor with an NVIDIA GPU you have made a very poor choice imo :-P

With a G-Sync monitor, the suggested way to set things up is enabling V-Sync and G-Sync in the NVIDIA control panel and disabling V-Sync in games. And my experience says it works pretty well.
Point_Man: I believe you can even use it to set a resolution higher than your current display, like 4K for your 1440p monitor. So you could run a game at 4K on your 1440p monitor and it should downscale it. That could have some benefits, like fewer jaggies and other things I might not be aware of. I had 1440p running on my old 1080p monitor for a while because I wanted to see if some old games were gonna give me issues at 1440p or not before I got my 1440p monitor.
No TFT screen can show a higher resolution than its native one. Software can't magically add physical LEDs (= native pixels) to your screen.
Point_Man: I believe you can even use it to set a resolution higher than your current display, like 4K for your 1440p monitor. So you could run a game at 4K on your 1440p monitor and it should downscale it. That could have some benefits, like fewer jaggies and other things I might not be aware of. I had 1440p running on my old 1080p monitor for a while because I wanted to see if some old games were gonna give me issues at 1440p or not before I got my 1440p monitor.
teceem: No TFT screen can show a higher resolution than its native one. Software can't magically add physical LEDs (= native pixels) to your screen.
I never said it would show a higher resolution. I said you could get some benefits from running it at a higher resolution and then downsampling. That can give some benefits, like getting rid of jaggies so you don't have to use AA. If you don't know what downsampling is, look it up.

Edit: Here's an article from Nvidia explaining the benefits of running at a resolution higher than what your monitor can show.
Post edited March 10, 2018 by user deleted
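For anyone wondering what downsampling actually does, here's a rough sketch of the idea (plain NumPy, toy numbers, nothing NVIDIA-specific): the game is rendered at a higher resolution than the panel, then each block of rendered pixels is averaged into one physical pixel, so hard jagged edges come out as smooth gradients instead of staircases.

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel (simple box filter)."""
    h, w = img.shape[:2]
    h, w = h - h % 2, w - w % 2                       # drop any odd edge row/column
    blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2, -1)
    return blocks.mean(axis=(1, 3))

# Toy "render": a hard diagonal edge in an 8x8 frame
# (stand-in for rendering at twice the panel resolution).
hi_res = np.zeros((8, 8, 1))
for y in range(8):
    hi_res[y, : y + 1] = 1.0                          # aliased staircase edge

lo_res = downsample_2x(hi_res)                        # 4x4 output for the "panel"
print(lo_res[..., 0])                                 # edge pixels become in-between
                                                      # grey values, i.e. the jaggies
                                                      # are smoothed out
```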
teceem: No TFT screen can show a higher resolution than its native one. Software can't magically add physical LEDs (= native pixels) to your screen.
Point_Man: I never said it would show a higher resolution. I said you could get some benefits from running it at a higher resolution and then downsampling. That can give some benefits, like getting rid of jaggies so you don't have to use AA. If you don't know what downsampling is, look it up.

Edit: Here's an article from Nvidia explaining the benefits of running at a resolution higher than what your monitor can show.
You should've been more specific in the first place. DSR doesn't even work on older (pre-Maxwell) Nvidia cards. I don't know if ATI has something similar...
Also, there's probably little benefit if a game doesn't have higher resolution textures than the resolution DSR is downscaling to.
Point_Man: I never said it would show a higher resolution. I said you could get some benefits from running it at a higher resolution and then downsampling. That can give some benefits, like getting rid of jaggies so you don't have to use AA. If you don't know what downsampling is, look it up.

Edit: Here's an article from Nvidia explaining the benefits of running at a resolution higher than what your monitor can show.
teceem: You should've been more specific in the first place. DSR doesn't even work on older (pre-Maxwell) Nvidia cards. I don't know if ATI has something similar...
Also, there's probably little benefit if a game doesn't have higher resolution textures than the resolution DSR is downscaling to.
It was an option I pointed out to him and not really the point of my post. Don't know why you're making a big deal out of it. I also used the word downscale/downsample, and mentioned one of the possible benefits (getting rid of jaggies). I think that's clear enough. I have a PS4 Pro and it also downsamples from its 4K mode to 1080p if the developer set it up to work properly, I believe; pretty sure the Xbox One X does it too. They both use AMD GPUs, so downsampling isn't some obscure thing.
Post edited March 10, 2018 by user deleted
Point_Man: It was an option I pointed out to him and not really the point of my post. Don't know why you're making a big deal out of it. I also used the word downscale/downsample, and mentioned one of the possible benefits (getting rid of jaggies). I think that's clear enough.
Don't worry about it, it's no big deal. I've noticed that before - people taking me far too seriously, just because I'm direct and to the point. These forums could really use some smilies/emoticons/whatever-they're-called-today!

Anyway, I'm glad you mentioned DSR eventually. I've been meaning to experiment with it since I got my GTX 970, but forgot about it. Now to find games that could benefit from it... I'm afraid I have few. And some of those (like The Witcher 3) will take too much of a performance hit. :-(

Windows (10) itself doesn't look too great with DSR; blocky or fuzzy, depending on the smoothing setting.
teceem: Don't worry about it, it's no big deal. I've noticed that before - people taking me far too seriously, just because I'm direct and to the point. These forums could really use some smilies/emoticons/whatever-they're-called-today!

Anyway, I'm glad you mentioned DSR eventually. I've been meaning to experiment with it since I got my GTX 970, but forgot about it. Now to find games that could benefit from it... I'm afraid I have few. And some of those (like The Witcher 3) will take too much of a performance hit. :-(

Windows (10) itself doesn't look too great with DSR; blocky or fuzzy, depending on the smoothing setting.
I never heard of this before today... lol. I'm interested in giving it a try on my 980 too, although not with newer games, as I don't want the performance hit. I want to see what it might do for some older games (say the mid-2005 to 2015 time frame), as I'm thinking they may be the perfect specimens... they support higher resolutions, and a performance hit won't matter much since the 980 handles those without breaking a sweat anyway.

Only problem is I don't know how to do it, because I didn't install (and don't want to install) GeForce Experience. I used to have that installed and all it did was freeze, crash, and cause all sorts of problems on this laptop. Maybe you can't use it without GeForce Experience??? I dunno, I just know that in that linked article it said the way to turn it on was in GeForce Experience.