Posted March 10, 2018
I have a few questions about monitors that I was hoping someone here at GOG might be able to help me with. A recent experience with Fallout 4 made me realize just how important a setting refresh rate is, and that has raised more questions I don't know the answers to. Here goes:
Assuming I have the proper cables (we'll make that assumption even if in practice I don't yet) and I have a monitor whose advertised refresh rate is 144Hz, when I go into the Nvidia control panel (or whatever control panel one uses), are the only options likely to be... just 60Hz and 144Hz, with nothing in between? In other words, if I thought my graphics card was capable of a steady 100FPS, wouldn't it be better to set the refresh rate to 100Hz? (100 frames per second and 100 refreshes per second would be like a "virtual" adaptive sync, wouldn't it?)
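To make that "virtual adaptive sync" idea concrete, here is a rough Python sketch of the reasoning (idealized numbers, not measurements; it ignores VSync and render-time jitter). At 100 FPS on a 60Hz screen, some refresh intervals receive two new frames and some only one, which is where tearing and judder come from; at a matched 100Hz, every refresh gets exactly one frame.

```python
# Idealized illustration only: count how many newly rendered frames fall
# inside each refresh interval, assuming perfectly even frame times and
# no sync. Mismatched rates give an uneven pattern; matched rates do not.

def frames_per_refresh(fps, hz, seconds=0.1):
    """Return the number of new frames landing in each refresh interval."""
    frame_times = [i / fps for i in range(int(round(fps * seconds)))]
    refresh_edges = [i / hz for i in range(int(round(hz * seconds)) + 1)]
    return [sum(1 for t in frame_times if start <= t < end)
            for start, end in zip(refresh_edges, refresh_edges[1:])]

print("100 FPS on  60 Hz:", frames_per_refresh(100, 60))   # uneven, e.g. [2, 2, 1, ...]
print("100 FPS on 100 Hz:", frames_per_refresh(100, 100))  # even: [1, 1, 1, ...]
```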
After all, that's what G-Sync and FreeSync do... match refresh rate to FPS. They do it on the fly, however, which results in better matches. But if you have an Nvidia card and a monitor with FreeSync, wouldn't it be advantageous to set your refresh rate to whatever you think your card can sustain 90+% of the time? And if that refresh rate is LESS than what the monitor is advertised at... will it likely show up as an option, or are the only options usually 60Hz and whatever rate the monitor is advertised at, with nothing in between? My current monitor (a Crossover) has an advertised rate of 75Hz, and nothing in between shows up... I either have to have it on 60 or 75. No 65. Or 70. But maybe that's just because there isn't really any "normal" rate between 60 and 75... I just don't know.
Also, how about a 21:9 monitor (3440x1440)? If you wanted to set it at a lower resolution to increase frame rates... is it going to be another case of nothing in between 1920x1080 and 3440x1440? Again, my current monitor is advertised at 2560x1440, and nothing shows up as an option between 2560x1440 and 1920x1080. But again, that may just be because there is no "normal" resolution between those two... I just don't know.
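For what it's worth, you don't have to guess which combinations a given monitor and driver will offer: on Windows you can ask the driver to enumerate every mode it exposes. Below is a minimal sketch (Windows-only, assuming Python with ctypes) built on the Win32 EnumDisplaySettingsW call; the structure layout mirrors DEVMODEW, with the printer-only union members folded into a raw filler.

```python
# Windows-only sketch: list every resolution/refresh combination the display
# driver exposes for the primary monitor, via the Win32 API.
import ctypes
from ctypes import wintypes

CCHDEVICENAME = 32
CCHFORMNAME = 32

class DEVMODEW(ctypes.Structure):
    # Display-relevant subset of the DEVMODEW layout; printer-only union
    # members are replaced with a 16-byte filler so field offsets still match.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * CCHDEVICENAME),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * CCHFORMNAME),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def list_display_modes():
    """Enumerate (width, height, refresh) tuples for the primary display."""
    user32 = ctypes.windll.user32
    modes = set()
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    # iModeNum = 0, 1, 2, ... walks the driver's mode list until it returns 0.
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    return sorted(modes)

if __name__ == "__main__":
    for width, height, hz in list_display_modes():
        print(f"{width}x{height} @ {hz} Hz")
```

Running it prints every width/height/refresh combination the driver currently advertises for the primary display, which should be essentially the same list the Nvidia control panel draws from.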
Basically what I'm asking is... if I want to aim for a monitor that works at 2560x1080 or 2560x1440 at 75Hz or 100Hz, do I have to limit my choices to exactly that, or is a monitor advertised higher on either or both of those okay too, since I will have the option to choose the lower settings in the Nvidia control panel?
Sorry for the length... as usual, I don't know how to ask briefly. Sorry.