I am all out of synch, guess I will have to buy more.

Jokes aside, I almost never use it, but I wish I had a FreeSync-capable monitor. Then I would turn that adaptive sync on.
jonridan: I didn't like my laptop overheating so much playing Wasteland 2 DC, so I disabled vsync (60Hz and 60fps...) and capped the FPS to 30 (now I have almost half the temperatures I had before, and the game still runs beautifully). Planning on doing the same thing when I get around to playing Pillars of Eternity (and buying it...). Luckily none of these games really benefit gameplay-wise from "moar fps". And though it is nice to see the fluidity of 60, cutting from 95 to 55/60 degrees Celsius on a laptop is, I think, worth it...

Anyway, it depends on the game: for simpler games that don't overheat the GPU as much, I just leave it on, so the game doesn't go beyond 60fps when I can't even notice the difference on my display anyway.
Actually, turning V-sync on lowers temperatures in most cases, because it prevents the fps from going above your display's refresh rate. From a temperature point of view, if you limit your fps to 30 via the game options, it doesn't matter whether V-sync is on or off. Of course, there's still the input lag issue.
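(For anyone curious what an in-game frame cap like the one jonridan mentions actually does under the hood, here's a rough sketch in C. It's only an illustration, assuming POSIX timing functions; poll_input() and render_frame() are placeholders for whatever the game does each frame.)

```c
/* Rough sketch of a game-side frame limiter, independent of V-sync.
   Assumes POSIX (clock_gettime/nanosleep); poll_input() and render_frame()
   are placeholders for the game's own per-frame work. */
#include <time.h>

#define TARGET_FPS 30
#define FRAME_NS (1000000000L / TARGET_FPS)  /* ~33.3 ms budget per frame */

extern void poll_input(void);
extern void render_frame(void);

static long elapsed_ns(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1000000000L + (b.tv_nsec - a.tv_nsec);
}

void game_loop(void)
{
    struct timespec start, end;
    for (;;) {
        clock_gettime(CLOCK_MONOTONIC, &start);
        poll_input();
        render_frame();
        clock_gettime(CLOCK_MONOTONIC, &end);

        long left = FRAME_NS - elapsed_ns(start, end);
        if (left > 0) {
            /* Finished early: sleep out the rest of the frame instead of
               rendering extra frames the display can't show anyway. */
            struct timespec pause = { 0, left };
            nanosleep(&pause, NULL);
        }
    }
}
```

Either way the GPU spends most of each 33 ms asleep instead of redrawing, which is where the temperature drop comes from, with or without V-sync.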
The display server I rely on doesn't care about such petty things as V-Sync.

Behold.
Always. It runs without issue on all of my rigs.
jonridan: I didn't like my laptop overheating so much playing Wasteland 2 DC, so I disabled vsync (60Hz and 60fps...) and capped the FPS to 30 (now I have almost half the temperatures I had before, and the game still runs beautifully). Planning on doing the same thing when I get around to playing Pillars of Eternity (and buying it...). Luckily none of these games really benefit gameplay-wise from "moar fps". And though it is nice to see the fluidity of 60, cutting from 95 to 55/60 degrees Celsius on a laptop is, I think, worth it...

Anyway, it depends on the game: for simpler games that don't overheat the GPU as much, I just leave it on, so the game doesn't go beyond 60fps when I can't even notice the difference on my display anyway.
Sarafan: Actually, turning V-sync on lowers temperatures in most cases, because it prevents the fps from going above your display's refresh rate. From a temperature point of view, if you limit your fps to 30 via the game options, it doesn't matter whether V-sync is on or off. Of course, there's still the input lag issue.
Mine is a pretty specific scenario. That particular game lets you cap the framerate to whatever value you want, but V-sync has to be disabled for that; otherwise it goes up to the max your display can show. I know the temperatures are lower because it's half the framerate, but I couldn't have done that without disabling it. Hence the relationship between hertz, fps, and temperature. Maybe there are programs that let you cap the framerate externally, but I don't know of any.
kohlrak: Some games do need the higher FPS. I'm curious, do you have a universal way of doing this? I have some games I'm sure would run much better for me if I was able to set a static framerate for them instead of having it bounce all over the place.
jonridan: Wasteland 2 DC gives you the option once you disable v-sync (it's right under it in the graphics settings). Pillars of Eternity I believe allows it too... but I don't have a universal way or anything. I'm really at the mercy of the developers, haha.
Ah, I was wondering if you knew of a way to enable frameskipping for programs that don't support it.

As for the people claiming v-sync ELIMINATES tearing: not really. It just stops your GPU from going above the fps your display can show... If your PC can't hold 50 fps or more, chances are a display running at 50Hz or more will still show tearing. It happened to me playing Tomb Raider: Underworld back when it released. I could play it above 30 fps, maybe 40/45... but I was not reaching 60fps, and my display was 60Hz. Screen tearing galore was the result, especially in the cutscenes.
That depends on your drivers and your card; it does generally eliminate the tearing. I wrote my own graphics drivers for my toy OS, and, of course, the thing that had the biggest impact was vsync. Once I got vsync working, everything stopped tearing. Without it, the screen tore constantly.

I do have to ask when this was, though. There were a few notable periods in recent history where vsync was known to be bugged in ATI and nVidia drivers. It was particularly bad for Minecraft and a few other games, IIRC, and as a result they suggested manually locking those games to 60fps.

EDIT: I wonder if the issue was caused by the same problem I was having with my drivers. I had to poll the vsync pin (wasting a lot of cycles) because I wasn't able to make the card trigger an IRQ for vsync. There are standards for that, but it looked to me like they got neglected, as I was never able to actually trigger the IRQ, which was a real shame.
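(For the curious, the kind of busy-wait kohlrak describes usually looks something like this on legacy VGA hardware; this is only a sketch under that assumption, with inb() being the usual port-read helper a hobby OS already has. Bit 3 of Input Status Register 1 at port 0x3DA is the vertical-retrace flag.)

```c
/* Sketch of polling the vertical-retrace flag instead of getting a vsync IRQ
   (legacy VGA: Input Status Register 1 at I/O port 0x3DA, bit 3 = in retrace).
   This burns CPU cycles, which is exactly the cost kohlrak mentions. */
#include <stdint.h>

static inline uint8_t inb(uint16_t port)
{
    uint8_t val;
    __asm__ volatile ("inb %1, %0" : "=a"(val) : "Nd"(port));
    return val;
}

void wait_for_vsync(void)
{
    /* If we're already inside a retrace, wait for it to finish,
       so we always catch the start of the next one. */
    while (inb(0x3DA) & 0x08) { }

    /* Now spin until the next retrace begins; swap buffers right after this. */
    while (!(inb(0x3DA) & 0x08)) { }
}
```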
Post edited May 20, 2021 by kohlrak
jonridan: As for the people claiming v-sync ELIMINATES tearing: not really. It just stops your GPU from going above the fps your display can show... If your PC can't hold 50 fps or more, chances are a display running at 50Hz or more will still show tearing. It happened to me playing Tomb Raider: Underworld back when it released. I could play it above 30 fps, maybe 40/45... but I was not reaching 60fps, and my display was 60Hz. Screen tearing galore was the result, especially in the cutscenes.
Then vsync was bugged and not functioning, or you were not using it and do not understand how it works. You cannot have 40/45 fps with a 60Hz monitor and vsync on. If the computer can't keep up with 60fps in that case, it drops to the next whole divisor of the refresh rate, which is 30fps. If it can't even do that, then 20fps, then 15fps. It's physically impossible to have tearing, because vsync waits for the display to finish updating before sending another frame.
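(A back-of-the-envelope way to see that divisor point: with plain double-buffered vsync, a finished frame is only shown on a vblank, so the effective rate is the refresh rate divided by a whole number of refresh intervals. This toy model deliberately ignores triple buffering and driver frame queues, which behave differently.)

```c
/* Toy model of double-buffered vsync: a frame that misses a vblank has to
   wait for the next one, so the effective rate is refresh / (whole number). */
#include <math.h>
#include <stdio.h>

double vsynced_fps(double refresh_hz, double render_ms)
{
    double interval_ms = 1000.0 / refresh_hz;            /* ~16.7 ms at 60 Hz */
    double intervals   = ceil(render_ms / interval_ms);  /* vblanks per frame */
    return refresh_hz / intervals;
}

int main(void)
{
    printf("%.1f fps\n", vsynced_fps(60.0, 15.0)); /* fits in one interval -> 60.0 */
    printf("%.1f fps\n", vsynced_fps(60.0, 20.0)); /* misses one vblank    -> 30.0 */
    printf("%.1f fps\n", vsynced_fps(60.0, 40.0)); /* misses two           -> 20.0 */
    return 0;
}
```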
jonridan: As for the people claiming v-sync ELIMINATES tearing: not really. It just stops your GPU from going above the fps your display can show... If your PC can't hold 50 fps or more, chances are a display running at 50Hz or more will still show tearing. It happened to me playing Tomb Raider: Underworld back when it released. I could play it above 30 fps, maybe 40/45... but I was not reaching 60fps, and my display was 60Hz. Screen tearing galore was the result, especially in the cutscenes.
eric5h5: Then vsync was bugged and not functioning, or you were not using it and do not understand how it works. You cannot have 40/45 fps with a 60Hz monitor and vsync on. If the computer can't keep up with 60fps in that case, it drops to the next whole divisor of the refresh rate, which is 30fps. If it can't even do that, then 20fps, then 15fps. It's physically impossible to have tearing, because vsync waits for the display to finish updating before sending another frame.
I've actually seen people use timers for vsync instead of proper vsync, so I'm going to go out on a limb and say it wasn't him in particular.
With my last monitor and my last OS (Win7) I had to use v-sync quite often to avoid screen tearing, since I did not want to use Aero, and nearly every Unity-based game was causing screen tearing (no matter the frame rate!) without Aero. Now I've changed to Win10 and also to a 144Hz G-Sync-compatible monitor, so I usually turn v-sync off, and if there is no screen tearing visible (which is mostly the case) I leave it that way. Screen tearing for me will always be a bigger issue than stuttering/lagging - I just can't stand it.
Post edited May 20, 2021 by MarkoH01
Sarafan: What about you? Do you use V-sync? What is your experience with using this option?
I usually try to enable it because I dislike the idea of the GPU (and/or CPU) constantly running hot at 100% without any good reason, especially when playing on a laptop. The best example of this was when I tried playing Quake with some of those newer game engine replacements, and I heard my laptop fans starting to scream at full speed.

I found it odd, as it is an old game that the laptop should handle easily. It turned out that, because vsync was off, the game was rendering something like 2000 frames per second (checked with the game's internal FPS counter), even though of course the laptop screen could only display 60 frames per second.

So I enabled vsync from the game options, and boom, the laptop became instantly quiet and started running much cooler, as the game was running at a constant 60 fps instead of something insane like 2000 fps, which the screen couldn't display anyway. That was when I decided to try to enable vsync whenever I can, unless there is some other option to restrict the game speed (like an in-game FPS limiter that does not rely on vsync). At least Quake (with the new engine) seemed to run just fine with vsync on, at a constant 60 fps.
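(In most modern OpenGL-based ports, "enabling vsync in the game options" boils down to setting the swap interval. Here's a minimal sketch using GLFW purely as an illustration; whatever engine that Quake port actually uses may wire this up differently.)

```c
/* Minimal sketch of vsync via the swap interval, using GLFW as an example. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *win = glfwCreateWindow(640, 480, "demo", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);   /* 1 = wait for one vblank per swap (vsync on);
                              0 = swap immediately -> the uncapped "2000 fps" case */

    while (!glfwWindowShouldClose(win)) {
        /* ... render the frame here ... */
        glfwSwapBuffers(win);   /* with interval 1, this blocks until the vblank */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```

With the interval set to 1, the swap call itself throttles the loop, which is why the fans calm down: the game simply stops rendering frames the screen can't show.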

I keep hearing about the "vsync lag", and I can't really say I've noticed it, but then maybe I haven't tested it closely enough. I don't quite understand it either, like:

- How exactly does vsync cause input lag? To me those two things sound unrelated, so I am trying to understand how they are connected.

- Does this input lag occur in all games where vsync can be enabled, and if not, why not?

- How big is the input lag anyway, or does it depend on a game (severe on some games, unnoticeable on others...)?

Like when playing that Quake game in vsynced 60 fps instead of "calculated" 2000 fps with vsync off, I don't recall thinking "Wow, the input is so laggy now, near unplayable" when playing with vsync on. As far as I could tell, there was no difference, the game ran smooth and controls were fine regardless of vsync off or on. The only real difference was that the laptop ran much cooler and quieter with vsync on.
Post edited May 20, 2021 by timppu
And yeah, I hope my next gaming laptop has either G-Sync or FreeSync or whatever is relevant, so that I don't have to care about vsync anymore and can get the benefits of both vsync on and off (like the GPU/CPU not needlessly running at 100% all the time, getting a constant 60fps with no input lag, etc.).

However, the G-Sync/FreeSync options seem to be limited, maybe because those two are competing "standards". Then again, I might usually use an external monitor (or TV) for gaming even with a laptop, so maybe it is enough that the external display has the relevant support.

I don't recall if my 65" LG OLED TV has G-Sync or FreeSync support; I seem to recall it should. At least the PC monitor I bought over a year ago on sale does not, it is just a common 60Hz computer monitor.
Post edited May 20, 2021 by timppu
timppu: - How exactly does vsync cause input lag? To me those two things sound unrelated, so I am trying to understand how they are connected.
This is what I've found: "V-Sync adds input lag because it delays frames from being shown on screen, making the time between when you do something and when it appears on screen longer."
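(Here's a back-of-the-envelope illustration of that, with assumed numbers rather than measurements: without vsync a finished frame can be scanned out immediately, while with vsync it waits for the next vblank, and any pre-rendered frames queued by the driver each add another refresh interval.)

```c
/* Toy model of where V-sync input lag comes from. All numbers are assumed
   for illustration, not measured on any particular game or driver. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double refresh_ms = 1000.0 / 60.0; /* one vblank interval at 60 Hz (~16.7 ms) */
    double render_ms  = 10.0;          /* assumed time to render a frame          */
    int    queued     = 2;             /* assumed pre-rendered frames in the queue */

    /* Without vsync the frame is scanned out (with tearing) as soon as it's done. */
    double no_vsync = render_ms;

    /* With vsync the frame waits for the next vblank, and every queued frame
       ahead of it costs one more full refresh interval. */
    double with_vsync = ceil(render_ms / refresh_ms) * refresh_ms
                      + queued * refresh_ms;

    printf("no vsync : ~%.0f ms from input to screen\n", no_vsync);
    printf("vsync on : ~%.0f ms from input to screen\n", with_vsync);
    return 0;
}
```

How noticeable those extra few frames are depends on the render time and how many frames the engine and driver keep queued, which is probably why people report such different experiences.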

timppu: - Does this input lag occur in all games where vsync can be enabled, and if not, why not?
After my tests I suspect that almost every game introduces at least a very small input lag. It can be barely noticeable or not noticeable at all, but it's there.

timppu: - How big is the input lag anyway, or does it depend on a game (severe on some games, unnoticeable on others...)
It depends on the game and hardware, I presume. For example, in The Witcher 3 I had no visible input lag. In Cyberpunk 2077 the input lag only affected the mouse cursor; looking around with the player character didn't cause any visible lag. Another example is Doom Eternal: in that case input lag affected everything, from the mouse cursor in the menu to looking around with the character. I know there are people, however, who didn't notice any input lag in Doom Eternal even with V-sync on.

timppu: Like when playing that Quake game in vsynced 60 fps instead of "calculated" 2000 fps with vsync off, I don't recall thinking "Wow, the input is so laggy now, near unplayable" when playing with vsync on. As far as I could tell, there was no difference, the game ran smooth and controls were fine regardless of vsync off or on. The only real difference was that the laptop ran much cooler and quieter with vsync on.
I played Quake 1 on the QuakeSpasm source port not so long ago. I didn't notice any visible input lag either.
Yes and no. I have a G-SYNC monitor (a native one, not one of those "compatible" fakes out there :-P), so I use the special sync mode managed through NVIDIA drivers.

In brief: V-SYNC in the NVIDIA Control Panel is ON in G-SYNC mode, and V-SYNC in games is always OFF. This way my GeForce RTX 3080 is always synchronized with my G-SYNC (AOC Full HD) monitor.

And yeah, it's a real game-changer for gaming (in addition to using a 144Hz monitor, that is).
Post edited May 20, 2021 by KingofGnG
Interesting discussion. I do all my gaming in Linux, with a GTX 1050 and a cheap old 1080p monitor. At first I remember having a lot of issues with screen tearing, but I enabled a setting in the NVIDIA options called 'Force Full Composition Pipeline' and that immediately cured all my tearing problems. Since then, I haven't noticed any problems with tearing at all, regardless of V-sync. In fact, I don't even know whether it is on or off for most games - I tend to just leave it at the default.

My impression is that the setting has something to do with compositing in the X Window System, although I'm not exactly sure how that fits in with V-sync. Perhaps compositing/rendering works differently in Linux than in Windows?
Post edited May 21, 2021 by Time4Tea
I keep it on.

I generally play older games that don't really support high framerates, weren't designed for low-latency input, and desperately need frame rate limiting so my GPU doesn't overload itself trying to produce 32000 frames per second.

My answer may change eventually if I get a G-Sync/Freesync monitor, but I'm still happy with my 10 year old U2711.