Or, to put it another way, what's the highest framerate that you consider playable?

Or, I could ask:
* At what framerate/frametime is it reasonable for a game engine to start sacrificing aesthetics to try to keep the framerate/frametime reasonable? (For example, if enemies explode into harmless particles, at what point should those particles stop appearing, or when would it make sense for the game to automatically lower graphics settings temporarily?)
* At what framerate/frametime is it reasonable for the game to start sacrificing the integrity of the game mechanics? For example, at what point is it reasonable for a game to do things like skipping collision checks, or to not allow new enemies to spawn or bullets to be shot, in order to try to bring things back to a reasonable level?
* At what point should the game just give up?
dtgreene: Or, to put it another way, what's the highest framerate that you consider playable?

Or, I could ask:
* At what framerate/frametime is it reasonable for a game engine to start sacrificing aesthetics to try to keep the framerate/frametime reasonable? (For example, if enemies explode into harmless particles, at what point should those particles stop appearing, or when would it make sense for the game to automatically lower graphics settings temporarily?)
* At what framerate/frametime is it reasonable for the game to start sacrificing the integrity of the game mechanics? For example, at what point is it reasonable for a game to do things like skipping collision checks, or to not allow new enemies to spawn or bullets to be shot, in order to try to bring things back to a reasonable level?
* At what point should the game just give up?
Depends on the game, but a general rule of thumb for me is:

60 without G-Sync
50 with G-Sync enabled (some dips into the very high 40s may be okay depending on the game)

On the upper end, there's obviously no framerate too high to be playable, although my monitor caps out at 144 Hz on its current settings.

Games that dynamically change graphics settings to target a framerate are okay, although I'm personally not a fan (I'd rather experiment, set the graphics settings myself and then let g-sync deal with any variation).

Turning off game mechanics to deal with poor optimisation or hardware is (in my book) lazy programming. Optimise better, design better, or set more realistic minimum specs.
It depends highly on the game. As an example, I enjoyed playing Diablo for years despite its rendering engine being capped at 20 Hz (no longer the case when played with the modern DevilutionX engine). And of course for turn-based games with little to no animation (think Civilization 2 or Heroes of Might and Magic 2), framerate is barely a concern.

On the other hand I do have an upper limit: I do not care for refresh rates higher than 75 Hz, so if it is configurable for a given game I will cap the rendering at 75 Hz to lessen the hardware load.

---

I don’t like dynamic graphics settings based on current framerate, I would rather set the graphics settings low enough to ensure I get consistent graphics quality.

---

Game mechanics should never be tied to rendering.
This always leads to problems when the game is played a decade in the future.
While I strive for 50-60, I'll accept as low as 30 depending on the game.
Post edited February 05, 2023 by syscrusher
I played Crysis at sub-30fps with medium graphics at 1080p. Saints Row 2 plays like mud but I still find it acceptable as long as it's not stuttering and chugging, though I don't even think the beefiest machines can fix it anyway. I like my games to look decent, but I'm willing to throw the graphics lower as long as it's not outright ugly. As long as the performance allows cohesive gameplay, I can tolerate a low framerate, maybe 15fps at minimum since that's what a lot of N64 and PS1 games ran at. I don't think my machine can even do 60fps to be honest unless it's well optimized or otherwise butt ugly. Then again, maybe I should try playing games at butt ugly mode to see how high the framerate can get.

EDIT: After playing about half an hour of Crysis Warhead just to see what I find "acceptable", I fiddled with the settings as shown. I had to turn off AA, though. Most settings are on "mainstream"; some things I don't care for, like motion blur, are off; others are at minimum, and some at "gamer" just so it doesn't look muddy.
Attachments:
Post edited February 05, 2023 by Warloch_Ahead
It depends on the game. Cities: Skylines is playable for most people down to the mid-to-high 20s. For games that require fast reflexes (like shooters), you'll want as high an FPS as you can get.

If you're playing older games, you don't have many options, as increasing the framerate is either not possible or would break the game in some way.
60 locked when using a mouse. Even dips to 55 are noticeable to me with a mouse, despite adaptive sync.

If using a controller I'm more lenient to dips into the 50s. Honestly 30fps with a controller for a non-shooter is something I can eventually adapt to if I have to. I did that with the last Zelda on the Switch when my friend loaned me his system. Could never do that with a mouse though, it feels awful.
pds41: Games that dynamically change graphics settings to target a framerate are okay, although I'm personally not a fan (I'd rather experiment, set the graphics settings myself and then let g-sync deal with any variation).

Turning off game mechanics to deal with poor optimisation or hardware is (in my book) lazy programming. Optimise better, design better, or set more realistic minimum specs.
I'm thinking of not so much targeting a framerate, but rather the engine taking desperate measures in an emergency situation, like if the frame time is measured in seconds per frame.

One potentially problematic situation is if, due to certain things happening in game, the game lags so badly that each frame takes minutes. At this point the player is frantically pressing the escape key, hoping the game will respond, but eventually gets frustrated with waiting and force-closes the game, which is not a good experience for the player.

Perhaps, simply due to circumstances, the game stops creating objects that would slow it down, like the explosions that happen when an enemy gets shot, splitting into many pieces that would then have to be rendered. Or perhaps the game just doesn't render objects that are purely cosmetic. (Failing to render objects that matter for gameplay can cause issues, such as the player getting hit by an invisible projectile that should be visible.)
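That kind of emergency measure could be sketched as: track the last frame time and skip purely cosmetic spawns once it blows a budget, while gameplay-relevant effects always happen. Everything here (the class, the method names, the 30 fps threshold) is invented for illustration:

```python
FRAME_BUDGET = 1.0 / 30  # hypothetical threshold: degrade below ~30 fps

class EffectSpawner:
    """Toy spawner that sheds cosmetic work under load."""

    def __init__(self):
        self.last_frame_time = 0.0

    def note_frame(self, seconds):
        # Called once per frame with how long the last frame took.
        self.last_frame_time = seconds

    def spawn_explosion(self, world, pos):
        # The gameplay-relevant part always happens.
        world.deal_damage(pos)
        # Cosmetic debris is the first thing to go when lagging.
        if self.last_frame_time <= FRAME_BUDGET:
            world.add_particles(pos)
```

The key design point is the split: only work that cannot affect outcomes is ever skipped, so mechanics stay intact.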

One case where a game compromises game mechanics is in Donkey Kong 64. When the game lags, the game has the player move faster (per frame) to compensate, and that can break collision. To see this in action, watch speedruns of the game (but not the virtual console release, as that version doesn't lag the way the original release does).
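The Donkey Kong 64 example is the classic tunneling problem: when movement is scaled by the frame time and collision is only checked at the endpoint, one long frame can step right over thin geometry. A toy sketch with invented numbers:

```python
# Illustrative numbers only: a wall 0.5 units thick starting at x = 10,
# and an object moving at 20 units per second along x.
WALL_MIN, WALL_MAX = 10.0, 10.5
SPEED = 20.0

def step(x, dt):
    """Naive per-frame movement: collision checked only at the endpoint."""
    nx = x + SPEED * dt
    if WALL_MIN <= nx <= WALL_MAX:  # endpoint lands inside the wall?
        nx = WALL_MIN               # stop at the wall
    return nx

# At 60 fps the object moves ~0.33 units, lands inside the wall's
# span, and is stopped...
x = step(9.9, 1 / 60)
# ...but one half-second lag frame moves 10 units in a single step,
# so the endpoint is already past the wall (tunneling).
y = step(9.9, 0.5)
```

Engines avoid this with swept collision tests or by clamping/substepping large frame times instead of trusting one big delta.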
vv221: Game mechanics should never be tied to rendering.
This always leads to problems when the game is played a decade in the future.
If the game is played well after its original release, the framerate isn't going to be low enough to cause issues. If there is an issue, it will be the framerate being too high, not too low.

By the way, modern game engines often set a physics framerate, which is separate from the graphics framerate and which the engine tries to keep consistent. The Godot engine does this, for example. As another example, Hollow Knight apparently runs at a physics framerate of 50 FPS, regardless of the graphics framerate the game is running at.
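That separate physics rate is usually built on a fixed-timestep accumulator loop: leftover frame time is carried forward, and physics always advances in exact, equal steps no matter how irregular rendering is. A simplified sketch of the pattern (real engines also interpolate rendering between physics states):

```python
PHYSICS_DT = 1.0 / 50  # e.g. Hollow Knight's reported 50 Hz physics rate

def advance(accumulator, elapsed, physics_step):
    """Consume elapsed wall-clock time in fixed physics steps.

    Returns the leftover time to carry into the next frame, so physics
    always ticks in exact PHYSICS_DT increments regardless of the
    rendering framerate.
    """
    accumulator += elapsed
    while accumulator >= PHYSICS_DT:
        physics_step(PHYSICS_DT)
        accumulator -= PHYSICS_DT
    return accumulator

# Per rendered frame, an engine would do roughly:
#   accumulator = advance(accumulator, time_since_last_frame, physics_step)
#   render()
```

This is why a game can run physics at 50 Hz while rendering at 30 or 144: the render loop just drives a varying number of fixed steps.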
StingingVelvet: 60 locked when using a mouse. Even dips to 55 are noticeable to me with a mouse, despite adaptive sync.

If using a controller I'm more lenient to dips into the 50s. Honestly 30fps with a controller for a non-shooter is something I can eventually adapt to if I have to. I did that with the last Zelda on the Switch when my friend loaned me his system. Could never do that with a mouse though, it feels awful.
Do you need the constant 60 FPS when playing a turn-based game that uses the mouse to select commands (and not for camera control or aiming)?

Do you need the constant 60 FPS when in the main menu, before you start a game or load your save?
Post edited February 05, 2023 by dtgreene
This strictly depends on the game type.
I have seen 90s adventures work well at 10 frames per second.
2D platformers are OK with 50+.
Games with a slow-moving camera are OK at 30, but 60 is better. I'm currently playing Beyond a Steel Sky locked at 60, which is perfectly OK.

For most 3D games, however, I prefer something above 80.
Shrug. I used to play N64 games without even blinking twice, but I feel like once it creeps below 48, I start to notice, and anything below 30 is gonna feel pretty damn chunky. I've been playing Dwarf Fortress a lot, and there's a slow slippage of FPS due to various factors that you'll feel as the game goes on.

I mean, then you've got largely static games such as chess and RPGs, where as long as they accept commands at a high polling rate, they could move at 12 frames for all I care.

...And many of them did, and nobody ever noticed in Final Fantasy 7-9!
I'm used to dealing with low framerates, since my computers always had the video card as the bottleneck (and this one doesn't have a dedicated one at all). That's of course not ideal in action games, but I played the 2004 Bard's Tale at a few fps and it may even have helped there, giving me time to process the situation and to time clicks into windows that would have lasted fractions of a second at any passable framerate. Either way, it's not something I pay particular attention to, but I'd say I'd generally take better graphics quality over going past 30 fps or so, and can usually deal with 20, and in many cases even 15 fps, without really complaining.
I really don't think a game should adjust graphics quality dynamically, and it definitely shouldn't change game mechanics for it.
That's for any regular scenario, at least. In the extreme ones you just mentioned, with a frame taking minutes to render, it should probably skip straight to the "give up" option: recognize it's unplayable on the system and stop, allowing the player to quit. Though if for some reason (screenshots, or some extreme sort of testing?) the player does want it to keep trying to play, that option should be offered.
60 Hz.

I have a GeForce RTX 3080; I had an FHD 144 Hz monitor until January 2023, and now a beautiful 1440p, 240 Hz, G-Sync Ultimate display from Alienware.


Everything that goes below 60 Hz (like Elden Ring while fighting dragons) infuriates me. I should get a new GeForce RTX 4070 Ti to avoid that risk asap :-D
I think, as has been mentioned, it definitely comes down to the type of game. I grew up during the time when 30 fps seemed like the standard, so it's never been an issue for me.
dtgreene: Or, to put it another way, what's the highest framerate that you consider playable?

Or, I could ask:
* At what framerate/frametime is it reasonable for a game engine to start sacrificing aesthetics to try to keep the framerate/frametime reasonable? (For example, if enemies explode into harmless particles, at what point should those particles stop appearing, or when would it make sense for the game to automatically lower graphics settings temporarily?)
* At what framerate/frametime is it reasonable for the game to start sacrificing the integrity of the game mechanics? For example, at what point is it reasonable for a game to do things like skipping collision checks, or to not allow new enemies to spawn or bullets to be shot, in order to try to bring things back to a reasonable level?
* At what point should the game just give up?
It depends on the game, the frame time or pacing, and other factors. 15-20 is the lowest I'll play a game at, and it has to be a slow-moving game like an RPG or a point-and-click. 30 FPS for pretty much anything else as long as it has good frame pacing, though 60 is preferred.
dtgreene: Or, to put it another way, what's the highest framerate that you consider playable?

Or, I could ask:
* At what framerate/frametime is it reasonable for a game engine to start sacrificing aesthetics to try to keep the framerate/frametime reasonable? (For example, if enemies explode into harmless particles, at what point should those particles stop appearing, or when would it make sense for the game to automatically lower graphics settings temporarily?)
* At what framerate/frametime is it reasonable for the game to start sacrificing the integrity of the game mechanics? For example, at what point is it reasonable for a game to do things like skipping collision checks, or to not allow new enemies to spawn or bullets to be shot, in order to try to bring things back to a reasonable level?
* At what point should the game just give up?
paladin181: It depends on the game, the frame time or pacing, and other factors. 15-20 is the lowest I'll play a game at, and it has to be a slow-moving game like an RPG or a point-and-click. 30 FPS for pretty much anything else as long as it has good frame pacing, though 60 is preferred.
Frame pacing?