First time I heard this kind of crap was when people tried to tell me The Hobbit movies running at 48fps was nothing but a cheap tactic to get people viewing. Those movies looked magnificent at high frames per second and I have disliked watching the more primitive movies since.
avatar
Ricky_Bobby: Not necessary. I wrote previously that I can spot the difference when it comes to certain games.
It was more of a general remark to everyone who would like to try for themselves whether they can actually spot the difference. Mind you, I guess a warning is in order: if low framerates look fine to you, seeking out higher ones isn't ideal if your hardware won't be able to deliver them. It's not wise to raise one's standards unnecessarily; that's why I'm avoiding 120FPS/120Hz monitors. I think I would spot the difference, but then it would ruin 60FPS for me and start to bother me.

avatar
JMich: Question, since you do seem to have the most discerning eyes: Do you also notice the difference between 30 and 60 fps on 4X TBS and/or adventure games, or only on games that require you to focus on them a bit more? Then again, not sure how many TBS and/or adventure games support 60 fps.
It depends on the game engine and how fluid the animations are. HOMM3, for example, runs at 10FPS I believe, but all the animations are so rigid it doesn't matter. I've also played that game since I was 11, and it's good enough that the framerate doesn't detract too much.

With something like Sins of a Solar Empire, Age of Wonders 3 or perhaps the new Master of Orion, I believe I would be more bothered by 30FPS, but not as much as in a shooter, platformer or RPG.

A point & click adventure would also probably (in general) require fewer frames, considering that nothing happens unless you trigger an action or perhaps hover your mouse over something that causes a small animation change.
avatar
Spectre: Which ironically has a more technologically advanced control system by default than the PC.
avatar
Darvond: Okay, so adding a touch pad to the controller that has barely moved in 20 years, that's innovative?

Charging through the nose so you can do what custom kitters have been doing for years is innovation?

Please, do tell how the Steam Controller, the first actual paradigm shift in ages, is less advanced than those.
Mouse and keyboard are the default controls for PC, with their inbuilt flaws.
The Steam Controller isn't the default control for PC, and it has only just come out, whereas the Wii was out in 2006 and the Wii U in 2012.
avatar
afarrah20: So I see a lot of this "human eye can't detect past 30fps" rubbish online and sometimes "it can't see past 60fps".
Where did that all start and why do people continue to say it when it isn't true? What scientific evidence are they basing their findings on? Or is it a case of "it was on the internet...it must be true Hallelujah!"?
With the natural motion blur in video and movies, the 24fps framerate we've been exposed to over years and years has most likely made us adapt to it mentally. But once you're exposed to a higher framerate for long enough, you adapt to that instead.

Honestly, watching higher framerates at the store just gives me a headache, not because it's hard on my eyes, but because it looks so surreal and unnatural. It could also be timing problems from forcing content with a lower native framerate up to 60 refreshes: some frames come too fast and others are held too long, making every second feel like a big mess of timing. I'd rather watch something at 24-30fps than with those weird delays.

To note, Vsauce said the eye's max is about 120fps. I'm not sure if the video I posted is where they explore that or if it's in the [url=https://www.youtube.com/watch?v=4I5Q3UXkGd0]resolution of the eye[/url] video.
avatar
darthspudius: First time I heard this kind of crap was when people tried to tell me The Hobbit movies running at 48fps was nothing but a cheap tactic to get people viewing. Those movies looked magnificent at high frames per second and I have disliked watching the more primitive movies since.
I too saw the first Hobbit film in the cinema in 3D and at 48fps and it looked absolutely dreadful. I hate 3D at the best of times, but the high frame rate meant that everything - the CGI in particular - looked horribly artificial. It actually had the same unpleasant sensation that the frame interpolation that seems to be set by default on many modern TVs gives you. I have the 2D Extended Edition on Bluray and I find it so much more pleasant to watch.

The standard 24fps that most films use helps with the suspension of disbelief that many films require nowadays, and the same applies to gaming. Sad as it is, in gaming, high frame rates and higher resolutions tend to do a brilliant job of exposing the flaws in 3D model assets. Quake looked brilliant back in 1996 at a resolution of 640x480. In fact, it still has a certain appeal even today on modern hardware running at the same resolution. But crank it up to 1920x1080 and you'll notice all the jagged edges of the models and environment all the more. And at high frame rates, 3D model animation comes across as more artificial and very stiff - the brain instinctively does a better job of interpolating the animation frames than a computer does. The less detail you feed the brain, the more realistic an image the imagination can form.

As for the 30fps vs. 60fps debate - people shouldn't confuse "don't notice" with "don't care". I of course prefer 60fps, but I honestly prefer a robust 30fps with a good level of visual detail over a silky smooth refresh, but I'm not much of a twitch gamer beyond Street Fighter 4.
Has anyone that claims to be able to notice a big difference done the experiment 'blind'? Placebo effect is very real, so if you know there should be a difference and expect one to look better than the other it's very likely you could perceive such a difference even when it isn't there.

A double blind experiment would be difficult, but it should be simple enough to set up a single blind. Have a friend set the fps, test and check which fps you think it is. Repeat a dozen or so times, and limit communication during / between tests so that you're not sharing clues unintentionally.
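For anyone who wants to try that at home, here's a minimal sketch of such a single-blind setup in Python (purely my own illustration of the procedure described above, assuming the pygame library is installed; none of the posters used this). Instead of a friend, the script itself secretly picks 30 or 60 fps each trial, animates a square at a constant on-screen speed so that only the smoothness differs, and tallies your guesses at the end.

[code]
# Single-blind frame rate guessing test (sketch, assumes pygame is installed).
# The script, not you, picks the frame rate each trial, so you can't "expect"
# one result over the other.
import random
import pygame

TRIALS = 12          # repeat a dozen or so times, as suggested above
RATES = (30, 60)     # the two frame rates being compared
SPEED = 240          # pixels per second, independent of frame rate

def run_trial(screen, clock, fps, seconds=5):
    """Animate a square for a few seconds, capped at the secret frame rate."""
    x = 0.0
    for _ in range(fps * seconds):
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                raise SystemExit
        screen.fill((0, 0, 0))
        pygame.draw.rect(screen, (255, 255, 255), (int(x) % 640, 220, 40, 40))
        pygame.display.flip()
        dt = clock.tick(fps) / 1000.0   # cap at 'fps' and get elapsed seconds
        x += SPEED * dt                 # same visible speed at 30 and 60 fps

def main():
    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    clock = pygame.time.Clock()
    correct = 0
    for i in range(TRIALS):
        fps = random.choice(RATES)      # the "blind" part: you don't know which
        run_trial(screen, clock, fps)
        guess = input(f"Trial {i + 1}: was that 30 or 60 fps? ").strip()
        correct += (guess == str(fps))
    pygame.quit()
    print(f"{correct}/{TRIALS} correct (about {TRIALS // 2} would be pure chance)")

if __name__ == "__main__":
    main()
[/code]

On a typical 60Hz monitor this only makes sense for rates up to 60; testing 72, 120 or 144 fps would need a high-refresh display.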
avatar
Darvond: The problem is that consoles still exist, in spite of us being in a world where consoles are totally pointless - closed-garden PCs at best.

Or a low quality toy in the case of most Nintendo products these days.
Yes, it is definitely a problem, both for video game developers and for consumers. I'm not saying there is no problem; rather, there very much is one, and my premise is that 30FPS isn't a good solution to it. It may turn out to be the solution chosen by a developer, or even the only one possible without a massive undertaking, but the lack of options doesn't make it a good one.

What I wonder is what happens in one, two or five years, as newer and newer games come out that further push what is possible in PC gaming, both with and without VR. As games use more and more CPU and GPU power - and likely FPGA power in years to come - they'll become ever more hungry for resources. If older consoles are any indication of a pattern in the console market, they tend to come out and then last a long time in the marketplace: Windows XP levels of longevity, but as a static hardware platform. It's reasonably safe to assume that current and future generations of consoles will do the same, for the same reason - consumers want to use a platform for a long time and still be able to get new games for it.

If we look at current high-end games and how they're adapted to consoles, they already only just fit on console hardware. Some games, such as The Witcher 3, ran at 1920x900 instead of 1920x1080 at launch (and still do, as far as I'm aware) due to hardware limitations of the Xbox One. Even if some combination of Microsoft platform optimizations and further optimizations to the game allowed it to bump up to 1080p, the game still only runs at 30FPS, from my recollection.
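To put rough numbers on that trade-off (just back-of-the-envelope arithmetic on raw pixel counts, nothing from CDPR or Microsoft, and ignoring shader cost, overdraw, post-processing and so on): the 1920x900 letterbox only shaves about 17% off the pixels rendered per second, whereas staying at full 1080p and running 30FPS instead of 60FPS cuts them in half.

[code]
# Rough pixel-throughput comparison for the resolution/frame-rate options above.
def pixels_per_second(width, height, fps):
    return width * height * fps

full_60 = pixels_per_second(1920, 1080, 60)          # ~124.4 million pixels/s
full_30 = pixels_per_second(1920, 1080, 30)          # ~62.2 million pixels/s
letterboxed_30 = pixels_per_second(1920, 900, 30)    # ~51.8 million pixels/s

print(f"1080p@60:    {full_60 / 1e6:.1f} Mpx/s")
print(f"1080p@30:    {full_30 / 1e6:.1f} Mpx/s (50% of 1080p@60)")
print(f"1920x900@30: {letterboxed_30 / 1e6:.1f} Mpx/s "
      f"({letterboxed_30 / full_30:.0%} of 1080p@30)")
[/code]

That is part of why halving the frame rate is such a tempting lever: it buys far more headroom than modest resolution tweaks do, at the cost of the smoothness this whole thread is arguing about.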

How will a high-end game that comes out at, say, the end of 2016, or sometime in 2017 or 2018, one that really pushes the envelope, cope with the same Xbox One or PS4 hardware? We know that consoles are very popular and that a large portion of developers will target them, either as a primary platform or as a port, because there's money in it. But how are they going to get such games to even run on them? They can't make computing resources appear out of thin air of course, hehe. So they really have no choice but to either reduce the complexity and resource consumption of their engine enough to run on the lesser console hardware, and thus dumb down the PC version as well to keep a single code base, or maintain two variants of their games, which some do. Smaller studios are likely to keep a single code base, though, as they have fewer developers to maintain it. As time goes on, the problem simply becomes a wider and wider divergence between the cutting-edge PC version of such a powerhouse game and its console equivalents.

So dropping the frame rate does make business sense, because it is probably the easiest possible change to make in terms of coding. The fact that it makes games less playable and less enjoyable is an unintended side effect, of course, but it's still the end result.

But if a hot new 2017 title comes out that is very demanding, and they don't want to dumb down the graphics for consoles, what then? Do they drop the frame rate to 20FPS, 15FPS, maybe 10FPS? That's not really a question, of course, because IMHO doing so would be a terrible thing to even consider, and I don't think the majority of gamers would put up with it at all. They can only drop the frame rate so far before more people are actively and vociferously complaining about it than not. I think 30FPS is that low point, and anything below it would be game suicide, personally. But then, if they don't drop the frame rate on these newer games, how do they make them run on those consoles at all? The only other low-cost way is to reduce the on-screen pixel count: drop the resolution, cull lines from the image and effectively letterbox it, or perhaps even remove every second/third/fourth line for a retro "scanline" effect. None of these options is really acceptable to me personally as a gamer, but then I don't play consoles, so I'm not the person they'd have to contend with no matter how they solve the problem. :)

I've no problem with a game running at a low frame rate due to insufficient hardware, so long as the same game is technically capable of running at a higher frame rate on properly capable hardware. For example, I ended up playing The Witcher 3 mostly at 25-30FPS. That wasn't great, but it wasn't as bad an experience as I've had with most low-frame-rate games in the past, thanks to the motion-blur effect tricking the brain. I'm not upset with the game or CDPR, because I know that if I upgrade my hardware down the line I can get it to run at 60FPS no problem; the game isn't artificially limited.

What annoys me, though, is a game purposefully designed to run at a maximum frame rate of 30FPS or lower, with no option for a higher frame rate on any hardware at all. That's just terrible and/or lazy design IMHO, and I'd rightfully be upset to find a game that did it. The only difference on consoles is that they are a fixed hardware platform, so there is understandably no incentive to make the game scale to different hardware. On consoles, the developer must compromise on features to make the game run, whereas on a PC/Mac/Linux platform the developer can choose a minimal hardware configuration and have the game scale up or down to whatever hardware one has above that.

To be honest, though, I think this problem is going to solve itself in the next few years, because I personally think VR is going to be a big thing in the marketplace. VR is both high resolution and high framerate, so it will dictate what both computers and consoles need under the hood to play these games. If VR proves as popular as I think it will, console makers will put out new consoles that are much more powerful, they'll have to update them more often to keep up with current game demands, and consumers will have to upgrade their console platforms if they want to ride the bleeding edge of gaming. I'm sure some game devs will still put out updated or remade games like "Dr. Mario" to give the less capable consoles something to do, though. :)
avatar
JMich: ... Do you also notice the difference between 30 and 60 fps on 4X TBS and/or adventure games, or only on games that require you to focus on them a bit more? Then again, not sure how many TBS and/or adventure games support 60 fps.

Just curious, and question is open to all of course, not addressed specifically to you.
I do, but it is much less evident than in other games; there, I really have to observe the characters carefully to notice it - it becomes clearly apparent only when the background scrolls. The general rule for me is that the more "moving things" there are, the more I see the difference; hence why I can play Commandos (24fps) or the Infinity Engine games (30fps) for more than an hour without feeling sick, but I absolutely cannot play a first-person game at 30fps or less for more than 10 minutes.

Since I started playing more on the PC (around three years ago), I think my perception has also improved: I tried an "experiment" with a friend using a frame limiter, and I can guess the framerate in steps of 10 up to 60.
Playing on the PC has also worsened my problem, though. I have always felt the "side effects" of a low frame rate and needed many breaks during play, but now I feel them much more, so much so that I can hardly play on my old console any more without nausea assaulting me. Sometimes (though rarely), I get the same feeling even from movies, especially during long panning shots.

Edited for typos.
Post edited January 04, 2016 by Enebias
avatar
darthspudius: First time I heard this kind of crap was when people tried to tell me The Hobbit movies running at 48fps was nothing but a cheap tactic to get people viewing. Those movies looked magnificent at high frames per second and I have disliked watching the more primitive movies since.
avatar
jamyskis: I too saw the first Hobbit film in the cinema in 3D and at 48fps and it looked absolutely dreadful. I hate 3D at the best of times, but the high frame rate meant that everything - the CGI in particular - looked horribly artificial. It actually had the same unpleasant sensation that the frame interpolation that seems to be set by default on many modern TVs gives you. I have the 2D Extended Edition on Bluray and I find it so much more pleasant to watch.

The standard 24fps that most films use helps with the suspension of disbelief that many films require nowadays, and the same applies to gaming. Sad as it is, in gaming, high frame rates and higher resolutions tend to do a brilliant job of exposing the flaws in 3D model assets. Quake looked brilliant back in 1996 at a resolution of 640x480. In fact, it still has a certain appeal even today on modern hardware running at the same resolution. But crank it up to 1920x1080 and you'll notice all the jagged edges of the models and environment all the more. And at high frame rates, 3D model animation comes across as more artificial and very stiff - the brain instinctively does a better job of interpolating the animation frames than a computer does. The less detail you feed the brain, the more realistic an image the imagination can form.

As for the 30fps vs. 60fps debate - people shouldn't confuse "don't notice" with "don't care". I of course prefer 60fps, but I honestly prefer a robust 30fps with a good level of visual detail over a silky smooth refresh, but I'm not much of a twitch gamer beyond Street Fighter 4.
CGI in general looks horribly artificial; I do not know why it would be any different in this setting. I also saw the films in a standard viewing - oh, the blurriness. I'd rather do without that these days. It is tiring and needs to be phased out already.
So to sum it all up, it's corporate bullshit trying to tell us what does and doesn't look good.
Because when somebody says to me "30fps is plenty, why do you need more?", I facepalm.
Personally I don't think even 60fps is 'silky' smooth, but don't get me wrong, it's easily playable and I play many of my games at 60fps.

Thanks for the replies guys, been an interesting read to say the least!
avatar
TheCycoONE: Has anyone that claims to be able to notice a big difference done the experiment 'blind'? Placebo effect is very real, so if you know there should be a difference and expect one to look better than the other it's very likely you could perceive such a difference even when it isn't there.

A double blind experiment would be difficult, but it should be simple enough to set up a single blind. Have a friend set the fps, test and check which fps you think it is. Repeat a dozen or so times, and limit communication during / between tests so that you're not sharing clues unintentionally.
The first time I saw a game running at 60fps (Virtua Fighter 2), I didn't know what was going on. Frame rate wasn't something people really talked about back then (3D was pretty new), and I had to do some research to figure it out. I didn't know about the game, that it was 60fps, what 60fps even was, or that it would be there. I couldn't even tell what game it was from across the room, but I could see it was different.

I also used to play "guess the frame rate" with myself: I would tinker with graphics settings and guess what FPS I was getting. It worked best with UT2004, which made toggling the stats easy, and a CRT set to 85Hz. I could pretty reliably tell the difference when I was getting frames in the 30s, 40s, 50s, and when I was over 60fps. I even got really frustrated once, after spending a lot of time tuning my UT2004 settings to get something over 100fps, when I noticed it immediately kept dropping while playing online. I think it dropped to 55fps, which I noticed not because I had the FPS counter up, but because I could see the difference. I wasn't expecting the difference to occur, and I was able to confirm after the fact that my perception was accurate.

Placebo is a thing, but I'm quite comfortable saying so is the difference between 30 and 60. In fact I would go as far as to say it's probably more normal to be able to tell the difference than not.

It is also possible to use YouTube for a quick test yourself. It's not ideal, but you can manually toggle a 60fps video's quality settings to switch between an HD feed running at 60fps and an SD feed running at 30fps for the same content. Here is one video that works for that.
https://www.youtube.com/watch?v=vxaSEKx9MI4

Be sure to have "Stats for nerds" turned on to confirm that the feed you select is actually 60fps; Auto quality may not serve you 60.

avatar
afarrah20: So to sum it all up, it's corporate bullshit trying to tell us what does and doesn't look good.
Because when somebody says to me "30fps is plenty, why do you need more?", I facepalm.
Personally I don't think even 60fps is 'silky' smooth, but don't get me wrong, it's easily playable and I play many of my games at 60fps.

Thanks for the replies guys, been an interesting read to say the least!
I think the stuff coming from the devs is them making excuses for why they can't get the game running at 60fps. They need a game to look real sexy in the screenshots, and getting a game to look great at 60fps isn't nearly as easy as it is at 30fps.

There is always TB's take on this: https://www.youtube.com/watch?v=eXJh9ut2hrc
It links to this one, which is quick and sweet:
https://www.youtube.com/watch?v=YCWZ_kWTB9w
Post edited January 04, 2016 by gooberking
It's like any of these made-up so-called "truths", like "most people cannot tell the difference between Pepsi and Coke" or "artificial sweetener tastes the same as sugar to most people".

And for all we know, there could be a study - something along the lines of showing two videos one after another and asking whether people saw any difference, without telling them what to look for (that only the frame rate had changed). Under those conditions, I could see people saying the videos were the same.
avatar
afarrah20: So I see a lot of this "human eye can't detect past 30fps" rubbish online and sometimes "it can't see past 60fps".
Where did that all start and why do people continue to say it when it isn't true? What scientific evidence are they basing their findings on? Or is it a case of "it was on the internet...it must be true Hallelujah!"?

Because I can conclusively prove it 100% false, since I did the test in Borderlands 2

30fps to 60fps - could easily tell the difference

60fps to 72fps - meh

72 to 120fps - another big difference, makes 60 look like 30

120 to unlimited (my monitor maximum of 144) - I was surprised to see a small flicker of difference, nothing I'd care about though.

Can we lay this rumour to rest by the power of the PC community?
Supposedly the human mind can only process 40 moments per second, according to quantum consciousness theory. I don't know enough about quantum mechanics to defend or rebut said theory - except that it probably involves a lot of undead zombie cats.
avatar
TheCycoONE: Has anyone that claims to be able to notice a big difference done the experiment 'blind'? Placebo effect is very real, so if you know there should be a difference and expect one to look better than the other it's very likely you could perceive such a difference even when it isn't there.

A double blind experiment would be difficult, but it should be simple enough to set up a single blind. Have a friend set the fps, test and check which fps you think it is. Repeat a dozen or so times, and limit communication during / between tests so that you're not sharing clues unintentionally.
This is probably closer to what you're looking for, though it covers 120 vs 60. Taking both parts together, it seems to indicate that people do vary, and that at least some can tell 60 from 120. And if it's possible to see the difference between 60 and 120, it's probably even more true for the 30 to 60 split.
https://www.youtube.com/watch?v=a2IF9ZPwgDM&feature=iv&src_vid=yWEpIwNDeCA&annotation_id=annotation_169323
I'd guess there's a fair bit of variability out there in what people are capable of. It's, funnily enough, the case in my immediate family. Both my father and my brother have very good hearing - 'perfect pitch' - noticing details in sound and music that neither I nor my mother perceive. My mother, in contrast, has fairly bad eyesight (she needs thick glasses) but a far better sense of smell than the rest of us. She can identify people by their smell, to the point that, when she was younger and working in a bank, she could tell who had handled the money before her just by the smell of the notes. I am the only one who doesn't wear or need glasses, and I have (in one eye) 20/20 vision.

In other words, I'd absolutely believe that some people can tell and notice a difference beyond 30fps while others cannot. Our brain does a lot of work 'fixing' our eyesight; what we see is pretty heavily 'processed' and 'interpreted', even in basic aspects like focus, 3D vision, and differences in 'calibration' between the eyes. I'm left-eye dominant, for example, but my brain corrects the 'faulty' perspective of my right eye automatically. Some people might simply have a slightly higher 'mental refresh rate' for their visual sense.