
Breja: But maybe something's just wrong with me. For example I can hardly see even the "best" 3D in movies.
Wrong is such a strong word. I prefer different. As for 3D, I see the appeal but it's nothing special to me. In fact, I watched The Hobbit in 48FPS and 3D, and I would have preferred 24FPS and no 3D. The 48FPS gave it a soap opera effect that really kills immersion for me, and the 3D makes everything look papery somehow. I would also have preferred a different movie, but that's a different discussion. :-)

mistermumbles: but anything over 60 is overkill to me. That's smooth enough for me.
I wouldn't know, I don't dare try it as it might raise my standards lol, but yes, 60FPS is good enough for me too.
I think it's like audiophiles and frequencies above 48 kHz. They believe it's better, yet there isn't really a perceivable difference. It's all in your mind. Otoh, I now read on Wikipedia that humans are supposed to perceive things at around 75-150 fps. So I guess you might see something up to 75. After that, probably only a minority would see the difference.
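
To put rough numbers on why the gains taper off, here's a quick back-of-the-envelope sketch (Python, nothing fancy): it just prints how many milliseconds each frame lasts at common framerates.

```python
# Frame time at common framerates: each step up saves fewer
# milliseconds per frame, one way to see the diminishing returns.
for fps in (24, 30, 60, 75, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```

Going from 30 to 60 fps shaves 16.7 ms off every frame, while going from 60 to 120 saves only 8.3 ms, and 120 to 144 barely 1.4 ms, so each step up is objectively smaller than the last.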
Post edited January 03, 2016 by blotunga
I was so disappointed when my friend said: "I won't play The Witcher 3 because I couldn't run it at 60 fps" :(

Personally, I capped it at 30 fps with high settings, and enjoyed the game.
Post edited January 03, 2016 by Random_Coffee
I presume it comes from movies, like "see how that 24fps movie (with motion blur) looks totally smooth, so yeah, it is as smooth as it gets and you can't see past that, right?". I haven't tried past 60 fps, but yeah, I can definitely see and feel the difference easily when playing e.g. Team Fortress 2 at 30 fps versus 60 fps.

I feel 30 fps is generally fine for me (especially for single-player games), but 60 fps feels smoother, no question about it. In practice, that means that for single-player games (say, The Witcher series and such) I try to crank up graphics options just enough so that I reach a (hopefully consistent) 30 fps.

For competitive multiplayer games, I'd rather crank down graphics options and resolution so that I get a consistent 60 fps. If that is not possible, then 30 fps is fine.

I am referring to 3D FPS/action games mainly. 2D RPGs, adventure, strategy games etc. are fine with much lower framerates.
Post edited January 03, 2016 by timppu
Random_Coffee: I was so disappointed when my friend said: "I won't play The Witcher 3 because I couldn't run it at 60 fps" :(
I've finished TW3 at an average of 40 fps and I was quite satisfied with it.
Nirth: Check this

Anyone here that does not see the difference between 30FPS and 60FPS? Excluding faulty or low-end hardware.
In that particular example, I actually did have problems discerning the difference between 30 and 60 (15 was easy to tell apart). Maybe 60 fps was slightly smoother, but only slightly.

However, I think that is simply because the green object in that test was moving so slowly on the screen. The faster the movement, the easier it is to tell.

I recall seeing some other comparisons online where the difference was far more apparent.
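
For anyone who wants to try that at home, here's a minimal sketch of such a test (assuming Python with pygame 2.x installed; the window size and speed are made-up values for illustration): a block crosses the screen at a fixed speed in pixels per second, and SPACE toggles the framerate cap between 30 and 60.

```python
import pygame

# UFO-style smoothness test: a block moves at a fixed speed in
# pixels/second; press SPACE to toggle the cap between 30 and 60 fps.
pygame.init()
screen = pygame.display.set_mode((960, 240))
clock = pygame.time.Clock()
caps, cap_i = (30, 60), 0
x, speed = 0.0, 480.0  # position and speed in pixels/second

running = True
while running:
    dt = clock.tick(caps[cap_i]) / 1000.0  # seconds since last frame
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            cap_i = 1 - cap_i
    x = (x + speed * dt) % 960  # same distance per second under either cap
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (0, 255, 0), (int(x), 100, 40, 40))
    pygame.display.set_caption(f"{caps[cap_i]} fps cap")
    pygame.display.flip()
pygame.quit()
```

Because movement uses speed * dt, both caps cover the same distance per second; at 480 px/s the block jumps 16 px per frame at 30 fps but only 8 px at 60 fps, which is exactly why faster movement makes the difference easier to spot.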
afarrah20: So I see a lot of this "human eye can't detect past 30fps" rubbish online and sometimes "it can't see past 60fps".
Where did that all start and why do people continue to say it when it isn't true?
I assume it was someone who saw my list of people to throttle and decided it wasn't long enough.
timppu: In that particular example, I actually did have problems discerning the difference between 30 and 60 (15 was easy to tell apart). Maybe 60 fps was slightly smoother, but only slightly.
Me too. I guess the early research is still relevant: 24 is the limit. Anything below it is very noticeable, but the further you go above it, the less noticeable the difference is to an untrained eye. So going from 20 fps to 30 fps is definitely an improvement, but going from 30 fps to 60 fps will not get many people excited.
Actually, 24 was an arbitrary middle point between 22 and 26, the framerates of early silent movies. Even Edison proposed 48 fps, and 72 fps has also been tested for cinema. However, above 75 most of us won't see any difference.
Generally 30fps is fine as long as it's consistent, but if a racing game is limited to 30 fps I won't play it, as you get no sense of speed and it is just painful to play.
BlackDawn: If the only game you play is Minesweeper, then even the concept of frames per second is ridiculous :)
It's the theoretical difference between 30 or 60 moves per second.
From what I can remember at the time (I was a young teen), 30 fps was basically just a target you weren't going to get beyond. Games were synced to the typical monitor refresh rate of 60Hz, and the graphics cards just weren't good enough to hit 60.

I think at the time people tended to say that 30fps was all you needed, rather than all you were able to perceive. It's also worth noting that back then textures were lower-res and everything was smaller, which might have contributed to the belief that it wasn't worth chasing 60fps.

My unsubstantiated theory - it was to justify old graphics cards running the new FPS games of the time like Quake and Unreal.
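
As a side note on that 60Hz syncing: assuming plain double-buffered vsync (a simplification, since triple buffering behaves differently), a finished frame has to wait for the next refresh, so effective framerates snap to integer divisors of the refresh rate. A quick sketch:

```python
import math

# With double-buffered vsync on a 60 Hz display, a completed frame
# waits for the next refresh, so the effective framerate snaps to
# an integer divisor of the refresh rate (60, 30, 20, 15, ...).
REFRESH_HZ = 60
SLOT_MS = 1000 / REFRESH_HZ  # one refresh interval, ~16.7 ms

for render_ms in (10, 17, 20, 34, 50):
    slots = max(1, math.ceil(render_ms / SLOT_MS))  # refreshes consumed
    print(f"{render_ms} ms render time -> {REFRESH_HZ / slots:.0f} fps")
```

Just missing the 16.7 ms budget drops you straight from 60 to 30 fps, which would make 30 a natural fallback target for cards that couldn't quite hit 60.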
Whenever these debates come up, I can't help but get a bit disillusioned, even sickened, by what I know to be a significant factor in them: the influence of people who spent more on their PC hardware flaunting it as an ego thing or status symbol. And that's fine... I guess. But when it blurs the line between science and social signalling, it starts to irritate a bit.

Before we had 120Hz and 144Hz displays, people would go on about high FPS but nobody would talk about tearing. That was just unbelievable. I hate tearing. I can put up with a lower framerate, but I hate tearing. So these people would choose tearing over lower FPS? What?

I have a decent but now older monitor with nice low latency, but it doesn't do high refresh rates. I'm completely happy with vsync at 60 and never feel like the motion is anything but fluid. Whereas sometimes in a 30fps game you can detect the frames passing, at 60 it never bothers me. So it's not a problem.

These days, nice AA and lighting effects, combined with the motion blur applied when you move the camera quickly, pretty much destroy any framerate-related dissonance at 60fps, if any even exists.
Strijkbout: Generally 30fps is fine as long as it's consistent, but if a racing game is limited to 30 fps I won't play it, as you get no sense of speed and it is just painful to play.
BlackDawn: If the only game you play is Minesweeper, then even the concept of frames per second is ridiculous :)
Strijkbout: It's the theoretical difference between 30 or 60 moves per second.
Pretty sure I used to play Episode I Racer at sub-60. Felt blisteringly fast.
Post edited January 03, 2016 by johnnygoging
afarrah20: So I see a lot of this "human eye can't detect past 30fps" rubbish online and sometimes "it can't see past 60fps".
Where did that all start and why do people continue to say it when it isn't true? What scientific evidence are they basing their findings on? Or is it a case of "it was on the internet...it must be true Hallelujah!"?

Because I can conclusively prove it 100% false: I did the test in Borderlands 2

30fps to 60fps - could easily tell the difference

60fps to 72fps - meh

72 to 120fps - another big difference, makes 60 look like 30

120 to unlimited (my monitor maximum of 144) - I was surprised to see a small flicker of difference, nothing I'd care about though.

Can we lay this rumour to rest by the power of the PC community?
Yup, it's all pure marketing bullshit by gaming companies coming out with new high-end games and trying to run them on the newest generation of consoles and their complete utter crap hardware. They're unable to get their amazing game/engine to work at a reasonable framerate without majorly dropping features and stalling development, and they have shareholders to please. So what do they do? Simple: try to start a new fad by saying that something that sucks ass is awesome and convince all the stupid people it's true so they eat it up hook, line and sinker.

1990s: Vinyl sounds way better than CDs and 5.1 surround!
2000s: Gold plated audio cables produce better audio!
2015: Games are way better at 30FPS!
*: Highly diluted "holistic medicine"

What a load of shit.

Such complete and utter crocks of shit it isn't funny. The best frame rate a game could possibly have is infinity, simulating analog real-world light transfer from a light source directly into the human eye. Anything less than infinity is an increasing degree of shit, mostly not very noticeable to the human eye until we get under a couple hundred Hertz, where it becomes more and more noticeable the further down you go. I play some games at 30FPS, but only because that's all I can get out of them with my current hardware; 30FPS sucks ass no matter how you slice it. Not quite as bad as vinyl audio though, or listening to music on a tube amplifier or some other nonsense like that with all kinds of hocus-pocus pseudoscience fantasy.

For high-level action games where there is a lot of motion, in particular FPS/TPP games with a lot of horizontal motion, 45FPS is reasonably tolerable in many games, 60FPS is IMHO the minimum for a solidly decent experience, and getting higher rates on a display capable of it, such as 120Hz, would be greatly desirable.

I rolled my eyes when some people complained about Peter Jackson using 48FPS cameras for The Hobbit movies, claiming it wasn't as good, blah blah blah. Nonsense there too. 349729384729347234234FPS cameras would have been even better.

The only games where the frame rate doesn't really matter are games that don't have a lot of movement on screen and where visual latency isn't really an issue. Games like Microsoft Solitaire and Minesweeper, for example. Those would probably work well even at 5FPS, although the card shuffle would look a little weird in Solitaire. :)

What blows my mind is the fact that people even argue about these things. Logic and science solve the problem pretty reliably. If 60FPS is "ok" and 30FPS is "way better" then 15FPS would be even better! But why stop there? 0.0001FPS would be the best game ever. You just have to wait 300 years (pulled that number out of my ass, but if some mathematician wants to figure it out, be my guest. <grin>) in between screen updates, but it makes the game so much more awesome, especially if you turn on the audio option to simulate vinyl audio!

:oP
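
For what it's worth, taking up that "some mathematician" invitation (trivial arithmetic, sketched in Python):

```python
# Time between frames at the satirical 0.0001 FPS figure:
frame_interval_s = 1 / 0.0001     # 10,000 seconds per frame
print(frame_interval_s / 3600)    # ~2.78 -> about 2 hours 47 minutes, not 300 years
```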
blotunga: Actually, 24 was an arbitrary middle point between 22 and 26, the framerates of early silent movies. Even Edison proposed 48 fps, and 72 fps has also been tested for cinema. However, above 75 most of us won't see any difference.
The problem with higher framerates for movies is that, as the article I posted earlier mentions, our eyes are able to see much finer detail in the image at higher framerates. Among other things, that means we can suddenly see that the surface of that castle wall isn't actually stone, it's just painted plywood. As such, higher framerates require much higher standards for sets, costumes and special effects than lower framerates do, or the movie will look awful to the audience.
skeletonbow: What blows my mind is the fact that people even argue about these things. Logic and science solve the problem pretty reliably. If 60FPS is "ok" and 30FPS is "way better" then 15FPS would be even better! But why stop there? 0.0001FPS would be the best game ever.
If I eat 1 meal a day, I am able to carry out my day's work while feeling a bit tired. If I up that to two square meals a day, I'm much better off. If I therefore up that to 10 meals a day, I will be very much better off.

It's simple logic, people! Why can't you understand this?