afarrah20: So I see a lot of this "human eye can't detect past 30fps" rubbish online and sometimes "it can't see past 60fps".
Where did that all start and why do people continue to say it when it isn't true? What scientific evidence are they basing their findings on? Or is it a case of "it was on the internet...it must be true Hallelujah!"?
Because I can conclusively prove it 100% false; I did the test myself on Borderlands 2 (rough frame-time math below):
30fps to 60fps - could easily tell the difference
60fps to 72fps - meh
72 to 120fps - another big difference, makes 60 look like 30
120 to unlimited (my monitor maximum of 144) - I was surprised to see a small flicker of difference, nothing I'd care about though.
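For anyone who wants numbers behind that test, here's a quick back-of-the-envelope sketch (plain Python, nothing pulled from any game or engine, the rates are just the steps above) converting each rate into milliseconds per frame. The shrinking frame-time savings are one plausible reason 60 to 72 feels like "meh" while 72 to 120 still feels like a big jump.

```python
# Rough frame-time math for the jumps tested above (not tied to any game/engine).
rates = [30, 60, 72, 120, 144]  # the FPS steps from the Borderlands 2 test

for prev, curr in zip(rates, rates[1:]):
    ms_prev = 1000.0 / prev  # milliseconds per frame at the lower rate
    ms_curr = 1000.0 / curr  # milliseconds per frame at the higher rate
    print(f"{prev} -> {curr} FPS: {ms_prev:.1f} ms -> {ms_curr:.1f} ms per frame "
          f"(saves {ms_prev - ms_curr:.1f} ms)")

# Prints:
# 30 -> 60 FPS: 33.3 ms -> 16.7 ms per frame (saves 16.7 ms)
# 60 -> 72 FPS: 16.7 ms -> 13.9 ms per frame (saves 2.8 ms)
# 72 -> 120 FPS: 13.9 ms -> 8.3 ms per frame (saves 5.6 ms)
# 120 -> 144 FPS: 8.3 ms -> 6.9 ms per frame (saves 1.4 ms)
```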
Can we lay this rumour to rest by the power of the PC community?
Yup, it's all pure marketing bullshit from gaming companies coming out with new high-end games and trying to run them on the newest generation of consoles and their complete utter crap hardware. They're unable to get their amazing game/engine to run at a reasonable framerate without majorly dropping features and stalling development, and they have shareholders to please. So what do they do? Simple: start a new fad by saying that something that sucks ass is actually awesome and convince all the stupid people it's true so they eat it up hook, line, and sinker.
1990s: Vinyl sounds way better than CDs and 5.1 surround!
2000s: Gold plated audio cables produce better audio!
2015: Games are way better at 30FPS!
*: Highly diluted "holistic medicine"
What a load of shit.
Such a complete, utter crock of shit it isn't funny. The best frame rate a game could possibly have is infinity, simulating analog real-world light travelling from a light source directly into the human eye. Anything less than infinity is an increasing degree of shit; it's mostly not very noticeable to the human eye until you get under a couple hundred hertz, and from there it gets more and more noticeable the further down you go. I play some games at 30FPS, but only because that's all I can get out of them with my current hardware; 30FPS sucks ass no matter how you slice it. Not quite as bad as vinyl audio, though, or listening to music on a tube amplifier or some other nonsense like that with all kinds of hocus-pocus pseudoscience fantasy.
For fast action games where there is a lot of motion, in particular FPS/TPP games with a lot of horizontal motion, 45FPS is reasonably tolerable in many games, 60FPS is IMHO the minimum for a solidly decent experience, and higher rates on a display capable of them, such as 120Hz, are greatly desirable.
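On the horizontal-motion point, here's a rough illustration of why panning is where low frame rates hurt most. All the numbers are assumptions picked for the example (180 degrees/second pan, 1920-pixel-wide screen, 90-degree horizontal FOV, simple linear mapping of FOV to pixels), not measurements from any real game; the point is just how far on-screen content jumps between consecutive frames at different rates.

```python
# Hypothetical numbers: a steady horizontal camera pan, linear mapping of FOV to pixels.
PAN_DEG_PER_SEC = 180.0   # assumed pan speed (a quick turn)
SCREEN_WIDTH_PX = 1920.0  # assumed horizontal resolution
H_FOV_DEG = 90.0          # assumed horizontal field of view

px_per_degree = SCREEN_WIDTH_PX / H_FOV_DEG  # ~21.3 px per degree (ignores projection effects)

for fps in (30, 45, 60, 120, 144):
    deg_per_frame = PAN_DEG_PER_SEC / fps         # how far the view turns each frame
    px_per_frame = deg_per_frame * px_per_degree  # how far on-screen content jumps each frame
    print(f"{fps} FPS: view moves ~{px_per_frame:.0f} px between frames")
```

The linear mapping ignores perspective distortion, but the trend holds: the same pan makes roughly 128-pixel jumps at 30FPS and about 27-pixel jumps at 144FPS, which is a big part of why high rates look so much smoother in motion.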
I rolled my eyes when some people complained about Peter Jackson shooting The Hobbit movies at the higher 48FPS rate, claiming it wasn't as good, blah blah blah. Nonsense there too. 349729384729347234234FPS cameras would have been even better.
The only games where the frame rate doesn't really matter are games that don't have a lot of movement on the screen and where visual latency isn't really an issue: games like Microsoft Solitaire and Minesweeper, for example. Those would probably work fine even at 5FPS, although the card shuffle would look a little weird in Solitaire. :)
What blows my mind is the fact that people even argue about these things. Logic and science solve the problem pretty reliably. If 60FPS is "ok" and 30FPS is "way better", then 15FPS would be even better! But why stop there? 0.0001FPS would be the best game ever. You'd just have to wait 300 years (pulled that number out of my ass, but if some mathematician wants to figure it out, be my guest <grin>) between screen updates, but it makes the game so much more awesome, especially if you turn on the audio option to simulate vinyl audio!
:oP
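Taking up the invitation to do the math on that joke number: the wait between frames is just 1/FPS seconds, so at 0.0001FPS it works out to about 10,000 seconds, roughly 2.8 hours per frame, rather than centuries. A throwaway sketch:

```python
# Quick sanity check on the "0.0001FPS" joke: time between frames is just 1/FPS.
fps = 0.0001
seconds_between_frames = 1.0 / fps  # 10,000 seconds
hours_between_frames = seconds_between_frames / 3600.0

print(f"At {fps} FPS you wait {seconds_between_frames:,.0f} s "
      f"(~{hours_between_frames:.1f} hours) between frames.")
# -> At 0.0001 FPS you wait 10,000 s (~2.8 hours) between frames.
```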