
skeletonbow: I rolled my eyes when some people complained about Peter Jackson using 60FPS cameras for The Hobbit movies claiming it wasn't as good blah blah blah. Nonsense there too. 349729384729347234234FPS cameras would have been even better.
In an objective sense, perhaps, but the movies would have looked even worse.
Some interesting responses, more than I thought, if I'm honest. Keep it going; it's interesting to see what people perceive. I'll check back tomorrow.
With as much talk as I've read about it, I think there are variances between individuals, but I've always needed to go over 60 to be happy. Back in the CRT days I would put my refresh rate up to 75 or 85Hz, and UT2004 would start to feel a lot smoother and my monitor would stop looking like it was humming. Now you have to invest in specific hardware to get over 60FPS, and I haven't been able to get there just yet. 60 is OK, but I feel like it's the bottom of what I want, and 30FPS is close to unplayable for a lot of games.

I remember the first time I saw a game running at 60FPS in an arcade. It just showed up one day, and I could see it from afar. It might as well have been on fire with how much it stood out from all the other games in the room. 30 doesn't look anything like 60 as far as I'm concerned.

I had a conversation about cameras with my granddad years ago. I worked in the photography industry (amateur and pro) for years. He was all excited about a little point-and-shoot he had been using and must have said "It takes the clearest pictures" ten times. When he showed me the pictures, every one of them was seriously lacking in sharpness. To him they were as clear as anything was going to be, but for me there was a problem. I'm fine with the idea that some people can't appreciate the difference between 30 and 60, but I don't want those people calling the shots on what the bar for quality is. There are way too many of us complaining about 30FPS for us to be dismissed.
skeletonbow: What blows my mind is the fact that people even argue about these things. Logic and science solve the problem pretty reliably. If 60FPS is "ok" and 30FPS is "way better", then 15FPS would be even better! But why stop there? 0.0001FPS would be the best game ever.
wpegg: If I eat 1 meal a day I am able to carry out my day's work while feeling a bit tired. If I up that to two square meals a day, I'm much better off. If I therefore up that to 10 meals a day, I will be very much better off.

It's simple logic people! Why can't you understand this?
That's a rather nonsensical orthogonal example.
wpegg: If I eat 1 meal a day I am able to carry out my day's work while feeling a bit tired. If I up that to two square meals a day, I'm much better off. If I therefore up that to 10 meals a day, I will be very much better off.

It's simple logic people! Why can't you understand this?
skeletonbow: That's a rather nonsensical orthogonal example.
I just wrote you a really long response, and it got eaten. I mean about six or seven detailed paragraphs. I'm sad about that, but will now provide the cut-down version, which you will then dispute; I will not bother responding, as I've just lost a quarter of an hour to this.

1. You believe that if you apply a linear effect to the human body it will result in a linear outcome. This is untrue of all biology.
2. When you present a 3D image, it is a lie. The pixel that is perceived as being 3 metres away is actually emitting light from where your screen is, and this causes various conflicts in the perception of the image. Increasing the strength of certain parts of this image can actually disrupt the lie you are trying to present.
3. There are various articles linked that show how the eye can be confused when you start presenting a high-fidelity image of something it's not expecting; please read them.

Sorry I can't go into more detail (again), but there's lots to read on this. The core point is that when your starting point is an illusion, you don't want to shine too much light on it.
Partly though, isn't this down to what you're used to?

I mean, yes, perhaps there is an appreciable difference with increased fps above 30, 45 or 60, but isn't it a bit like wine: if you've never had top-drawer wine, then you're going to be happy with the cheaper stuff.

Although, I do have to admit, I did struggle to see the difference between 30 and 60 on the laptop screen I'm on at the moment on that site test.
The eye can see beyond 30 FPS, and you have something wrong with your eyes if you can't.

Or your computer.
It amazes me that a game's entire worth should be reduced to two lonely digits.
Why bother with reviews at all? Just list their fps values and buy everything that's 60 or above. Simple.

Yes, I can sometimes see the difference between 30 and 60 fps, but so what?!
You can make Skyrim and Mass Effect 2 run at 120 fps, but I still won't like them.
I'd much rather play a game I enjoy at below 60 fps.

I only notice the difference in sim racing games; going from Assetto Corsa or Stock Car at 60 fps on my PC to Gran Turismo 5 on my PS3, there's a noticeable difference. Will I stop buying racing games just because they are below 60 fps? Absolutely not; my priority is content.

In action games and shooters there's so much going on on the screen and in the sound mix that I genuinely don't notice any difference; there are simply too many things to look at and pay attention to.

You need to realize that not everybody is the same; we are not robots who experience the same things in exactly the same ways, we are all built differently. Some see a difference, others don't, and it doesn't f'n matter who sees what and who doesn't.
Post edited January 04, 2016 by Ricky_Bobby
skeletonbow: That's a rather nonsensical orthogonal example.
wpegg: I just wrote you a really long response, and it got eaten. I mean about six or seven detailed paragraphs. I'm sad about that, but will now provide the cut-down version, which you will then dispute; I will not bother responding, as I've just lost a quarter of an hour to this.

1. You believe that if you apply a linear effect to the human body it will result in a linear outcome. This is untrue of all biology.
2. When you present a 3D image, it is a lie. The pixel that is perceived as being 3 metres away is actually emitting light from where your screen is, and this causes various conflicts in the perception of the image. Increasing the strength of certain parts of this image can actually disrupt the lie you are trying to present.
3. There are various articles linked that show how the eye can be confused when you start presenting a high-fidelity image of something it's not expecting; please read them.

Sorry I can't go into more detail (again), but there's lots to read on this. The core point is that when your starting point is an illusion, you don't want to shine too much light on it.
Actually, I've got a background in video driver development and graphics programming, and am well aware of how the eye perceives computer graphics and the science behind it. I'm also well aware that lower frame rates cause increased eye strain and increased latency between what happens and what the brain perceives, and can cause a lot of people to experience motion sickness/vertigo and other unpleasant feelings. I don't experience vertigo or motion sickness myself from low frame rates, but I do experience a dramatic decline in the playability and enjoyability of high-action, high-motion games at low frame rates.

I'll gladly dispute any day the idea that 30FPS is better for any game in any way, shape or form compared to the same game running at 60FPS or higher. In fact, with the upcoming move to virtual reality gear, where the display is mounted extremely close to the eye, 60FPS or more is mandatory in order to avoid motion sickness, headaches and other problems. If the slow/low-end/inferior hardware that makes up the PlayStation 4/Xbox One and/or older consoles ever wants to handle VR properly, games will end up having tonnes of features disabled/removed/dumbed down in order to keep the framerate up to the standard VR will require, or there will be a lot of games returned to retailers.

30FPS doesn't just suck, it is downright irresponsible.
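The latency point above is easy to make concrete: at a given frame rate, each frame stays on screen for 1000/fps milliseconds, and input that arrives just after a frame starts waits roughly a full frame (plus any buffered frames) before it can affect the display. A minimal arithmetic sketch; the one-buffered-frame assumption is illustrative, not a claim about any particular engine:

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame occupies on screen, in milliseconds."""
    return 1000.0 / fps

def worst_case_input_lag_ms(fps: float, buffered_frames: int = 1) -> float:
    """Rough worst case: input arriving just after a frame begins waits out
    that frame, plus any frames already queued for display (assumed: 1)."""
    return frame_time_ms(fps) * (1 + buffered_frames)

for fps in (30, 60, 120):
    print(f"{fps} FPS: {frame_time_ms(fps):.1f} ms/frame, "
          f"~{worst_case_input_lag_ms(fps):.1f} ms worst-case lag")
```

By this rough model, going from 30FPS to 60FPS halves both the frame time (33.3 ms to 16.7 ms) and the worst-case input-to-display delay, which is the "responsiveness" people report feeling rather than seeing.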
Darvond: The eye can see beyond 30 FPS, and you have something wrong with your eyes if you can't.

Or your computer.
Indeed. On digital flat panels, 60FPS is a rather solid baseline that games should strive to always hit on release for the majority of gamers, on every platform the game ships on, IMHO. With 120Hz-capable displays available now and FreeSync slowly catching on, I'd hope to see games push the framerate up to whatever the hardware can handle where possible, while keeping 60 as a universal quality floor. If someone has a system below the game's specs and gets a lower rate, that's fine too; it's the consumer's informed choice, and many games will be playable under 60FPS for many people. I find many games playable down to 40FPS, and some still playable but definitely greatly suboptimal down to 30FPS. I even found The Witcher 3 playable below 30FPS, as long as motion blur was enabled so I didn't experience the extreme jitter that ruins low frame rates for me.

I for one want my games to show me an on-screen experience that matches the experience of looking out of a window. If someone were to install one of those electronic shades in my window and toggle it at 30Hz while I look out of it, I'm not going to find that an improved experience of looking out of the window. Turn that frequency up, however, and the flicker diminishes as it climbs; persistence of vision takes over, removing any remaining flicker and making it unnoticeable above a certain frequency that varies from person to person. It's not really that much different with a computer display, even though the exact mechanics of the light aren't identical.
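The shade analogy can be put in numbers. Persistence of vision is usually described via a flicker-fusion threshold: below it an on/off cycle reads as flicker, above it the light appears steady. A toy sketch; the 60Hz default threshold is an illustrative assumption (real thresholds vary per person and viewing condition, often quoted in the 50-90Hz range):

```python
def appears_steady(toggle_hz: float, fusion_threshold_hz: float = 60.0) -> bool:
    """True if a light toggled at toggle_hz looks continuous to a viewer
    whose flicker-fusion threshold is fusion_threshold_hz (illustrative)."""
    return toggle_hz >= fusion_threshold_hz

print(appears_steady(30))   # 30Hz shade: visible flicker for this viewer
print(appears_steady(120))  # 120Hz shade: perceived as steady
```

The per-person threshold parameter is the whole point of the analogy: the frequency at which the window stops flickering is not a universal constant.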
Post edited January 04, 2016 by skeletonbow
You people who have eye degrees or whatnot: would an MRI pick up a difference in brain activity for someone who notices the difference between playing a game at too low a framerate versus an optimal one? Just curious whether the effect is psychological or not.

I remember getting a lot of adrenaline when I managed to get a game running at 60FPS that otherwise jerked around at 20-25FPS. I was very happy, and even more so when I finally got to play it the way I wanted to.
Smannesman: Saying "I see the difference" does not constitute proof.
It would be very cool if it did, though; everyone could have a PhD and work at NASA.

Hm ..."I see the difference" ... isn't that what Sarkeesian's "research" amounted to ?


BlackDawn: If the only game you play is minesweeper then even the concept of frames per second is ridiculous :)
What, [url=https://www.youtube.com/watch?v=GYhjvOPaSTY]this[/url] one?
Post edited January 04, 2016 by Ricky_Bobby
mistermumbles: I can certainly tell the difference, but anything over 60 is overkill to me. That's smooth enough for me.

As far as movies are concerned, I don't care at all for higher frame rates there, as they oftentimes make them look worse to me. *shrug*
Watching a movie, you're not interacting with anything; playing a video game, your brain is much more active, because you're doing more than simply collecting data through your eyes: you process it and turn it into action.

Which is why simply watching a video showing games at different fps rates is not informative.
The viewer is just observing and not interacting, artificially increasing the likelihood of detecting a difference.
Research based on interaction, i.e. actual gameplay, is more informative.
However, this too depends on many variables: the gaming background of the test person, how good or bad their eyesight is, how fast their brain can process visual information, and so on.
Ricky_Bobby: Research based on interaction, i.e. actual gameplay, is more informative.
Or you could try it yourself. After all, it's about personal experience.

Download Nvidia Inspector, MSI Afterburner, EVGA Precision or any other tool that can limit your frame rate. You could then even compare between different games or different engines.
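Those limiter tools all do some variant of the same thing: after rendering each frame, sleep out whatever is left of that frame's time budget. A minimal sketch of such a cap; the render_frame callable is a placeholder for illustration, not any real tool's API, and real limiters use busy-waits or driver hooks for tighter timing than time.sleep gives:

```python
import time

def run_capped(render_frame, fps_cap: float, n_frames: int) -> None:
    """Call render_frame n_frames times, sleeping so the loop never
    exceeds fps_cap frames per second."""
    budget = 1.0 / fps_cap          # seconds allotted to each frame
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget:        # frame finished early: wait out the rest
            time.sleep(budget - elapsed)

# Capping a trivial "frame" at 100 FPS should take ~0.01 s per frame.
t0 = time.perf_counter()
run_capped(lambda: None, fps_cap=100, n_frames=10)
print(f"10 frames took {time.perf_counter() - t0:.2f} s")  # roughly 0.10 s
```

For self-testing, this is handy because the cap is exact and repeatable: you can play the same scene at 30, 45, and 60 caps and compare, which is the personal experiment being suggested above.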
It’s one of the greatest gaming deceptions of our generation. The sheer gall of publishers to look their audience dead in the eyes and tell them the sky is purple. It’s such a ridiculous statement said so matter-of-factly that a lot of people end up doubting themselves when they look up. ‘Hang on, I think… yes I think I see a tinge of purpl-‘ No you don’t.

I recall Ubisoft used the most popular version of the lie when hyping the dismal Assassin's Creed Unity, comparing 60fps to being 'a bit like The Hobbit movie' and saying '30fps feels more cinematic.' Perhaps the misconceptions of some gamers come from looking at side-by-side comparison shots on YouTube and being unable to tell the difference. Games have to be played. Seeing the difference is one thing, feeling it is another; the smooth animations and responsiveness of the controls cannot be overstated. Sadly, framerate and resolution are the first on the chopping block while resources are dedicated to making the game look pretty for those promotional screenshots, despite that being far less important.
Ricky_Bobby: You need to realize that not everybody is the same; we are not robots who experience the same things in exactly the same ways, we are all built differently. Some see a difference, others don't, and it doesn't f'n matter who sees what and who doesn't.
Ahhh, it may not matter to someone who sees 30fps as good whether a game runs at 30, 60, 120 or what have you. It does matter a whole hell of a lot to the guy who is significantly bothered by 30fps.

If we actually acknowledge that people see things differently, then that strengthens the argument that everything should be shooting for at least 60fps. That way the 30fps people are happy and so are the 60fps people. When 30fps people come out saying they can't see the big deal, and that 60fps therefore shouldn't be a priority for anyone, they are making a decision for 60fps people that the 60fps people aren't going to be happy with.

That's the big weird thing about the issue. 30fps people keep saying there is no point to 60fps, even though 60fps doesn't really hurt them, while that rhetoric does hurt the 60fps crowd. Why does the 30fps crowd even care that there are people who would like 60fps to be a serious performance target?
Post edited January 04, 2016 by gooberking