Basically, the difference is just the number of images you can put in one second. More images mean a more fluid animation, but as DProject said, there is only so much the human eye can perceive.
But leaving games aside, you can see the difference more FPS makes in "The Hobbit", in its HFR version. The movement isn't faster, but it is more fluid; there's almost no need for motion blur to make things look more "real", and as an amateur animator I find that actually pretty cool. To me, too much motion blur is distracting.
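To put rough numbers on "more images per second", here's a quick Python sketch (mine, not from anyone in this thread) that just converts frame rates into how long each frame stays on screen; 48 fps is the Hobbit HFR rate mentioned above:

# Convert common frame rates into milliseconds per frame.
for fps in (24, 30, 48, 60):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} fps -> {frame_time_ms:.1f} ms per frame")

# Prints: 24 -> 41.7 ms, 30 -> 33.3 ms, 48 -> 20.8 ms, 60 -> 16.7 ms.
# Going from 30 to 60 fps halves the time each image sits on screen.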
Half.
Punished_Snake: Kojima said that, gameplay-wise, the old-gen and next-gen versions will be exactly the same. At first. Then he stated that each version comes with a specific ladder, because there are differences between old and next gen. I can't understand any of it :|
stg83: The best version of MGS V, graphically, will be the next-gen/current-gen one, i.e. PS4 and Xbone. Kojima can't come out and say that directly because he still wants the game to sell as much as it possibly can on every platform. For a comparative example, Watch Dogs on PS3/Xbox 360 looks different from the PS4 and Xbone version; you can hardly find any gameplay videos of the last-gen version, but people who have seen it confirm that the game looks much better on PS4 and Xbone.
Just found this:

http://www.metalgearinformer.com/?p=10710

If it's true, there's no problem: I'll grab the PS3 version now and the PS4 one later for 10€.
stg83: The best version of MGS V, graphically, will be the next-gen/current-gen one, i.e. PS4 and Xbone. Kojima can't come out and say that directly because he still wants the game to sell as much as it possibly can on every platform. For a comparative example, Watch Dogs on PS3/Xbox 360 looks different from the PS4 and Xbone version; you can hardly find any gameplay videos of the last-gen version, but people who have seen it confirm that the game looks much better on PS4 and Xbone.
Punished_Snake: Just found this:

http://www.metalgearinformer.com/?p=10710

If it's true, there's no problem: I'll grab the PS3 version now and the PS4 one later for 10€.
Hmm, yes, that is really good if true; then it wouldn't be a problem to get the PS3 version.
Austrobogulator: Have a look here.
Nice. 30 and 60 look very similar to me. Maybe my brain can only process 30 fps and throws the rest away?
Punished_Snake: So I preordered Metal Gear Solid Ground Zeroes for PS3, but now there is also a retail version for PS4.

I don't have a PS4 yet, but sooner or later I wanna get one, just to play games like Uncharted.

But I wanna ask a question: what is the difference between 30 and 60 fps?

I mean, 60 fps is supposed to be faster, but I've seen several videos on Youtube (Sleeping Dogs, Battlefield 3, Call of Duty Ghosts) and all the games look the same to me. Even the speed.

Kojima said that the PS4 version of Ground Zeroes runs at 60 fps and the PS3 one at 30, but I can't see any difference... am I blind?
http://en.wikipedia.org/wiki/Frame_rate

In other words, in a video game a higher frame rate smooths out the motion of fast-moving objects, but due to the nature of the beast it will never be good enough. In film, 46 fps is the minimum to prevent eye strain and 72 is standard. There is also a very small percentage of the population (less than 1%) whose brains process images faster; these people tend to react faster to visual stimuli and have trouble watching movies (to them it's just a bunch of flickering images).

Typically your brain adapts to the FPS it is being subjected to. So if you are someone who enjoys older games running at 15 FPS, then 30 FPS is noticeable but not profound, and 60 would be hard to spot in the first place. If you do play games at 60 FPS long enough for your brain to readjust, then 30 will look like crap, a bit choppy, much like going from high-quality analog sound to high-quality digital sound. Given that the average person can detect a period of darkness of 0.016 seconds, 62.5 FPS should be the point beyond which anything better is pointless, assuming FPS is all that matters (which it isn't).
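For what it's worth, that 62.5 figure is just the inverse of the 0.016-second number cited above (I'm taking that number at face value, not vouching for it). A one-line Python check:

# If the shortest detectable dark gap is 0.016 s, the matching frame rate is its inverse.
shortest_gap_s = 0.016        # figure quoted in the post above, not independently verified
print(1.0 / shortest_gap_s)   # 62.5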
A word of advice: the so-called 60 fps on "next gen" consoles isn't exactly true. Most games out on the PS4/Xbox One can reach 60 fps but dip to 40-45 much of the time, so don't expect anything revolutionary.

Also, 30 fps is really noticeable in PC games, where you sit right in front of the screen (especially if you play FPS games). The good thing about the human brain/eyes is that they adapt: if you switch to 30 fps, you won't notice any difference after a while, but you will notice fps dips quite clearly.
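To illustrate why dips stand out (the frame times below are made up, purely for illustration): the average frame rate can still look close to 60 even when a single long frame is what you actually feel.

# 59 smooth frames plus one 50 ms hitch: the average barely moves, but the hitch is obvious.
frame_times_ms = [16.7] * 59 + [50.0]
average_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000.0 / max(frame_times_ms)
print(round(average_fps, 1), worst_fps)   # 58.0 average, but only 20.0 fps during the hitch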

Also, buying a console for one game :p you must be rich or a die-hard fan. I hope the new MGS doesn't disappoint you, since Ground Zeroes is just a prologue.
DProject: As you probably know, FPS stands for Frames Per Second, i.e. how many still images you see per second. Movies usually have an FPS of 23.976, 25, or 29.97.

As for your question, what's the difference? None, in my opinion. I can play games just fine with the FPS "only" being 30. The human eye can't even react fast enough to see all 60 frames per second. Apparently, the higher the FPS the more realistic everything looks (or something): for racing games and first-person shooters people appreciate a higher FPS. But I think they're just full of shit. To me, it feels like the "Bit Wars" of the 90s, when people were comparing the number of bits without even knowing what they meant. Nowadays, if a game is "only 30 FPS", it's crap or whatever.
Not quite so simple:

http://www.100fps.com/how_many_frames_can_humans_see.htm
To answer with some personal experience, FPS really only makes a difference for me in two scenarios. I'm fine playing most things at 30 fps and don't really notice the difference between 30 and 60. The first scenario where it does matter is when it drops below 30; around 10-15 fps the game appears to stutter and you can't see it respond in real time. The other scenario is at the opposite end of the spectrum: playing first-person shooters against other people. There I tend to tone the graphics options down to maximize my FPS, because even if I see little or no visual difference between 30 FPS and 90 FPS or higher, there is already an inherent latency in what the game renders, due to the network/internet delay and the time it takes for my input to travel back to the server that calculates what I hit or miss. The higher the FPS, the better the chance of seeing and responding to something before I get killed. A slower FPS just adds more time between refresh cycles, so there is a wider slice of time in which something can happen before it even starts to show on my own client, and that adds to the network latency already involved.

From personal experience in first-person shooters on the PC, when I only have 30 FPS I do much worse in the same game than when I have 60 FPS or better. For single-player games it doesn't really make much difference from what I have seen. Because of internet and network lag and other sources of latency, including differing refresh rates, many first-person shooters use interpolation or other techniques to try to even out the difficulty for all players, but having a higher FPS always gives something of an edge, even if it's only a matter of milliseconds.
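A rough sketch of that "edge in milliseconds" (the 50 ms network delay is just a number I picked for illustration, not anything from this thread): on average an update waits about half a frame interval before it can even be drawn, on top of whatever the network already adds.

# Extra average wait before an update can appear on screen, on top of network delay.
network_delay_ms = 50.0                    # illustrative one-way delay, not a real measurement
for fps in (30, 60):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} fps: ~{network_delay_ms + frame_time_ms / 2:.1f} ms until it can show up")

# ~66.7 ms at 30 fps vs ~58.3 ms at 60 fps: roughly an 8 ms edge from frame rate alone.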

In some games there has been a more direct correlation (though bad design, IMO): regeneration rates, resource collection rates, or some other timed element of the game has been programmed to update based on one's FPS, so a higher FPS gives an advantage. The converse is also true: some games are written to give an advantage to the player with the slower system in order to level the playing field, but as with anything of that nature, players can learn to exploit these mechanisms and turn them into an unfair advantage.
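That "regeneration tied to FPS" problem is the classic frame-rate-dependent update. A hypothetical sketch of the difference (the function names and numbers are mine, not from any particular game):

# Bad: regenerate a fixed amount every rendered frame, so more fps means faster healing.
def regen_per_frame(health, per_frame=1.0):
    return health + per_frame

# Better: scale by how long the frame actually took, so healing per real second is the same.
def regen_per_second(health, dt_seconds, per_second=60.0):
    return health + per_second * dt_seconds

h60 = h30 = 0.0
for _ in range(60):            # player rendering 60 frames in one second
    h60 = regen_per_frame(h60)
for _ in range(30):            # player rendering 30 frames in the same second
    h30 = regen_per_frame(h30)
print(h60, h30)                # 60.0 vs 30.0: the high-fps player heals twice as fast

d60 = d30 = 0.0
for _ in range(60):
    d60 = regen_per_second(d60, 1 / 60)
for _ in range(30):
    d30 = regen_per_second(d30, 1 / 30)
print(d60, d30)                # both roughly 60.0 (up to float rounding), regardless of fps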
Punished_Snake: I can't decide. I mean, I have no reason to buy a PS4 now (there aren't any games I'm interested in), but I wanna play the best version of Ground Zeroes, and sooner or later I will grab a PS4 anyway (Uncharted 4 and Kingdom Hearts 3 are the reasons, for now).
If you really wanna play the best version, why not wait until Kojima releases a PC version? Since they released MG Rising on PC, I bet they'll do it for MGS V too (I also remember Kojima saying that). With the muscle of the modding community, I'm pretty sure the PC version is gonna be the most beautiful and amazing version of all (well, as long as you have a beast PC).
The difference is the feeling.
Only very few people can "see" the difference between 30 and 60 fps, because to the eyes there is no difference, but your brain tells you that the latter is smoother.
The real catch is how the programmers achieve the extra frames. With modern hardware there should be no problems, but there are still games with pop-up and reduced visibility.
A decision based only on fps is not the best.
DProject: As you probably know, FPS stands for Frames Per Second, i.e. how many still images you see per second. Movies usually have an FPS of 23.976, 25, or 29.97.

As for your question, what's the difference? None, in my opinion. I can play games just fine with the FPS "only" being 30. The human eye can't even react fast enough to see all 60 frames per second. Apparently, the higher the FPS the more realistic everything looks (or something): for racing games and first-person shooters people appreciate a higher FPS. But I think they're just full of shit. To me, it feels like the "Bit Wars" of the 90s, when people were comparing the number of bits without even knowing what they meant. Nowadays, if a game is "only 30 FPS", it's crap or whatever.
TOTALLY agree with you. The only use I can see for higher FPS is in cinema, for things such as fluid slow motion.
I play mostly first-person shooters, and I can't tell if it is above 25. And I'm not just saying 25 because it is a standard film rate. I honestly just can't tell a difference until it gets closer to 20.
If I'm getting a steady 60 in a house and I go outdoors and it drops to 25, I don't notice a thing. Then again, my eyesight is total shit.
Fever_Discordia: Films are only 24 fps, and The Hobbit (the new, second one) experimented with some screenings shown at 48 fps, and audiences thought it looked weird.
http://www.hollywoodreporter.com/news/hobbit-desolation-smaug-48-frames-655444

So 30 fps should be plenty, really!
Well, 30 fps makes things look "cinematic", whereas TV uses a higher frame rate. A lot of the criticism I heard of higher frame rates in movies was that they made films look more like a TV soap. People are used to movies having a kind of blurry look, and for the most part they hardly mind the judder and strobing. People are also just not used to movies looking more real and less movie-like.
Kristian already posted it but I'll post it again: http://www.100fps.com

There's a lot of misinformation out there regarding the limits of how many frames humans can see and whatnot. As with everything else, avoid sources where people say "I don't really see a difference between X and Y" or "my experience is...", and read articles that take a scientific approach to the subject.