Darvond: The problem is that consoles still exist despite us living in a world where consoles are totally pointless, being closed-garden PCs at best, or a low-quality toy in the case of most Nintendo products these days.
Yes, it is definitely a problem, both for video game developers and for consumers. I'm not stating that there is no problem; rather, there very much is one, and my premise is that 30FPS isn't a good solution to it. It may turn out to be the solution a developer chooses, or even the only one possible without a massive undertaking, but a lack of options doesn't make it a good one.
What I wonder is what happens in 1/2/5 years as newer and newer games come out that further push what is possible on the PC platform, both with and without VR. As games use more and more CPU and GPU power, and likely FPGA power in years to come, they'll become ever hungrier for resources. If older consoles are any indication, they tend to come out and then last a very long time in the marketplace, kind of like Windows XP in longevity, but as a static hardware platform. It's reasonably safe to assume that current and future generations of consoles will do the same, for the same reason: consumers want to use a platform for a long time and still be able to get new games for it.
If we look at current high-end games and how they're adapted to consoles, they are already a tight fit. The Witcher 3, for example, ran at 1920x900 instead of 1920x1080 at launch (and still does, as far as I'm aware) due to hardware limitations of the Xbox One platform. Even if some combination of Microsoft optimizations to the platform and further optimizations to the game were to allow it to bump up to 1080p, the game would still only run at 30FPS, from my recollection.
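Just to put rough numbers on what that 900-line compromise buys (a back-of-envelope sketch only, assuming per-frame GPU cost scales roughly with pixel count, which ignores geometry and shader complexity):

```python
full  = 1920 * 1080   # 2,073,600 pixels per frame at full 1080p
boxed = 1920 * 900    # 1,728,000 pixels at the reported 1920x900

print(f"{1 - boxed / full:.1%} fewer pixels to shade per frame")  # 16.7%
```

A roughly 17% cut in fill work per frame is significant when the hardware is already maxed out.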
How will a high-end game that comes out at, say, the end of 2016, or sometime in 2017 or 2018, one that really pushes the envelope, cope with the same Xbox One or PS4 hardware? We know consoles are very popular and that a large portion of developers will target them, either as a primary platform or as a port, because there's money in it. But how are they going to get those games to even run? They can't make computing resources appear out of thin air, so they really have no choice but to either reduce the complexity and resource consumption of their engine enough to run on the lesser console hardware, and thus dumb down the PC version as well if they want a single code base, or maintain two variants of their games, which some do. Smaller studios are likely to keep a single code base, since they have fewer developers to maintain it. As time goes on, the problem simply becomes a wider and wider divergence between the cutting-edge PC version of such a powerhouse game and its console equivalents.
So dropping the frame rate does make business sense, because it is probably the cheapest possible change to make in terms of coding. That it makes games less playable and less enjoyable is an unintended side effect, of course, but it's still the end result.
But if a hot new 2017 title comes out that is very demanding, and they don't want to dumb down the graphics for consoles, what then? Do they drop the frame rate to 20FPS, 15FPS, maybe 10FPS? That's not really a question, of course, because IMHO doing so would be a terrible thing to even consider, and I don't think the majority of gamers would put up with it at all. They can only drop the frame rate so far before more people are actively and vociferously complaining than not. I think 30FPS is that low point, and anything below it would be game suicide, personally. But then, if they don't drop the frame rate on these newer games as they come out, how do they make them run on those consoles at all? The only other low-cost option is to reduce the on-screen pixel count: drop the resolution, cull lines from the game and effectively letterbox it, or perhaps even remove every second/third/fourth line for a retro "scanline" effect. None of these is really acceptable to me personally as a gamer, but then I don't play consoles, so I'm not the person they'd have to contend with no matter what they do to solve the problem either. :)
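For concreteness, each candidate frame rate translates directly into a per-frame time budget the engine has to fit inside; this is just the straightforward arithmetic, not tied to any particular engine:

```python
# Per-frame time budget at each candidate frame-rate cap.
for fps in (60, 30, 20, 15, 10):
    print(f"{fps:2d} FPS -> {1000 / fps:5.1f} ms per frame")
# 60 -> 16.7 ms, 30 -> 33.3 ms, 20 -> 50.0 ms, 15 -> 66.7 ms, 10 -> 100.0 ms
```

Halving the frame rate doubles the time budget, which is exactly why it's such a tempting lever: no other single change buys that much headroom that cheaply.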
I've no problem with a game running at a low frame rate due to insufficient hardware, so long as the same game is technically capable of running at a higher frame rate on properly capable hardware. For example, I ended up playing The Witcher 3 mostly at 25-30FPS. That wasn't great, but it wasn't as bad an experience as I've had with most low-frame-rate games in the past, thanks to the motion-blur effect tricking the brain. I'm not upset about the game or CDPR, because I know that if I upgrade my hardware down the line I can get it to run at 60FPS no problem; the game isn't artificially limited.
What annoys me, though, is a game purposefully designed to run at a maximum frame rate of 30FPS or lower, with no option for anything higher on any hardware at all. That's just terrible and/or lazy design IMHO, and I'd be upset to find a game that did it, rightfully so I think. The only difference on consoles is that they are a fixed hardware platform, so there is understandably no incentive to make the game scale across different hardware. On consoles the developer must compromise on features in such cases to make the game run at all, whereas on a PC/Mac/Linux platform the developer can choose a minimum hardware configuration and have the game scale up or down to whatever one has above that.
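For what it's worth, avoiding a baked-in cap isn't exotic: the standard pattern is to decouple simulation from rendering, running game logic on a fixed timestep while the renderer goes as fast as the hardware allows. Here's a minimal sketch of that pattern; the poll_input/simulate/render names are placeholders of my own, not any real engine's API:

```python
import time

TICK = 1.0 / 60.0           # fixed simulation timestep: 60 updates/sec

def poll_input():  pass     # placeholder: read controller/keyboard state
def simulate(dt):  pass     # placeholder: advance game logic by dt seconds
def render(alpha): pass     # placeholder: draw, interpolating by alpha

def run(frames=600):
    previous = time.perf_counter()
    lag = 0.0
    for _ in range(frames):             # stand-in for "while running"
        now = time.perf_counter()
        lag += now - previous
        previous = now
        poll_input()
        while lag >= TICK:              # catch up in fixed steps, so slow
            simulate(TICK)              # hardware never changes game speed
            lag -= TICK
        render(lag / TICK)              # render once per loop iteration,
                                        # as fast as the hardware allows

run()
```

With a loop like this, the same build runs at 30FPS on weak hardware and 60+ on strong hardware without the game logic ever noticing, which is exactly the scaling a hard 30FPS cap throws away.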
To be honest, though, I think this problem is going to solve itself in the next few years, because I personally think VR is going to be a big thing in the marketplace, and VR demands both high resolution and high frame rates, so it will dictate what both computers and consoles need under the hood to play these games. If VR proves as popular as I think it will, console makers will put out much more powerful consoles and will have to update them more often to keep up with current game demands, and consumers will have to upgrade their console platforms if they want to ride the bleeding edge of gaming. I'm sure some game devs will put out updated or remade games like "Dr. Mario" so the less capable consoles still have something to do, though. :)