jamyskis: It's a safe bet that this generation is going to last a while. The leap in visual quality hasn't been anywhere near as pronounced as in previous generations, and I have a feeling that if a new Sony or Microsoft console comes any sooner than 10 years from now, upgrade fatigue is going to set in.
The biggest leap forward has been in increasing the resolution and thus the detail. That might not sound like much, but it's the reason these games now take up a Blu-ray, whereas the same game would have fit on a single DVD last gen.

True enough, if you're playing these games on a small screen then the difference is hardly going to register. But then who's playing a PS4 or Xbone game on a small screen?

You could therefore argue that the upgrade is a necessary one, dictated not by how much better the PC was than the last-gen consoles at the end of their shelf life, but by the ever-increasing demand for bigger and more detailed TVs. Now there are 4K screens, and already we're faced with a situation where neither of the current-generation consoles can drive a current-gen TV's native resolution.

In fact, the only reason to own a 4K screen right now is if you've got a suitably powerful PC. So already we've got the beginnings of a situation where the current-gen consoles are lagging behind in precisely the way that necessitated the demise of the previous generation.

How long it's going to take Sony and Microsoft to upgrade is almost impossible to predict right now. You may well be right about it being a decade after the release of the current gen. But I wouldn't be surprised if someone else tried to take advantage of what could, at some point between now and then, become a recognised deficiency.
That really strikes me as the kind of question a guy with two Titan X cards would ask, especially this early in the console cycle.

I got a pretty good PC a couple of years ago and it still runs games quite well, but with brand-new superfluous features constantly being pushed, new drivers meant to intentionally underpower older cards, and shoddy ports, I don't know how long this PC will last.

This whole discussion feels very short-sighted. Sure, if you do have your Titan Xs I bet you want the most out of them, but don't act like you speak for everyone.
Navagon: <snip>
Well, the majority of people own 42" TVs at most, and when I refer to a technological leap, I mean that this has really been the first generation whose games the previous generation of consoles could easily have supported. Of course we have some extremely sloppy ports to last-gen consoles (Shadow of Mordor) and some that have actually made use of the increases in memory and CPU performance (Assassin's Creed: Unity), but for the most part, games on the PS4 and Xbox One could easily be implemented on the last-gen consoles with a drop in resolution, effects density and texture resolution, and slight frame-rate penalties.

It's not like the PS2>PS3 transition, where PS2 versions were sometimes markedly different from their PS3 counterparts, and where there were very few cross-gen titles. Not only have there been significantly more cross-gen titles this time, but the differences between the versions usually range from minimal to barely noticeable (even in late-in-the-day cross-gen titles like Tales of Zestiria).
Post edited November 01, 2015 by jamyskis
I don't buy many games today because of the way console versions drag PC games down. I'm not just talking about graphics but all aspects of PC gaming. As PCs grow ever more powerful, consoles hold us back, and AAA brands do what is cheapest: make multi-platform games, a.k.a. console ports.

I'll use The Witcher 3 as an example. When I first saw how CD Projekt was developing it, I was amazed at how the game looked, and the ideas they said they would put on the table and offer us PC gamers were amazing. The final product wasn't exactly what was promised; instead of a PC version we got a downgraded game, so that consoles could look as good as PCs.

I remember when the first early shots of Watch Dogs showed some impressive graphics and I said cool. I wasn't really interested in that game so I never bought it, but I read all about the fallout from it. Then there was the Aliens: Colonial Marines flop. And it goes on and on.

All the leading game brands are screwing PC gamers. The last EA game I bought was almost 10 years ago. THQ was ages ago. Deep Silver, 6 years. The last Bethesda game was Oblivion, which was another game downgraded from the original concept that we PC gamers were expecting from them.

I admit there are times I fall for it. Like The Witcher 3. I believed in the product and didn't hear about the rumors of downgrading until a few weeks before its release. My worry wasn't just about the graphics but all of it: how the AI would react, how the controls would function, how smooth the open world would be, how mature the story would be compared to its predecessors. When it was released, the game was so broken, so shit. I struggled through the game's story, which to my surprise was missing content. All the events in the first two games had no impact in W3, meaning my adventure was a waste of time. At least for Mass Effect 3, they had the courtesy to screw gamers at the end of the game.

I don't buy new games anymore. I'll wait a long, long time, until all the patches and DLCs are included in a bargain bin. Then I may consider purchasing it. But it has to be on GOG, without DRM. :P

I don't hate consoles; I've just never liked them. I've always found games on those platforms lesser than on PC. I like the freedom I have on my PC and what I can do to games. On consoles, trying to tweak, mod or reverse engineer games can lead to issues.

I do want PC games to advance as technology grows. The point of PC gaming is breaking boundaries on hardware and software.

But I get it. These guys will sacrifice anything just to get a few more dollars into their pockets. ;)
I understand the complaints, but it also seems that with PCs pushing beyond 1080p, the extra pixels alone would necessitate a slowdown in other visual advances.

Plus, I'm very happy that I was able to play through BioShock Infinite with a crappy GPU, even with mostly high/ultra settings. It helps that I am tolerant of low FPS.
I don't think consoles are a significant bottleneck for this generation. That is because as technology and techniques improve, it becomes increasingly difficult to advance further. You can wring only so much power out of silicon, and I expect that it would take something like optical computing before we can have a major jump in processing ability.

That is why research and development in areas that haven't received much investment would yield much more than further work on graphics. Unfortunately, areas such as physics and artificial intelligence lack the simple "wow!" factor that visuals offer. Dwarf Fortress is an incredible game in how it simulates a world, but you wouldn't be able to communicate that within the confines of a screenshot.
PookaMustard: What they should be complaining about is that graphics are holding back gameplay. Anyone with me?
Absolutely. After all, the vast majority of a game's budget goes to the developers' salaries, and the graphics in modern AAA games take a lot of work. Just imagine if they spent that development time and manpower on making the game play better instead of look better?
SirPrimalform: Just imagine if they spent that development time and manpower on making the game play better instead of look better?
You'd get Thomas Was Alone... :P
The people who said that a game doesn't need great graphics to be a great game deserve a +rep. Although their statements may be beside the point in this thread, since we're discussing something else, people still need to remember that graphics may have a limit. When that limit is finally reached, will we then realize that games don't need good graphics to be considered great?

In my opinion, I'm quite okay with the current pace of advancement of PC games. I say PC games instead of just PCs because I think the problem comes from the developers of the games themselves, since, like you said, the devs just port the console versions over to PC with standard upgrades. If this were to continue, it would prevent us from knowing the full capability of the most powerful graphics cards on the market right now.

Increasing the pace of advancement of PC games, on the other hand, would push technology companies such as NVIDIA to invent new graphics cards that outperform the previous versions. This would lead to people with low- to medium-end PCs and laptops being outclassed very quickly. That previous statement may be biased, because I myself am one of those people with a medium-end laptop. Yes, a laptop.

So the answer to your question of whether I want PCs to advance at a faster rate is no. As for the other question you asked, "Do you think consoles hold back graphics to a degree?", I have to say that from my perspective the difference between console and PC games is still small. This means I can't answer that question at the moment, at least not until the difference goes beyond what you've mentioned in your post: detailed textures, resolution and FPS.

Sorry if you don't get what you're looking for, or can't seem to see the relation between my post and your thread. This is just my opinion though.
Post edited November 01, 2015 by Abovet
Emachine9643: So what does GOG stand for now?
TARFU: If you ask me, I will always say "Good Old Games". :)
Btw, is that what GOG actually stands for?
TARFU: If you ask me, I will always say "Good Old Games". :)
Abovet: Btw, is that what GOG actually stands for?
When I first came to this site, it was indeed "Good Old Games".

I don't know the reason for reducing it to "GOG".

Maybe Judas could chime in here and give more info?
jamyskis: It's not like the PS2>PS3 transition...
I'd disagree with that. PS2 was designed with a small, blurry low res CRT TV in mind. PS3 was designed around a (roughly) 32" LCD screen. A massive step up. Even with all the TV enhancements since then we haven't seen that kind of a leap.

I'd disagree with that. The PS2 was designed with a small, blurry, low-res CRT TV in mind. The PS3 was designed around a (roughly) 32" LCD screen. A massive step up. Even with all the TV enhancements since then, we haven't seen that kind of leap.

The question is: would a PS3 title look as good as a PS2 title does on a crappy, small old CRT screen, if that PS3 title were reduced to a point where it would run on a PS2?

I think it probably would.
Abovet: Btw, is that what GOG actually stands for?
TARFU: When I first came to this site, it was indeed "Good Old Games".

I don't know the reason for reducing it to "GOG".

Maybe Judas could chime in here and give more info?
Ease of branding, plain and simple: #gog, gog.com, GOG. The shortening was a necessity, and it flows when spoken; the games are GOGs. It's become a noun, and rightfully so.

And yes, it was "Good Old Games": see one of the original .PNG files attached below.
Attachments:
TARFU: When I first came to this site, it was indeed "Good Old Games".

I don't know the reason for reducing it to "GOG".

Maybe Judas could chime in here and give more info?
I would have thought it obvious - they don't want to limit themselves to games that are old (or necessarily good, although that one was questionable even when they did use "Good Old Games").
TARFU: When I first came to this site, it was indeed "Good Old Games".

I don't know the reason for reducing it to "GOG".

Maybe Judas could chime in here and give more info?
SirPrimalform: I would have thought it obvious - they don't want to limit themselves to games that are old (or necessarily good, although that one was questionable even when they did use "Good Old Games").
"good" and "old" could be used in a slang form such as "We had a good old time at the carnival".

In that context, it doesn't mean that the time spent at the carnival was decades ago.

Or, as in the case of The Dukes of Hazzard theme song for example, "good old boys". The Duke boys are certainly not old.

Personally, I like "Good Old Games".