Like many PC gamers, I've been following the announcement and release of Nvidia's new generation of graphics processors, including the debates about inflated prices and the speculation regarding real-time ray tracing and other new technologies. I also recently started watching a video playthrough of the PS4-exclusive game "Detroit: Become Human", and it got me thinking.

Frankly, the fidelity and realism of that game's visuals impressed me more than any PC game I've seen, including yet-to-be-released games such as Cyberpunk 2077 (talking strictly about the graphics, mind you). The whole game pretty much looks like a pre-rendered cutscene, and it's all done in real time on the limited hardware of the PS4, which at best has roughly the processing performance of a mid-range PC with a GTX 960 or GTX 1050 Ti.

It made me realize that PC gamers don't need fancy new $1000+ GPUs. Even real-time ray tracing seems like a gimmick, in its current form. What we really need are game developers with the resources, experience, and vision to create truly remarkable games which are properly optimized for the PC platform. I guess part of the problem is that most AAA projects these days are primarily developed for consoles, and the PC version is merely an afterthought. But is it really that hard to optimize games for PC? Supposedly the current gen of consoles is closer to the PC platform than ever. Excuse me for putting on my tinfoil hat for a moment, but it would almost seem like messy PC optimization is being used to sell more overpriced hardware.

The general excuse is that it's harder to optimize games on PC because we lack a unified, standard hardware configuration, unlike consoles, where each device has exactly the same hardware. But is this really such a big problem? I can understand the difficulties of ports for multiple operating systems, but on the hardware side, it's not like there are drastic differences in the way it operates, is there? I mean, if a game is optimized for a GTX 1060 and a Core i5, it should only perform better on a GTX 1080 and a Core i7.

Do you know any games which actually utilize PC hardware in efficient ways? The DOOM reboot apparently has a very good and efficient graphics engine, especially via Vulkan. Resident Evil 7 and Metal Gear Solid V are also known for high visual fidelity and good performance. I think all of these use custom in-house engines instead of popular general-purpose engines such as Unity or Unreal Engine. Maybe that's part of the secret? Engines which are specifically created for a certain type of game, rather than bloated jack-of-all-trades engines.

Feel free to share any thoughts of your own on these topics.
I'm not playing any recent AAA or other demanding titles, so I'll keep it short and just repeat something I heard relatively recently in a YouTube video from one of the tech channels.

Games developed in partnership with AMD are usually better optimized than those made in partnership with Nvidia. So maybe keep this aspect in mind too and see if it's true.
CharlesGrey:
You are absolutely right about RTX, but Nvidia has to offer something to please the PC master race and Bitcoin miners with money burning in their pockets.

It's progress man, just buy it!
The major console developers usually make the most optimized PC games.

Bethesda = DOOM, Wolfenstein, Fallout
Capcom = Resident Evil, Devil May Cry
Konami = Metal Gear Solid, Castlevania

Smaller studios and indie developers who try to make demanding 3D games usually have very poor optimization. That's what I have noticed as a PC gamer.
CharlesGrey: It made me realize that PC gamers don't need fancy new $1000+ GPUs.
... unless they want to show off. Because if you want to show off, just about any decent piece of hardware won't do; it has to be shiny, brand new, and sport all the latest bells and whistles :P.
Post edited September 26, 2018 by WinterSnowfall
Strijkbout: It's progress man, just buy it!
:-D

Must buy must buy must buy... <walks in trance towards the electronics shop>

Hey, wake me up! My current graphics card is already about as good as I can use with my monitor and computer. No need for a new one, unless I want one that draws less electricity.
CharlesGrey: The general excuse is that it's harder to optimize games on PC because we lack a unified, standard hardware configuration, unlike consoles, where each device has exactly the same hardware. But is this really such a big problem?
Yes. Yes it is. I would have thought the same as you probably less than a year ago, but I'm just settling into year two as a game dev, and wow, you learn a lot about what goes on behind the scenes. Essentially, every single piece of hardware is going to do things differently and will need different optimizations.
On some cards, a certain OpenGL function will have been optimized to the moon and back, so it's faster to just call it; on other cards, you'll want to code the math yourself. There are also other things to take into consideration, like available memory: the more you know about how a particular machine's memory is set up, the more you'll be able to optimize. You also mention Vulkan, which is another good point, as code written for DirectX, OpenGL, or Vulkan will all need to be optimized differently (and will probably run differently depending on the graphics card). I'll put a toy sketch of the OpenGL point below.
So ya, it's definitely not surprising that Devs can get much better optimization out of console hardware than PC stuff.
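
To make that OpenGL point concrete, here's a minimal sketch of what runtime path selection can look like. To be clear, the vendor check, the two paths, and the idea that one driver's built-in routine beats a hand-rolled one are made-up assumptions for illustration; a real engine would pick its paths based on actual profiling data.

    // Toy example: pick between a driver-provided routine and our own math,
    // depending on which GPU driver we're actually running on.
    // Assumes a valid OpenGL context has already been created.
    #include <GL/gl.h>
    #include <cstring>

    enum class MatrixPath { DriverBuiltin, HandRolled };

    MatrixPath pickMatrixPath() {
        // glGetString(GL_VENDOR) returns the driver vendor string,
        // e.g. "NVIDIA Corporation".
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        // Hypothetical profiling result: on this vendor's driver the
        // built-in routine is heavily optimized, so calling it wins.
        if (vendor && std::strstr(vendor, "NVIDIA") != nullptr) {
            return MatrixPath::DriverBuiltin;
        }
        // Everywhere else, fall back to doing the math ourselves.
        return MatrixPath::HandRolled;
    }

Now imagine a branch like that for every driver quirk, every API (DirectX, OpenGL, Vulkan), and every memory configuration, and you get an idea of where the optimization budget goes.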
Strijkbout: You are absolutely right about RTX, but Nvidia has to offer something to please the PC master race and Bitcoin miners with money burning in their pockets.

It's progress man, just buy it!
I have no issue with people buying expensive items if they have the money for it. Whatever, right? But the reasoning in that article makes me laugh: "Better spend big money now because, you know, there might be more games in the possibly somewhat near future that use this technology!!!"
CharlesGrey:
Strijkbout: You are absolutely right about RTX, but Nvidia has to offer something to please the PC master race and Bitcoin miners with money burning in their pockets.

It's progress man, just buy it!
"If you can possibly afford one of the RTX cards -- even if it's not the most expensive model -- there's plenty of reasons to pull the trigger, either right now or after reading independent reviews. The time you spend waiting and complaining about it being overpriced is time you could be gaming with the most realistic user experience available."

Sounds like totally reasonable and unbiased advice! I think I'll go and buy three.
Heretic777: The major console developers usually make the most optimized PC games.

Bethesda = DOOM, Wolfenstein, Fallout
Capcom = Resident Evil, Devil May Cry
Konami = Metal Gear Solid, Castlevania

Smaller studios and indie developers who try to make demanding 3D games usually have very poor optimization. That's what I have noticed as a PC gamer.
I don't know, it seems to be hit or miss. I remember Ubisoft had some badly optimized games on PC, and Capcom also had crappy ports in the past. Same for Square Enix.

Also, I heard Capcom's Monster Hunter World has issues on PC.
CharlesGrey: It made me realize that PC gamers don't need fancy new $1000+ GPUs.
WinterSnowfall: ... unless they want to show off. Because if you want to show off, just about any decent piece of hardware won't do; it has to be shiny, brand new, and sport all the latest bells and whistles :P.
I think there are legitimate usage scenarios for these new GPUs, other than bragging rights. But yeah, for the average PC gamer I don't see much value in this generation of fancy-pants RTX hardware. Maybe next gen, or whatever AMD comes up with.
Post edited September 26, 2018 by CharlesGrey
The last time graphics got me excited must have been around 2006. I'm not saying things haven't improved since then but those improvements mean very little in terms of overall enjoyment.
CharlesGrey: The general excuse is that it's harder to optimize games on PC because we lack a unified, standard hardware configuration, unlike consoles, where each device has exactly the same hardware. But is this really such a big problem?
Leonard03: Yes. Yes it is. I would have thought the same as you probably less than a year ago, but I'm just settling into year two as a game dev, and wow, you learn a lot about what goes on behind the scenes. Essentially, every single piece of hardware is going to do things differently and will need different optimizations.
On some cards, a certain OpenGL function will have been optimized to the moon and back, so it's faster to just call it; on other cards, you'll want to code the math yourself. There are also other things to take into consideration, like available memory: the more you know about how a particular machine's memory is set up, the more you'll be able to optimize. You also mention Vulkan, which is another good point, as code written for DirectX, OpenGL, or Vulkan will all need to be optimized differently (and will probably run differently depending on the graphics card).
So ya, it's definitely not surprising that Devs can get much better optimization out of console hardware than PC stuff.
So it's really that much of a mess? Is it just a matter of how much a dev team can spend on optimization, in terms of money and workforce? And I know that older hardware, such as previous generations of GPUs, can be wildly different from modern hardware, but there shouldn't be much difference between cards from the same generation, should there? I mean, most games only support a limited range of hardware anyhow, and as I mentioned in my earlier example, there shouldn't be much of a difference between a GTX 1060, 1070, and 1080, right? If a game is optimized for a GTX 1060, then the more powerful GPUs from that generation should "automatically" be able to handle it, or does it take further fine-tuning?

By the way, is that Geralt in your icon or another character?
Post edited September 26, 2018 by CharlesGrey
Strijkbout: It's progress man, just buy it!
"there's value in being an early adopter. And there's a cost to either delaying your purchase or getting an older-generation product so you can save money."

Sure there is -- if you're an Nvidia shill. lol
I use Linux, so Nvidia is already smacked off the table and into a fiery pit.

One of the newest big games I have is Civ IV.

Hopefully brute force & ignorant programming gets stopped by some bottleneck soon.
hmcpretender: The last time graphics got me excited must have been around 2006. I'm not saying things haven't improved since then but those improvements mean very little in terms of overall enjoyment.
Hm, yes, I'm not as easily impressed by the visual quality of games as I used to be. Maybe because the jumps in quality are no longer as significant, compared to older game generations.

I do think graphics quality and art direction are a major factor for the impact a game has on you, but it depends on the genre/type of game.
Darvond: I use Linux, so Nvidia is already smacked off the table and into a fiery pit.

One of the newest big games I have is Civ IV.

Hopefully brute force & ignorant programming gets stopped by some bottleneck soon.
Is Nvidia and Linux a bad combo? I plan to use Linux on my next computer, but I'll probably keep using Windows for games anyway, at least for the near future.

And that's an interesting point: If hardware development gets to a point where it stagnates for technical reasons, I guess Devs would be forced to use proper optimization and clever coding tricks again, like back in the good old days. :P
Post edited September 26, 2018 by CharlesGrey
CharlesGrey: Hm, yes, I'm not as easily impressed by the visual quality of games as I used to be. Maybe because the jumps in quality are no longer as significant, compared to older game generations.

I do think graphics quality and art direction are a major factor for the impact a game has on you, but it depends on the genre/type of game.
Is Nvidia and Linux a bad combo? I plan to use Linux on my next computer, but I'll probably keep using Windows for games anyway, at least for the near future.

And that's an interesting point: If hardware development gets to a point where it stagnates for technical reasons, I guess Devs would be forced to use proper optimization and clever coding tricks again, like back in the good old days. :P
To say that Nvidia and Linux have a complicated relationship would be understating it, as seen in that interview with Linus Torvalds.

Basically, Nvidia refuses to play ball with the open-source community, meaning your graphics card is absolutely gimped unless you use their proprietary driver (something which many don't like). And since Nvidia isn't playing ball, it's a very bad choice for many desktop environments, and (currently) completely incompatible with Wayland, the display server protocol set to replace X.
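
As a small aside, if you're ever curious which of the two driver worlds a Linux box is in: the proprietary NVIDIA driver exposes a version file under /proc, while the open-source nouveau driver does not. Here's a minimal sketch; the /proc path is the one the proprietary driver has historically provided, but treat this as a rough heuristic rather than an official API.

    // Rough heuristic: detect whether the proprietary NVIDIA driver is loaded.
    // The proprietary driver creates /proc/driver/nvidia/version;
    // the open-source nouveau driver does not.
    #include <fstream>
    #include <iostream>
    #include <string>

    int main() {
        std::ifstream f("/proc/driver/nvidia/version");
        if (f) {
            std::string line;
            std::getline(f, line);
            std::cout << "Proprietary NVIDIA driver loaded: " << line << '\n';
        } else {
            std::cout << "No proprietary NVIDIA driver detected "
                         "(nouveau, AMD, Intel, or none).\n";
        }
        return 0;
    }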