The latest batch of UE5 games has received criticism for poor performance, especially those with large worlds. UE5 is now pretty much labelled a bad engine by the gaming community, but developers seem to welcome it, including CDPR, who will develop The Witcher IV with UE5 and likely Cyberpunk 2 as well.

But when your target audience needs a new $1000 system for a good experience, then maybe it's worth considering an alternative. I guess you could also argue that PCs should be cheaper than they currently are, so that a midrange PC is more accessible.

Either that, or developers are not as good at optimisation as they used to be. Some of my favourites still look and run amazingly, and their file sizes are tiny compared to modern titles.
botan9386: But when your target audience needs a new $1000 system for a good experience, then maybe it's worth considering an alternative. I guess you could also argue that PCs should be cheaper than they currently are, so that a midrange PC is more accessible.
In my opinion, the price is not even a concern; it is the energy consumption of new GPUs. It is beyond ridiculous that it is now regarded as the norm for dedicated graphics cards to have a TDP of over 200 watts... some, as high as 450 watts (uncertain of true consumption during full utilization). A few years prior, I had made the decision to limit myself to only purchasing relatively efficient APUs (Accelerated Processing Units), which, fortunately, are now able to compete with (or, at the very least, match the performance of) some decent dedicated graphics cards.
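To put rough numbers on the energy angle, here is a back-of-the-envelope sketch; the hours played and the electricity price are assumptions, not measurements:

```python
# Back-of-the-envelope yearly energy cost (all figures are assumptions).
HOURS_PER_DAY = 2      # assumed gaming time
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.30   # assumed electricity price

def yearly_cost(watts: float) -> float:
    """Yearly electricity cost of a part drawing `watts` while gaming."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_PER_KWH

for name, watts in [("450 W dGPU", 450), ("200 W dGPU", 200), ("65 W APU", 65)]:
    print(f"{name}: ~{yearly_cost(watts):.0f} per year")
```

Even under these modest assumptions, the 450-watt card costs roughly seven times as much to run per year as an efficient APU.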
botan9386: Either that, or developers are not as good at optimisation as they used to be.
Unfortunately, in general, optimization is no longer viewed as a top priority (increasingly, it is not even considered). This is not simply an issue with game development; it is an ever-present problem in other areas of programming. Developers are often not of the same caliber as those of past decades.

This decline can stem from blatant laziness, inexperience, unawareness, unrealistically tight deadlines, or simply the sense that optimal function of the (software) product is no longer fundamentally necessary (as consumers are expected to continue purchasing energy-inefficient dedicated graphics cards).

Edit: elaborated on a few thoughts.
Post edited 20 hours ago by Palestine
I wish I could answer that question! I gave Immortals of Aveum a try myself. It looked pretty okay. And of course we all know what happened to Black Myth: Wukong. I dropped The First Descendant over its horrible loading times. And that's what I know about UE5 games.
botan9386: But when your target audience needs a new $1000 system for a good experience, then maybe it's worth considering an alternative. I guess you could also argue that PCs should be cheaper than they currently are, so that a midrange PC is more accessible.
Palestine: In my opinion, the price is not even a concern; it is the energy consumption of new GPUs. It is beyond ridiculous that it is now regarded as the norm for dedicated graphics cards to have a TDP of over 200 watts... some, as high as 450 watts (uncertain of true consumption during full utilization). A few years prior, I had made the decision to limit myself to only purchasing relatively efficient APUs (Accelerated Processing Units), which, fortunately, are now able to compete with some decent dedicated graphics cards.
botan9386: Either that, or developers are not as good at optimisation as they used to be.
Palestine: Unfortunately, in general, optimization is no longer viewed as a top priority (increasingly, it is not even considered). This is not simply an issue with game development; it is an ever-present problem in other areas of programming. Developers are often not of the same caliber as those of past decades.

This decline can stem from blatant laziness, inexperience, unawareness, unrealistically tight deadlines, or simply the sense that this is no longer fundamentally necessary (as consumers are expected to continue purchasing energy-inefficient dedicated graphics cards).
I never managed to get past a total system load of 550W myself, and that's with a 450W GPU.
I also rarely use more than 170 watts when gaming (GPU only, according to Nvidia...).

The power consumption is rising noticeably, though. I can't run Cities 2 under 200W with medium-low settings and the performance DLSS option... 30 fps at the moment. I'd actually gotten used to an average of 140W in previous years, but as you just read, that figure has increased by almost 30W.
Post edited 20 hours ago by P. Zimerickus
My €1000 desktop PC can only manage medium settings for Stalker 2. I got the PC recently, on October 17th, and the system requirements were posted later.
I haven't bought Stalker 2 yet, but I needed to buy a new PC because my laptop stopped working, and it came with Win 11.
Now Unreal 5 is known as a lag engine. I think one developer was mentioned as having done good optimization, but I don't know the game.
Also, the Nvidia GeForce RTX 4060 with an Intel i5 12400F wasn't the best choice. But I wasn't sure getting a 14400F would be good, given the voltage problems and chip failures that Intel's 13th and 14th generations had until a patch came. And electricity in our country was supposed to cost more, so the RTX 4060 was the choice because it uses less wattage.
Post edited 20 hours ago by Fonzer
P. Zimerickus: I never managed to get past a total system load of 550W myself, and that's with a 450W GPU.
I also rarely use more than 170 watts when gaming (GPU only, according to Nvidia...).

The power consumption is rising noticeably, though. I can't run Cities 2 under 200W with medium-low settings and the performance DLSS option... 30 fps at the moment. I'd actually gotten used to an average of 140W in previous years, but as you just read, that figure has increased by almost 30W.
P. Zimerickus, are these wattage figures obtained via software, or from the wall outlet (using something akin to a 'Kill A Watt')?
The latter is far more accurate.
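(For reference, the "according to Nvidia" figures typically come from the card's own sensor, which anyone can poll; a minimal sketch, assuming nvidia-smi is installed and on the PATH:)

```python
import subprocess
import time

def gpu_power_watts() -> float:
    """Read the GPU's self-reported power draw (first GPU only)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

# Sample a few times while a game is running. This is the card's own
# sensor only; a wall meter ("Kill A Watt") also captures the CPU,
# motherboard, drives, and PSU losses.
for _ in range(5):
    print(f"GPU power draw: {gpu_power_watts():.1f} W")
    time.sleep(2)
```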
While I would suggest that optimization is something of a lost art, I think the publishers are more to blame. When sloppy games like the latest two mainline Pokemon titles sell the best in their franchise despite looking and performing worse than most Wii titles, that just sets the expectation that slop is acceptable.
UE5 is a tool. It's up to everyone how they wield it. You can optimize, but it's much easier to just dump 100+ GB of data on people and call it a day. Optimization can get expensive, so many just choose to skip it and blame the engine for not doing it for them.

Going past optimization, Unreal Engine is still one of the main pioneer engines and often gets new features first. It's miles better than something like Unity or pretty much anything else publicly available for large-scale projects.
Post edited 15 hours ago by idbeholdME
botan9386: But when your target audience needs a new $1000 system for a good experience, then maybe it's worth considering an alternative.
Make that at least a $2000 system, and you're getting decent performance out of it.
And yeah, Unreal 5 is fast becoming the new Unity 3D. Part of the problem is that game devs want to put in all the advanced features, which is what mostly tanks performance.
What are the alternatives with similar graphics quality that are available to just about any dev studio?
Apart from the bad optimization of already very demanding engines, there is another problem on the near horizon: games that require RT (yes, require it) and frame generation as "standard"; otherwise they either won't work at all, or even the best GPU may face defeat!

So... it is a very demanding time for GPUs, and anything above lower midrange (above Intel Battlemage) is surely pretty thirsty.

However: 99% of people will be unable to afford a 5090, so no need to worry about its TDP... the most powerful "eligible" card peaks at around 450 W.

I would say... as of now, the only cards that can play 100% of games are the AMD RDNA2 = 6000 series (the one inside a PS5) and the Nvidia 2000 series. Even better to use at least the AMD 7000 series or Nvidia 3000 series. Of course the Intel Arcs work too, but I would get at least an Arc B580 (Battlemage).

The "legendary" 1080 TI is soon "out of order"; with the exception of classic-games, so, nothing we could use for every comparison.

For the most demanding RT = path tracing, the current minimum is probably a 3090/3090 Ti, a 4070 Ti or higher, or a 7900 XT or higher. Intel Battlemage is still not eligible... but good for almost any other RT task. A PS5 Pro may barely work as well, but not without optimizations.

Not much will change with the newest generation from AMD and Nvidia, because AMD's RDNA4 may not be much stronger than a 7900 XTX/3090 Ti; it will simply cost less and consume less power.
The only one offering even better performance is still Nvidia... and it comes at a PRICE, for sure! I mean, stuff like the 5080... not even the 4090... those go for 2500+ coins now... hehehe! 5090? Porsche drivers only!

AMD would be in big trouble, but they have been lucky because:

1. Intel nearly destroyed themselves.
2. AMD's CPUs are the best ever, and without the best CPUs a modern GPU cannot work at its "peak" = Nvidia needs AMD.
3. Nvidia is as greedy as always, so AMD can offer a worse GPU for a better price without much issue.
Post edited 7 hours ago by Xeshra
This YouTube channel explains what's wrong with Unreal Engine 5 and modern graphics.

And many games from the late 2000s still look good today, like Killzone 2 running at native 720p with 2x MSAA on PS3. We didn't need blurry TAA, upscaling, or frame generation to get amazing graphics back then. Now we have games running below 720p on PS5 or, worse, below 480p on Xbox Series S in some Unreal Engine 5 games!
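For a sense of scale, here is the plain arithmetic (using 854x480 as the 480p example):

```python
# Pixel counts behind the resolution comparison.
resolutions = {
    "Killzone 2, native 720p": (1280, 720),
    "480p internal frame":     (854, 480),
    "4K output target":        (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

ratio = (3840 * 2160) / (854 * 480)
print(f"A 4K output has ~{ratio:.0f}x the pixels of a 480p internal frame,")
print("so the upscaler has to invent roughly 95% of what you see.")
```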
Palestine: In my opinion, the price is not even a concern; it is the energy consumption of new GPUs. It is beyond ridiculous that it is now regarded as the norm for dedicated graphics cards to have a TDP of over 200 watts...
It begs the question of whether some of these GPUs are really "better" or just using more power. AMD has shown that they can fit respectable GPU power inside a 65 watt APU. I expect more powerful GPUs to require more power, but beyond 300 watts would be my limit for upper-mid to high-end.
Palestine: This decline can stem from blatant laziness, inexperience, unawareness, unrealistically tight deadlines, or simply the sense that optimal function of the (software) product is no longer fundamentally necessary (as consumers are expected to continue purchasing energy-inefficient dedicated graphics cards).
I definitely sense that last part playing a big role. Upscaling and frame generation have recently been used to mask poor performance, but because GPUs have the technology, it is assumed that we'll want to use those features.
ClassicGamer592: This YouTube channel explains what's wrong with Unreal Engine 5 and modern graphics.

And many games from the late 2000s still look good today, like Killzone 2 running at native 720p with 2x MSAA on PS3. We didn't need blurry TAA, upscaling, or frame generation to get amazing graphics back then. Now we have games running below 720p on PS5 or, worse, below 480p on Xbox Series S in some Unreal Engine 5 games!
I watched one of his videos earlier. Nice channel on optimization.

It is quite disappointing when games that don't look any better than 2000s titles demand so much more. As that guy explained it (I think), UE5 offers a quick workflow, but the features in that workflow aren't properly optimised. So, if you're a developer who either can't fix the issues or has no time to, you throw upscaling and FG at it.

On a side note, low LOD is another technique I'm seeing used more blatantly. Things are popping in literally metres ahead... Come on.
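For anyone unfamiliar, LOD selection in most engines boils down to distance (or screen-size) thresholds; a toy sketch with made-up numbers shows how setting them too tight produces exactly that pop-in a few metres from the camera:

```python
# Toy distance-based LOD selection. The thresholds are invented for
# illustration; real engines usually also factor in screen-space size.
LOD_THRESHOLDS = [      # (max_distance_m, lod_level)
    (10.0, 0),          # full-detail mesh
    (25.0, 1),
    (60.0, 2),
    (float("inf"), 3),  # billboard / lowest detail
]

def select_lod(distance_m: float) -> int:
    for max_dist, lod in LOD_THRESHOLDS:
        if distance_m <= max_dist:
            return lod
    return LOD_THRESHOLDS[-1][1]

# With a 10 m cutoff for LOD0, an object visibly "pops" to a coarser
# mesh the moment the camera backs off a few metres:
for d in (5, 9, 11, 30, 100):
    print(f"{d:>4} m -> LOD{select_lod(d)}")
```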
botan9386: So, if you're a developer who either can't fix the issues or has no time to, you throw upscaling and FG at it.
This is the core of the issue, yes. Time pressure, and/or wanting to cheap out by not hiring high-level developers. It's much easier to hire a person and tell them "check these 2 boxes here and then do this" than to hire someone who can modify the underlying engine code as needed and cook up a customized solution for the project in question. Most importantly, it's much easier and cheaper to find somebody able to work in a widely available engine tailored for large-scale projects than to initiate someone completely from scratch into your in-house engine.

When a developer can start working on the project immediately, rather than going through a month-long initiation into the inner workings of your custom engine, it saves a lot of money. That is also a major part of why a lot of studios are dropping their engines and switching to Unreal: there is pretty much guaranteed to be a steady supply of hirable, ready-to-go devs. Maintaining and updating a custom, studio-specific engine is extremely expensive.

But it's true that UE focuses a lot on ease of use and may be pushing its still basically experimental features as the go-to/definitive solution a little too fast. Nanite and Lumen don't have to be used and may not always be the best solution for a specific scenario. But that's a choice by the developer, not really the fault of the engine. The features were first introduced with the launch of UE5, a mere 2.5 years ago. Look at ray tracing, for example, and how much time passed before it slowly started becoming the norm.
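For example, opting out is a per-project setting; something along these lines in DefaultEngine.ini falls back to the older techniques (the cvar names are from memory, so treat them as assumptions and verify against the current Unreal docs):

```ini
; DefaultEngine.ini -- opting out of the UE5 defaults
[/Script/Engine.RendererSettings]
; 0 = none (use baked lighting), 1 = Lumen
r.DynamicGlobalIlluminationMethod=0
; 1 = Lumen reflections, 2 = screen-space reflections
r.ReflectionMethod=2

[SystemSettings]
; render Nanite meshes via their fallback meshes instead
r.Nanite=0
```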

Lastly, here are some highlights of the latest 5.5 version of the engine.
https://www.youtube.com/watch?v=BcmUZpdChhA
I'd say the core issue is that code optimization has basically been sneered at since the formative years of today's devs, in the programming classes and courses. The goal is to be fast and cheap, and proper optimization is quite the opposite. Code is expected to work in as many situations as possible (both the dev programming different things in different ways, and the programs running on different platforms, hardware, and instances) and, very importantly, to be easy to modify and update, in keeping with today's insane update cycle. Well-optimized code tends to be harder to change, even more so while maintaining the optimization.
That, of course, applies to any programming field; games are just a part of it... and game engines just another...
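A trivial, contrived illustration of that tension (not from any real codebase):

```python
# The clear version works for any size, and its intent is obvious.
def wrap_index_clear(i: int, size: int) -> int:
    return i % size

# The "optimized" version replaces the modulo with a bit mask --
# measurably faster in hot loops, but silently wrong the moment
# `size` stops being a power of two. The optimization has become
# a hidden constraint on the whole design.
def wrap_index_fast(i: int, size: int) -> int:
    return i & (size - 1)
```

Shrink a ring buffer from 1024 to 1000 entries and the clear version keeps working, while the fast one silently breaks; that's "harder to change while maintaining the optimization" in miniature.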

And yes, the bigger problem is the power draw of GPUs. And of top-tier CPUs too, Intel's even more so.
That's why I did something I kept saying I'd never do and switched away from them now that I finally got a new computer, with a Ryzen 8700G.
And yep, I'm quite set on never having a dedicated GPU again. I made do with the integrated graphics of a poor Pentium G3440 for 9.5 years; the 8700G is somewhere in the stratosphere compared to that, and it should remain good enough for my uses for plenty of years to come.
Post edited 15 hours ago by Cavalary