nipsen: ...
But to answer your question - obviously, yes. Been going on for a while.
apehater: interesting, can you list some games?
Practically everything? It's a rare exception when a programmer doesn't assume, beforehand, a convention where scaling to different resolutions must result in decreased image quality, or in a different cut-out of the static resources. You know you're around a AAA games developer when they look at an HTML page with objects that magically scale, with rounded corners at any size and correct, detailed text and labels - and they're so fascinated by it that they keep resizing the browser window for an hour. While their own web page uses 100% absolute references to place all the objects, including the ones that tile, via carefully handcrafted scripts that bring a quad-core to its knees on every single redraw.

But the entire "dynamically scale the output after the renderer completes" thing has been used in games like COD4 and various console ports, has been part of rescue modes for getting widescreen resolution support, and so on. Sony did it for some of the exclusive titles they wanted to advertise as 1080p. Lots of their games actually had a 1080p rendering target. But Resistance, for example, didn't. You've heard the same jingle from other publishers and developers as well. It goes something like this: "yeah, the internal renderer does certain things to maintain framerate, but the target most definitely is HD!". You can basically name any title by a large developer where they have separate shader engineers making generic effects that are supposed to bloom at particular spots - and the renderer they use will almost always scale those layers separately at different resolutions to restrain how much graphics-card grunt you need. Instead of, for example, having specific shader effects with some sort of detail determination based on viewport distance - because that would involve a completely different place in the production pipeline, which could get complicated.
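
To make the "dynamically scale the output after the renderer completes" idea concrete, here is a minimal sketch of a dynamic-resolution controller - my own toy in C++, not COD4's or any shipped engine's code, with made-up budget numbers. The idea is just that GPU cost tracks pixel count, so the render scale for the next frame is nudged toward whatever would have met the frame-time budget, and the finished frame is then upscaled to the display resolution:

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct DynResController {
        double budget_ms;     // target GPU frame time; 33.3 ms is ~30 fps
        double scale = 1.0;   // fraction of output resolution per axis

        // Feed in the measured GPU time of the previous frame.
        void update(double gpu_ms) {
            // GPU cost is roughly proportional to pixel count, i.e. to
            // scale^2, so the scale that would have met the budget is
            // scale * sqrt(budget / actual).
            double ideal = scale * std::sqrt(budget_ms / gpu_ms);
            // Damp the correction and never drop below half resolution.
            scale = std::clamp(0.7 * scale + 0.3 * ideal, 0.5, 1.0);
        }
    };

    int main() {
        DynResController c{33.3};
        // Simulated GPU times (ms): a load spike, then recovery.
        for (double gpu_ms : {30.0, 45.0, 50.0, 38.0, 31.0, 28.0}) {
            c.update(gpu_ms);
            std::printf("next frame at %.0f%% scale, upscaled to full res\n",
                        100.0 * c.scale);
        }
    }
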
looks like quantum break on xb1 will use an advanced dynamic resolution scaling technique too, similar to shadow fall's temporal reprojection.

"Quantum Break's 1080p output is a temporal reconstruction from four previous 720p 4x MSAA frames. This approach gets us high pixel quality in combination with complex shading and effects, allowing us to achieve a cinematic look. ..."
Oh, I thought this was referring to level scaling. I hate going into what should be a low-level area and encountering bandits with armor worth more than the GDP of the kingdom's capital.
nipsen: ...
i'm not sure if i got it right: so resistance fall of man on ps3 used a resolution-related dynamic scaling method, where the shaders are rendered at dynamic resolutions to keep the framerate stable? any link?
ShadowWulfe: Oh, I thought this was referring to level scaling. I hate going into what should be a low-level area and encountering bandits with armor worth more than the GDP of the kingdom's capital.
i see now that there are a lot of meanings for scaling in videogames
Post edited March 25, 2016 by apehater
nipsen: ...
apehater: i'm not sure if i got it right: so resistance fall of man on ps3 used a resolution-related dynamic scaling method, where the shaders are rendered at dynamic resolutions to keep the framerate stable? any link?
No, the other way around. The shaders are applied after the scene is rendered, so their run-time cost climbs steeply as the resolution goes up - roughly with the pixel count. So to avoid performance issues there's a fixed target, like 720p, where it all works. And you get all kinds of weirdness at other resolutions - unless you add more filters to sort out the inaccuracies. Some of the layers are scaled after the scene is rendered, which causes inaccuracies (unlike effects that are actually adjusted to the viewport). And 90% of the people who see the game will never notice, because they play at the target resolution anyway, and think the on-rails sequences in Gears of War and the cutscene close-ups in CoD are gorgeous anyway. That the rest of the scene is a wireframe painted over with wet newspaper - that's not a problem.
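
To put rough numbers on that fixed-target logic, here is a back-of-the-envelope sketch with a made-up pass count - purely illustrative, not measured from any of these games. Full-screen post passes cost roughly per pixel, so running them at a fixed 720p target and upscaling afterwards caps their cost no matter what the display is:

    #include <cstdio>

    int main() {
        const long p720  = 1280L * 720;   // fixed internal target
        const long p1080 = 1920L * 1080;  // native full HD
        const int  passes = 6;            // say bloom, DoF, blur, tonemap...
        std::printf("post passes at native 1080p: %ld pixel-passes/frame\n",
                    passes * p1080);
        std::printf("post passes at fixed 720p:   %ld pixel-passes/frame\n",
                    passes * p720);
        std::printf("ratio: %.2fx\n", double(p1080) / double(p720));
        // The catch: upscaling after shading bakes the 720p sampling into
        // the image, which is exactly the "weirdness at other resolutions"
        // described above.
    }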

And the selling point is of course that with this method they can get more and better effects into the game than they otherwise could. Which you hear very often. But what they don't address is that this is only the case because of the way they've set up the production in the first place. This stuff is the flip side of AAA development, where the production pipeline determines what the game will end up looking like. And it simply costs too much, and there's too much at stake, to change tack. It's similar to how unoptimized SDK packages are used across the board, so that no one suddenly implements something no one else has access to - which is not what the SDK developers intended, of course. But there's no - or at least there wasn't - a team that builds tools for the toolchain internally, makes targets for different platforms, and so on. Instead it was "get this onto the most popular platform, then get someone else to port it". And generally, the result is "good enough". Also with fewer performance issues than more imaginative titles with curious effects get (because those get shit press when the ports don't look identical - and this is what matters to the publisher: predictable sales figures rather than imaginative and interesting, obviously). And sadly, all effects in the game now look painted on, or are literally one-shot effects triggered at specific locations and distances to look good. Which, according to a very large number of people, is just how it's supposed to be.

So this is just business: if you don't demand gold, you get chocolate coins presented to you as gold instead. And it works over and over again, and the customers come back for more chocolate coins. And if Jim Sterling blows a vein in his brain every time the first three seconds of a cutscene don't look fabulous, and Greg Miller docks titles 5 points for having perspectives he's unfamiliar with from the office-scape - then why would you change direction? Makes no sense to do that.
nipsen: ...
ok, then i think i get it. you're saying there's little or no static game-engine scaling in today's aaa titles, as a result of greedy profit orientation and budget cuts.
nipsen: ...
apehater: ok, then i think i get it. you're saying there's little or no static game-engine scaling in today's aaa titles, as a result of greedy profit orientation and budget cuts.
Not really, no :p

It's a bit like lithium-ion batteries in phones, for example. They're not cheap to make, they're more likely to burn up, they lose their charge more quickly, they're bigger and heavier, they can't be constructed in flat sheets, they deteriorate on the shelf, and they hold a lower charge with more variable output. Etc. They're not very good compared to lithium-polymer cells (though a lot of "science" says they are, which is utter hooey). And eventually - in the last few years, actually - polymer cells became so cheap to manufacture that phone and laptop makers have generally just moved on and replaced the lithium-ion batteries with lithium-polymer ones.

But many manufacturers still use the older cells, because they're easily available and the production lines for them already exist. In the same way, the support cycle and support deals built around batteries that have to be replaced regularly, the way batteries need to be primed at a specific time before shipping, and things of that sort actually make up a fairly large part of the ecosystem. So changing to something that would produce a more pleasing (and potentially cheaper) result for the customer would remove a part of the ecosystem the laptop and phone manufacturers have, remove jobs, etc. So resisting that change isn't necessarily about greed, but about how well the existing ecosystem works.

In AAA games development you have the same thing with art assets, engine development and effects programming. When you organise a full production, the more predictable things are, the easier it is to figure out how many work-hours you need on art before implementation can start without placeholders, how much can be done concurrently, and so on. And with the method many of these companies employ, the costs are very large - so you want predictability in the production cycle. Activision has had Hollywood-blockbuster budgets on Call of Duty, for example (even if most of it went to promotion). And large parts of that went into constructing static art assets that are used once in the entire game. A lot of other companies do the same thing, and have artists construct amazing models - which are then rendered at much lower detail in the actual game, after they've been reduced to fit the polygon budget. And in at least some cases I know the artists were aware they could have done better if they'd known what the output was going to be and how the model would be displayed - this is where really good artists are worth the money, when they can magick out something more visually pleasing than the polygon count suggests. But a more efficient solution was chosen instead - not necessarily the cheapest one by any means, but predictable.

And the same goes for effects - if you split effects out into a separate production line, you can have people spend a lot of time on them while other things are being done elsewhere in the production. But those effects are never going to be generated by an engine that can place them behind objects, or between characters, etc., without causing pretty massive performance issues. So you have these kinds of invisible constraints on what the game's output is going to be, right from the beginning - in order to keep production time and costs predictable.
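
As a toy contrast of the two approaches - my own illustration, not any studio's code - an overlay effect is a flat blend over the finished frame, while a depth-composited effect has to read the scene's depth buffer for every effect pixel, in exchange for being occluded correctly by geometry:

    #include <cstdio>

    struct Pixel { double color, depth; };

    // Cheap path: paint the effect over everything, no scene info needed.
    double overlay(Pixel px, double fx_color, double fx_alpha) {
        return (1.0 - fx_alpha) * px.color + fx_alpha * fx_color;
    }

    // Costlier path: test the effect's depth against the depth buffer, so
    // the effect can sit behind objects (an extra read per effect pixel).
    double depth_composited(Pixel px, double fx_color, double fx_alpha,
                            double fx_depth) {
        if (fx_depth > px.depth) return px.color;  // occluded by geometry
        return (1.0 - fx_alpha) * px.color + fx_alpha * fx_color;
    }

    int main() {
        Pixel wall{0.2, 5.0};  // a dark pixel on a wall 5 units away
        std::printf("overlay:        %.2f\n", overlay(wall, 1.0, 0.5));
        std::printf("fx behind wall: %.2f\n",
                    depth_composited(wall, 1.0, 0.5, 8.0));
        std::printf("fx in front:    %.2f\n",
                    depth_composited(wall, 1.0, 0.5, 2.0));
    }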

And changing that could perhaps cut production costs by a lot. You can see indie titles now that look extremely good despite not having specially made static models designed to be seen from one specific angle in a cutscene, for example. So in many ways it's not that the publisher and the studio are greedy; it's that they've ended up with a method for creating something specialized and supposedly top-of-the-line - which costs a lot of money, and which people apparently want. After all, like I said, people do like these super-detailed cutscenes, and don't care all that much that practically everything else in the game looks like a blur, that all the action invariably happens at a very specific distance from the player with predictable numbers of enemies, and that the effects are a selection of one-shot animations triggered in static sequences. All of which can be constructed fairly quickly, and it's easy to document when it's finished, how the progress is going, etc.

But that's what AAA development is - serially produced titles with predictable output, cost and production time. And that's worth a lot of money, and jobs, and so on. And the studios that make games like this aren't going to start advertising the limitations of their engine tech. That'd make no sense.

So when a studio develops something amazing by having two guys fiddle around with an engine for a couple of years - say, one that can dynamically generate the detail of a very complex model at run-time, so there's no need for scaling, it essentially removes the need for anti-aliasing, it lets any output resolution, high or low, be used without loss in graphical fidelity, and you get a more predictable framerate, better overall quality, etc. - then you're sort of looking at an industry full of lithium-ion batteries staring at someone with a lithium-polymer battery in their phone, who is really happy and doesn't see what all the fuss is about.
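
A hedged sketch of what that kind of run-time detail generation could look like - my own toy with made-up error numbers, not the unnamed studio's tech. The idea is to refine a model until its geometric error projects to less than a pixel, so the same source asset tracks any output resolution and never carries more aliasing-prone detail than the screen can resolve:

    #include <cmath>
    #include <cstdio>

    // World-space error of refinement level n (halves per level; the 0.8
    // base is illustrative).
    double geometric_error(int level) { return 0.8 / std::pow(2.0, level); }

    // Pixels covered by a world-space error e at distance d, for a given
    // vertical field of view (radians) and output height in pixels.
    double projected_pixels(double e, double d, double fov, int height) {
        return e / (2.0 * d * std::tan(fov / 2.0)) * height;
    }

    // Refine until the projected error drops below one pixel.
    int level_for(double dist, int height) {
        int level = 0;
        while (projected_pixels(geometric_error(level), dist, 1.0, height) > 1.0)
            ++level;
        return level;
    }

    int main() {
        for (int h : {720, 1080, 2160})  // detail follows output resolution
            std::printf("%4dp display, 10 m away -> refinement level %d\n",
                        h, level_for(10.0, h));
    }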

Or like an auto industry that's doing great, when someone suddenly deploys a vehicle with a Wankel engine. If the point was to create something functional, more dynamic and easier to maintain, then that's superior in many ways. But the truth is that "the market" doesn't want the product (because the cutscenes are great, and the 400W graphics cards seem awesome! Etc.). And it would cut off large parts of the money-making industry as it exists if that alternative solution became more popular.

It's perfectly analogous to the auto industry - you drive a Toyota at the speed limit to a job 5 km away, but you still like the idea of a V8 with a 350 km/h top speed enough to think a four-stroke is definitely the best. And sure, a lot of really good improvements have been made to smaller engines, making them ridiculously more efficient and maintenance-friendly than before. But a rotary engine from the 60s would still kick their ass. And we're not considering that as an option, as either customers or manufacturers, on all kinds of reasonable and not-so-reasonable grounds.

So in the AAA games industry, you now have:
1. Increasing production costs that make larger games with a lot of content untenable. You need shorter games with less content just to justify the production at all.
2. Effects that are becoming really expensive hardware-wise, while graphical fidelity isn't improving as much as you'd want. Increasingly popular dust and physics-based effects are becoming impossible to fit into a AAA title.
3. Resolutions that keep increasing on TVs and monitors, while the console resolution targets do not.

So pressure is slowly building here. Then again - normal "TV broadcast" resolution is still standard definition, 640x480 or thereabouts. It's completely possible to change that, but it means redeploying a lot of the existing infrastructure. And in that example, TV broadcasts were essentially replaced by internet on-demand - a different service altogether - rather than the original being changed. And I think that's probably what's going to happen to AAA games development as well. It probably isn't going to disappear. But alternatives that the industry collectively scoffs at (not just the marketing people, but everyone from artists to programmers who owe their jobs to the ecosystem - same with the games journalists who live there too) are actually good enough to replace it, and there definitely exists a market for them.