amok: But I still don't get it... I was under the impression that 1080p refers to the number of lines drawn on the screen, and that they are drawn progressively. Should this not mean that the texture can consist of 4 coloured pixels, as long as the complete picture is drawn with 1080 lines?
Then it would not mean much, would it? You might as well leave it out and never talk about it.

Printing "Can also be seen on 1080p screens" on the box is like printing "Needs a computer to run." Pretty meaningless.

What customers need is something like an effective resolution, depending on their system performance, that tells them at which effective rendered (non-scaled) resolution the program will run on their hardware.

Basically the OP complains that either

a) the advertisement is misleading, or
b) some games have terrible graphics, or
c) some games need too much hardware power to have decent graphics,

or a mixture of them all.

ciomalau: If you have enough money you can just buy a better PC - what's the problem ...
The problem is knowing when to buy a new PC. I may not have enough money now but I will later, so I need some kind of information about how well a program will run. For example, my current PC is somewhere between the minimum and recommended requirements for Witcher 3. What does that mean for the graphics quality? At which effective resolution can I run it smoothly? Depending on the answer, I would rather wait until I have enough money to buy the new PC, or maybe I could already buy and play it now.

It would be nice if there were tools available that could give you this information before buying - a kind of usable-graphics-resolution predictor for games. It doesn't sound trivial, but it doesn't sound impossible either.
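Just to illustrate the idea, here is a very rough sketch in Python of what such a predictor could do. The scaling model and the benchmark numbers are made up for the example, so treat the result as a ballpark guess at best:

    # Very rough "usable resolution predictor" sketch.
    # Assumption: frame rate is roughly inversely proportional to the number
    # of rendered pixels (i.e. the game is GPU/fill-rate bound). Real games
    # are messier, so this is only a ballpark estimate.

    COMMON_RESOLUTIONS = [
        (1280, 720), (1600, 900), (1920, 1080), (2560, 1440), (3840, 2160),
    ]

    def predict_max_resolution(measured_fps, measured_res, target_fps=30):
        """Largest common resolution expected to still reach target_fps."""
        measured_pixels = measured_res[0] * measured_res[1]
        # Pixel budget we can afford at the target frame rate.
        pixel_budget = measured_pixels * measured_fps / target_fps
        usable = [r for r in COMMON_RESOLUTIONS if r[0] * r[1] <= pixel_budget]
        return max(usable, key=lambda r: r[0] * r[1]) if usable else None

    # Made-up example: a benchmark run of the game at 720p gave 70 fps on this PC.
    print(predict_max_resolution(70, (1280, 720), target_fps=30))
    # -> (1920, 1080) under the assumptions above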

Of course, an easy way out is to only ever play five-year-old games on one-year-old hardware. That way you can be absolutely sure that the graphics quality is as good as it can be.
Post edited March 14, 2016 by Trilarion
Some people here are talking about upscaling - I don't know if this counts, but there is a game on GOG, Tomb Raider 1, which came out in 1996. However, I play it at 1920 x 1080 resolution, and I'm sure such a high res didn't exist back then in '96.
ciomalau: Some people here are talking about upscaling - I don't know if this counts, but there is a game on GOG, Tomb Raider 1, which came out in 1996. However, I play it at 1920 x 1080 resolution, and I'm sure such a high res didn't exist back then in '96.
That's why you can be sure that this resolution is an upscaled pseudo-resolution, not the effective resolution or level of detail contained in the displayed images. That's also why it's not very meaningful to rely only on the resolution of the monitor (it could even be a 4K monitor). If the signal that reaches the monitor does not contain enough detail / resolution, then even a high-resolution monitor cannot display better graphics.

In principle your Tomb Raider 1 should not look (much) better than the original Tomb Raider 1 from 1996 on the screens available back then. Or does it?
ciomalau: Some people here are talking about upscaling - I don't know if this counts, but there is a game on GOG, Tomb Raider 1, which came out in 1996. However, I play it at 1920 x 1080 resolution, and I'm sure such a high res didn't exist back then in '96.
Trilarion: That's why you can be sure that this resolution is an upscaled pseudo-resolution, not the effective resolution or level of detail contained in the displayed images. That's also why it's not very meaningful to rely only on the resolution of the monitor (it could even be a 4K monitor). If the signal that reaches the monitor does not contain enough detail / resolution, then even a high-resolution monitor cannot display better graphics.

In principle your Tomb Raider 1 should not look (much) better than the original Tomb Raider 1 from 1996 on the screens available back then. Or does it?
What I meant to say is that yes, it looks much better. I can make a screenshot if you want to see it.
ciomalau: Some people here are talking about upscaling - I don't know if this counts, but there is a game on GOG, Tomb Raider 1, which came out in 1996. However, I play it at 1920 x 1080 resolution, and I'm sure such a high res didn't exist back then in '96.
Trilarion: That's why you can be sure that this resolution is an upscaled pseudo-resolution, not the effective resolution or level of detail contained in the displayed images.
Not necessarily. Upscaled resolution means that the game is natively running at, say, 720p and then, during postprocessing, gets resized to 1080p to be compatible with modern screens. However, most 3D games, even early 3D games, are very scalable and can run @1080p natively - some without any fiddling at all, others with some changes. Naturally, you won't get improved textures or models that way, but the image will be a lot sharper - for instance, here's Unreal Tournament at 1080p native.

Here's a good example of an upscale - the game is realistically running at 1080p on both machines, but one upscales the image from 720p. Try to guess which :-P
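If you want to see for yourself what the upscaling step actually does, it's essentially just this (a tiny Python sketch using Pillow; the file names are placeholders - any 1280x720 screenshot will do):

    # Minimal illustration of what an "upscale" is: a frame rendered at 720p
    # is simply resized to 1080p before display. No new detail is added -
    # the existing pixels are just interpolated.
    from PIL import Image  # pip install pillow

    frame_720p = Image.open("screenshot_720p.png")  # placeholder file name
    frame_1080p = frame_720p.resize((1920, 1080), Image.BILINEAR)
    frame_1080p.save("screenshot_upscaled_1080p.png")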
Post edited March 14, 2016 by Fenixp
Umm, hello. I wanted to ask you something - can I ask you something? There was a guy here on the forum who said Windows 10 has no DOS. So I can't use DOSBox if I install Windows 10? I don't know much about it :(
ciomalau: Umm, hello. I wanted to ask you something - can I ask you something? There was a guy here on the forum who said Windows 10 has no DOS. So I can't use DOSBox if I install Windows 10? I don't know much about it :(
DOS hasn't been a direct part of the system since... Windows 2000? And DOSBox is actually a standalone program which works regardless of whether DOS is present in the system, and it also runs on Windows 10. In other words, you don't need to worry about it :-)
Stop supporting these games. I don't buy those downgraded multiplatform games. Sadly I fell for Witcher 3. But other than that I haven't bought any new AAA game to date. They're going to make games cheap for maximum audience to maximize profits. The only games PCers should be supporting are those games designed for PC only. Ported games are crap on PC. I don't understand why people love to frustrate themselves with broken, unfinished games. Keep console games on consoles and PC games on PC.
Wolfehunter: Stop supporting these games. I don't buy those downgraded multiplatform games. Sadly I fell for Witcher 3. But other than that I haven't bought any new AAA game to date. They're going to make games cheap for maximum audience to maximize profits. The only games PCers should be supporting are those games designed for PC only. Ported games are crap on PC. I don't understand why people love to frustrate themselves with broken, unfinished games. Keep console games on consoles and PC games on PC.
I know, but even Sudeki or Stolen weren't meant for PC and they're still gooooood (^_^) and Special Forces: Nemesis Strike is good too!
Wolfehunter: I don't buy those downgraded multiplatform games.
Resolution scaling has nothing to do with downgrading - it's a highly advertised feature

Wolfehunter: They're going to make games cheap for maximum audience to maximize profits.
Sooo you'd make expensive games for a niche audience to minimize profits?

Wolfehunter: The only games PCers should be supporting are those games designed for PC only. Ported games are crap on PC. I don't understand why people love to frustrate themselves with broken, unfinished games. Keep console games on consoles and PC games on PC.
So that we have to own 3 machines to play all the games we're interested in? Don't think so. I'll gladly play my console games on the superior hardware of a PC, which also supports a wide array of input methods, thank you very much. You feel free to ignore simultaneously developed games and ports.
Fenixp: ...Here's a good example of an upscale - the game is realistically running at 1080p on both machines, but one upscales the image from 720p. Try to guess which :-P
That is probably a very good example of what the original poster meant. The screen pixel density is 1080p, but the information content on the right side effectively remains 720p, which is arguably the more important value (as long as the screen has a resolution equal to or above the content resolution the game can produce). Only, it's not new. Seems like it's a rather old scam then.
Trilarion: That is probably a very good example of what the original poster meant. The screen pixel density is 1080p, but the information content on the right side effectively remains 720p, which is arguably the more important value (as long as the screen has a resolution equal to or above the content resolution the game can produce). Only, it's not new. Seems like it's a rather old scam then.
How is it a scam though? It's not like anyone kept it a secret. People demanded visually superior games, and as time moved on, developers found ways to squeeze more performance out of the outdated machines - lower FOV, lower resolution, big weapon models to obscure the screen as a trade-off for better texture resolution and generally greater visual fidelity. It would be a scam if publishers/developers claimed something different from what the consumer was actually purchasing, and I'm not saying that hasn't been happening, but certainly not due to upscaled resolutions and most certainly not due to dynamic resolution, which in my opinion is a fantastic piece of technology.
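The control loop behind dynamic resolution is conceptually very simple - roughly something like this (a simplified Python sketch, not any particular engine's actual code, with made-up numbers):

    # Simplified sketch of a dynamic-resolution control loop (not taken from
    # any real engine). If the last frame took too long, render the next one
    # at a lower internal resolution and let the GPU scale it up to the output
    # resolution; if there is headroom, raise the internal resolution again.

    TARGET_FRAME_MS = 33.3          # aiming for roughly 30 fps
    OUTPUT_RES = (1920, 1080)       # what the monitor always receives
    MIN_SCALE, MAX_SCALE = 0.5, 1.0

    render_scale = 1.0

    def adjust_render_scale(last_frame_ms):
        """Nudge the internal render resolution toward the frame-time budget."""
        global render_scale
        if last_frame_ms > TARGET_FRAME_MS:
            render_scale -= 0.05    # too slow: drop the internal resolution
        elif last_frame_ms < 0.8 * TARGET_FRAME_MS:
            render_scale += 0.05    # plenty of headroom: raise it again
        render_scale = max(MIN_SCALE, min(MAX_SCALE, render_scale))
        return (int(OUTPUT_RES[0] * render_scale),
                int(OUTPUT_RES[1] * render_scale))

    # Made-up example: after a 40 ms frame the internal resolution drops to
    # 1824x1026, while the image sent to the screen stays 1920x1080.
    print(adjust_render_scale(40.0))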
ciomalau: Some people here are talking about upscaling - I don't know if this counts, but there is a game on GOG, Tomb Raider 1, which came out in 1996. However, I play it at 1920 x 1080 resolution, and I'm sure such a high res didn't exist back then in '96.
Trilarion: That's why you can be sure that this resolution is an upscaled pseudo-resolution, not the effective resolution or level of detail contained in the displayed images. That's also why it's not very meaningful to rely only on the resolution of the monitor (it could even be a 4K monitor). If the signal that reaches the monitor does not contain enough detail / resolution, then even a high-resolution monitor cannot display better graphics.

In principle your Tomb Raider 1 should not look (much) better than the original Tomb Raider 1 from 1996 on the screens available back then. Or does it?
Oooh! You're so close! :p All right, not really.

But yes, Tomb Raider, various Crystal Dynamics games, etc., can run at resolutions that are much higher than what you had available on a monitor back then, and still not get the "scaling" artefacts you expect. Early 3D games in general tend to have this curious feature.

Until you find that one single 256x256 static texture they used for a specific object that was difficult to draw with cubes and half-circles - and it looks like someone blew up a Minecraft guy. But everything else looks fine, or at least as bad/good as it did back then. Fairly often you also see UI that uses vector graphics or is made up of scalable elements. So when you change the resolution, they might get a bit crunched, or stretch in a different way, but they're still functional and stay stuck to the right side of the screen, etc.

On the other hand, when you get to slightly newer games than that, with much larger amounts of textures and static resources, they look absolutely horrible all round at higher resolutions. There are... not many exceptions to that. Take the TIE Fighter Collector's CD-ROM (the actual one, not the crap they re-released) for example. It could theoretically scale upwards (apart from the cockpit drawing), while the so-called "3D version" couldn't, because that one has very specific textures painted on the hulls of all the ships to mask the crappy models, and once you start to scale that up, it just becomes ugly blocks.

The reasons for all of that are:
1. All of those early games were made for graphics cards that had RAM sizes like this: http://www.vgamuseum.info/images/palcal/3dfx/808_generic_3920465a1_600-0012-04_voodoo1_top_hq.jpg

Count the 512 KB chips. That's right - 4 MB. Who would possibly want to use more than that on /computer graphics/? :p Seriously, though - in truth, the objects that then get masked with textures take up proportionally less of the used RAM now than they did back then (some rough numbers after this list).

2. After a certain point in time, no "full game" ever ships on less than a full CD-ROM. Worse, if the game was particularly large, it would still be on a CD-ROM, just with more texture compression, because by now people start to replace geometry with static resources completely on purpose. It's like a resurgence of 2.5D, just with better sprites.

3. A lot of these early games were "multiplatform". Yes, that's right, they were made to run on different hardware and different resolutions (and form factors) in the first place. Two-person garage developers managed to account for that for the most part, between the Dreamcast, the PC, 2D and 3D versions and different APIs.

4. People didn't use static resources, didn't build entire levels out of reusable bits by hand and then run it all through the SDK so that it puts 50,000 variants of a tree into different resource files to be loaded one after the other. You could probably say that, well, that's why the games are so ugly, right? That they don't have fifty million trees in separate models. No one wants to see the same tree used over and over again?

And then you run into something like randomly slotted tilesets. Or shadows on objects that are actually generated in real time instead of crafted with a colour filter plus a shader anchored to a fixed point (so that if you saw it outside the cutscene, it would look ridiculous). Or mobile games with real-time physics and weight- and direction-based shaders that weigh in at a whopping 100 MB. And you realize that it is actually possible to make very complex-looking geometry with just location data for all the squares and half-cubes before you even approach the cost of a single texture.
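To put some rough numbers on points 1 and 4 above (back-of-the-envelope only, using the chip count from that Voodoo picture):

    # Back-of-the-envelope numbers, purely illustrative.

    # Point 1: eight 512 KB chips on a Voodoo-class card.
    vram = 8 * 512 * 1024
    print(vram // (1024 * 1024), "MB of VRAM in total")                   # 4 MB

    # A single uncompressed 1024x1024 RGBA texture would already fill it all:
    big_texture = 1024 * 1024 * 4
    print(big_texture // (1024 * 1024), "MB for one 1024x1024 texture")   # 4 MB

    # Point 4: geometry is comparatively tiny. The corner positions of a cube
    # are just 8 vertices x 3 floats x 4 bytes...
    cube = 8 * 3 * 4
    # ...while even a small 256x256 RGBA texture costs 256 KB.
    small_texture = 256 * 256 * 4
    print(cube, "bytes of cube geometry vs", small_texture // 1024, "KB of texture")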

Meanwhile, the reason you get these "remastered" games as upscales (unless they actually go back and redo or repackage the original resources), and the reason people prefer that over actually increasing the resolution of the "low-res assets", is that the static resources don't scale well. They're just meant for a very specific context.

In a way, it's the precursor of the current industry: publishers would rather pay huge amounts of money to repackage the game for all the "targets" it is supposed to release on (to justify their existence) than spend money on having a developer create a toolchain that can output to different targets. This is 100% analogous to the current "ideal" where "AAA" developers tend to have very, very large budgets for pre-production and art assets - even if those assets don't end up being used, or, if they are used, aren't depicted in the detail the artist actually made them in (to fit the hardware limitations, and blabla - the simple engine of course had /nothing/ to do with it...). And where not using that approach is the domain of "indie developers", even though, as we see more and more, those indie developers can create target outputs that are visually very similar - and sometimes much better and more believable, when you also count what happens outside the cutscenes.

Basically: static resources vs. geometry and scalars. Welcome to 1994, people. :)
nipsen: ... Basically: static resources vs. geometry and scalars. Welcome to 1994, people. :)
Very nice and detailed post. Thanks for it.

If I understand you correctly, this transition from programmable, scalable geometry to static, fixed (high) resolution textures can be seen as a bit of a backwards evolution that took place between the early days of 3D (1995-2000) and today. In a way this seems rather bad.

On the other hand, it seems like creators are now trying to combine the best of both worlds: fewer fixed-resolution textures and programmable upscaling. In principle this sounds good and holds the promise of superior quality even on screens with a really large number of pixels.

Unfortunately I guess you can still screw it up and, instead of combining the best, combine the worst. For example, taking a 720p texture and upscaling it to 1080p often looks kind of bad. Or games like Witcher 3, which appear on console and PC, probably weren't optimized much for PC and higher screen resolutions, and don't use the best available textures and effects even when the PC the game is running on would be capable of handling them.

It all depends on the case though. We could do a big comparison of 3D games of different ages at different resolutions and try to estimate the visual quality somehow. At least it should have become clear that the pure screen output resolution doesn't say much about how good a game looks.
Fenixp: ...dynamic resolution, which in my opinion is a fantastic piece of technology.
Maybe scam is the wrong word then. Anyway, I don't think one can easily prove false advertising - that may happen once in a lifetime. I'm more interested in the capabilities of the technology itself.

And there I'm not so sure that dynamic resolution/upscaling is such a fantastic piece of technology. One would have to look at the visual results. A general principle of image processing is that you cannot generate more detail by scaling if it wasn't contained somewhere in the data before.

What I wanted to say is that it seems important to somehow define and judge games by an effective resolution, i.e. the level of detail that is actually displayed when a game is running on a specific system. The idea is that this would be a much better measure of graphical quality than the screen resolution. If there were a way to define that, it would be cool.
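Just to show it doesn't sound impossible, here is a crude sketch of how one could estimate such an effective resolution from a single screenshot (Python with Pillow; the file name is a placeholder and the threshold is arbitrary - a real tool would need a much better detail metric):

    # Crude "effective resolution" estimator for a single screenshot.
    # Idea: downscale the frame and scale it back up; if that round trip barely
    # changes the image, the extra pixels carried no real detail. The smallest
    # size that still reproduces the frame is a rough effective resolution.
    from PIL import Image, ImageChops  # pip install pillow
    import math

    def effective_resolution(path, tolerance=3.0):
        frame = Image.open(path).convert("L")
        w, h = frame.size
        best = (w, h)
        for scale in (0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3):
            small = frame.resize((int(w * scale), int(h * scale)), Image.BILINEAR)
            back = small.resize((w, h), Image.BILINEAR)
            diff = ImageChops.difference(frame, back)
            # Root-mean-square pixel difference as a very rough detail metric.
            hist = diff.histogram()
            rms = math.sqrt(sum(count * value ** 2 for value, count in enumerate(hist)) / (w * h))
            if rms < tolerance:
                best = small.size
        return best

    # Placeholder file name; a sharp native 1080p frame should come out near
    # (1920, 1080), an upscaled one noticeably lower.
    print(effective_resolution("witcher3_screenshot.png"))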