I hate to be that person, but I have to say

I kinda agree with Yahtzee on this one: graphics have gotten just about as good as they can get, so let's stop focusing so much on them.

I've always felt that graphic STYLE is much more important than graphic DETAIL.

My personal opinion is that the PS2 game Okami is still the best-looking game ever made, even though graphics technology has long since improved. While the detail is lower than in current games, the stylized approach really makes it stand out and keeps it a pleasure to look at almost 10 years later.

It's more important to me that developers make a game where the graphics complement the gameplay and the overall aesthetic, making for a more holistic experience, as opposed to just constantly increasing the number of pixels and frames per second.
Post edited October 28, 2015 by TheTome56
To the thread title I say... good :p It saves me having to fork out a lot of money to upgrade my GeForce GTX 660, which has impressed me since the day I bought it.
jamyskis: That's not ASCII. This is ASCII.
That's ASCII text. This is ASCII graphics :)
jamyskis: That's not ASCII. This is ASCII.
Barefoot_Monkey: That's ASCII text. This is ASCII graphics :)
Bah, that's so modern. ASCII graphics is this:
https://www.youtube.com/watch?v=Eoom2dGnHJw
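For anyone wondering how that sort of ASCII rendering actually works, the basic trick is just mapping brightness to characters. A tiny sketch of the idea (my own illustration, nothing from the video; the character ramp and demo "image" are arbitrary choices):

```python
# Map each pixel's brightness to a character from a ramp; on a dark terminal,
# brighter pixels get denser glyphs.
RAMP = " .:-=+*#%@"  # dark -> bright

def frame_to_ascii(pixels):
    """pixels: 2D list of brightness values in the range 0..255."""
    lines = []
    for row in pixels:
        lines.append("".join(RAMP[b * len(RAMP) // 256] for b in row))
    return "\n".join(lines)

# A tiny 4x8 horizontal gradient as a stand-in "image"
demo = [[x * 32 for x in range(8)] for _ in range(4)]
print(frame_to_ascii(demo))
```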
I remember the heady days of POKEing 8x8 squares of bits on the Speccy :o) Yes, alright, I remember before those days too.
Dunno if the whole indie thing also factors in - given that they often don't have the level of resources necessary to create games with cutting-edge graphics - but I'm quite glad that hyper-realism is a smaller part of the marketplace these days. It's nice that those who want that can get it in a large number of titles, but it's equally nice that you can find a ton of games that don't bring about a heat stroke in your computer.

In other words, there is room - and a market - for both, and everything in-between.
nightcraw1er.488: I remember the heady days of POKEing 8x8 squares of bits on the Speccy :o) Yes, alright, I remember before those days too.
Is it sad that I could actually visualise how the UDG was going to look from the DATA line before he actually ran it? (Cue jokes about how "I don't even see the code anymore, all I see is blonde, brunette, redhead")

One thing that really pissed me off about Spectrum BASIC development was the way coordinates were switched around: they were always specified as (y,x) instead of (x,y). It didn't really make much of a difference when programming in Z80 assembler, but it was a spectacular pain in BASIC.
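For anyone who never touched a Speccy: a UDG is just 8 bytes, one per character row, with bit 7 as the leftmost pixel, which is why you can learn to "see" the shape straight from the DATA line. A rough sketch of that decoding, in Python rather than Spectrum BASIC (the byte values are only an example glyph):

```python
# Each UDG is 8 bytes, one byte per row of the 8x8 character; bit 7 = leftmost pixel.
udg_data = [0x3C, 0x42, 0xA5, 0x81, 0xA5, 0x99, 0x42, 0x3C]  # example glyph (a rough smiley)

for byte in udg_data:
    # Walk the bits from most significant (left) to least significant (right)
    print("".join("#" if byte & (1 << (7 - bit)) else "." for bit in range(8)))
```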
Post edited October 28, 2015 by jamyskis
jamyskis: There are 16-bit console and DOS games out there that look just as good and are just as immersive today as they were back then. Why else would indie games strive to imitate them?
because they're cheap hacks?
jamyskis: Is it sad ...snip
You know the answer to that ;o)
Spectre: because they're cheap hacks?
Riiiiiight...
Consoles are probably the best thing to ever happen for graphics.

When a console is made, it can't be built from industrial-grade super-mega-computer components; it has to be priced for the average consumer. So the engineers settle on a fixed set of hardware, assemble it, and hand all of the specs over to developers.

At first, the devs use the tricks they already know. They treat the hardware like any other hardware and get amazing results. But then the competitive nature of videogame publishing pushes developers to think outside the box. Devs start looking at the actual processors and counting cycles for different procedures. They find gimmicky cheats that look almost like an expensive effect but run in a cheap way. They squeeze every last bit of power out of the processor that they possibly can. And when the next system comes out, they apply all of those tricks and then learn some more.
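One classic example of that kind of cheat is swapping an expensive maths call for a precomputed lookup table. A minimal sketch (Python only to show the idea; the table size is an arbitrary choice):

```python
import math

# Precompute a coarse sine table once...
TABLE_SIZE = 256
SINE_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(angle):
    """Approximate sin(angle) with a table lookup instead of computing it."""
    index = int(angle / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
    return SINE_TABLE[index]

# ...then every per-frame call is just an index into the table, which was far
# cheaper than the real thing on the hardware of that era.
print(fast_sin(math.pi / 2), math.sin(math.pi / 2))
```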

Unfortunately and fortunately, PCs have thousands of video cards and processors to pick from. I'd imagine there's about as much variety among computers in the world as there is among people (if not more). OpenGL, DirectX and the like sit in between, talking to the processors and video cards on the game's behalf. A console's GPU doesn't have to be driven like the equivalent card in a PC: console devs can cheat past the middleware and program the GPU directly. Some PC devs do this too, but for the most part they stick with DirectX to communicate between their software and your hardware, which slows things down greatly.

But DirectX and other middleware keep getting better too, because they can adopt some of the cheats and shortcuts that devs figure out on consoles (and they certainly improve through their own research as well).

Not only that, but your drivers get better too. Driver updates often fold in optimisations first worked out for console games. In fact, over the life of a graphics card, drivers can often boost its effective performance by 10% or more.

Now, I admit not all of this research and development is due to consoles, but I would argue that if consoles didn't exist, then most publishers would say "You want better graphics? Buy a better card!" instead of putting in the blood, sweat, tears, engineering and creative brilliance that it takes to make a graphics card sing in resonance.


nightcraw1er.488: I remember the heady days ...snip
jamyskis: One thing that really pissed me off about Spectrum BASIC development ...snip
I learned all of my processor coding on non-Intel chips. And then, after that, I learned that Intel stores its bytes in the reverse order from everything I was used to.

Yay for them, I guess. I'm using one right now. :)

By the way, all I see is a blond, brunette, redhead... (SPOILER ALERT: and I will murder you in your sleep so I can eat steak!)
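For anyone who hasn't run into the byte-order thing, a quick illustration (just Python, nothing beyond what the standard struct module exposes): x86/Intel chips store multi-byte values least significant byte first ("little-endian"), while several other classic CPU families used big-endian.

```python
import struct
import sys

value = 0x12345678
print(struct.pack("<I", value).hex())  # little-endian (x86): 78563412
print(struct.pack(">I", value).hex())  # big-endian:          12345678
print(sys.byteorder)                   # whatever the machine running this uses
```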
Post edited October 28, 2015 by Tallima
Like some people in this thread have pointed out, I too agree that graphics art style is more important than how technically advanced the visuals are. Some of the most atmospheric and memorable games for me relied on a unique visual style: Dishonored (painterly look), Borderlands (cel-shaded), Far Cry 2/3, BioShock, Riddick and Oblivion, just to name a few. It's true that you don't need Crysis-level graphics to make a game appealing.

I think graphics in new games can improve in some areas like:

Dynamic reflections on glass and wet surfaces - even in 2015, developers are still using pre-baked reflections, which seems off to me because they look out of place in otherwise technically advanced games. I mean, what the hell is that? (Rough sketch of the difference below.)

Who would love to see dynamic reflections implemented by next gen?
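To sketch the difference roughly (illustration only, not any particular engine's API): a baked reflection samples an environment captured once ahead of time, while a dynamic one reflects the view direction off the surface and samples whatever the scene looks like right now.

```python
import numpy as np

def reflect(view_dir, normal):
    """Reflect a view direction about a surface normal: r = v - 2(v.n)n."""
    v = np.asarray(view_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    return v - 2.0 * np.dot(v, n) * n

# Baked: the reflected direction looks up a texture captured once, offline,
# so anything that moves after baking never shows up in the reflection.
# Dynamic: the environment is re-rendered (or re-captured) every frame and
# sampled along this reflected direction, so the reflection tracks the scene.
print(reflect([0.0, -1.0, 1.0], [0.0, 1.0, 0.0]))  # bounce off a flat, upward-facing surface
```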

Also, many games are guilty of using only a few NPC character models, including Witcher 3 and Bethesda games. I mean, it's 2015, come on, make more than 4-5 faces. I see what looks like the same guy everywhere in Witcher 3 with different hair styles. This can take some players out of the experience.

High-definition/detailed textures should be made mandatory by next gen at least, if not this gen! Games like Mad Max and GTA 5 look amazing when you take the whole picture in, but the textures look ugly up close and remind me of early last-gen games.

Do you guys agree with me on this? I'm sure many will agree about the baked in reflections.
Post edited October 29, 2015 by doomdoom11
doomdoom11: Like some people in this thread have pointed out, I too agree that graphics art style is more important than how technically advanced the visuals are ...snip
Personally I don't spend an awful lot of time staring at reflections in a game, so meh on that. But I know what you mean about the two or three models with slight differences, even the same voice acting. It could be better - even back in the sprite days there was more variety.
doomdoom11: ...Also, many games are guilty of using only a few NPC character models, including Witcher 3 and Bethesda games. I mean, it's 2015, come on, make more than 4-5 faces. I see what looks like the same guy everywhere in Witcher 3 with different hair styles. This can take some players out of the experience. ...
Yeah, for me too this is one of the main things that breaks the immersion. I wonder if face and body models couldn't be altered procedurally to generate more variety algorithmically, if it's too much effort to do it manually (rough sketch of the idea below).

Anyway, this has nothing to do with graphics card capabilities, because the card only cares about the level of detail and the complexity of the effects, not about the variety of the content it displays. This is "just" a development problem.
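A minimal sketch of what that procedural variation could look like (the parameter names are purely made up, not from any real engine): start from one hand-made base face and derive per-NPC variations by seeding small random offsets, so every NPC gets a distinct but plausible face for free.

```python
import random

# One hand-made base face plus a handful of shape parameters (names made up).
BASE_FACE = {"jaw_width": 1.0, "nose_length": 1.0, "eye_spacing": 1.0, "cheekbone_height": 1.0}

def generate_face(npc_id, variation=0.15):
    """Derive a distinct-but-plausible face per NPC by seeding small offsets."""
    rng = random.Random(npc_id)  # same NPC id always yields the same face
    return {param: round(value * (1.0 + rng.uniform(-variation, variation)), 3)
            for param, value in BASE_FACE.items()}

for npc_id in range(3):
    print(npc_id, generate_face(npc_id))
```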
I like games built on established engines. I recently played Orb of Dilaaria, made on the Wolf3D engine. Great game - my point being that there's nothing wrong with reusing "old" material to focus on making a good experience.
ShadowWulfe: I like games built on established engines. I recently played Orb of Dilaaria, made on the Wolf3D engine. Great game - my point being that there's nothing wrong with reusing "old" material to focus on making a good experience.
The Quake 2 and Quake 3 engines (aka id Tech 2 and id Tech 3) say hello. Those engines were really only good for Quake 2 and Quake 3 - almost every other game built on them suffered from id Tech's quirks, especially games that weren't first-person shooters (e.g. Alice, Heretic 2).

More modern middleware solutions like CryEngine and Unreal Engine 3 & 4 have made much greater diversity both possible and advisable, but the use of engines like Unreal 1 and id Tech 2/3 caused a lot of games to suffer because the engine wasn't really suitable for them.
Post edited October 30, 2015 by jamyskis