Consoles are probably the best thing to ever happen to graphics.
When a console is made, it can't be built from industrial-grade super-mega-computer components. It has to be priced for the mass market. So the engineers settle on one fixed set of hardware, assemble it, and hand the full specs to developers.
At first, the devs use the tricks that they know. They treat the hardware like any traditional hardware and they get amazing results. But then the competitive nature of videogame publishing pushes developers to think outside the box. Devs start looking at the actual processors and counting cycles for different procedures. They find gimmicky cheats that make something look like the output of an expensive function while only paying for a cheap one. They squeeze every last bit of power out of the processor that they can possibly use. And then when the next system comes out, they bring all of their tricks with them and learn some more.
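A famous example of that kind of cheat (not from a console specifically, but it shows the shape of the idea) is the fast inverse square root from Quake III Arena: instead of an expensive divide and square root, you take a bit-level guess at 1/sqrt(x) and refine it with one cheap Newton-Raphson step. A rough C sketch:

```c
#include <stdint.h>
#include <string.h>

/* Approximate 1/sqrt(x) -- the well-known Quake III trick.
   A bit-level initial guess via the magic constant, then one
   Newton-Raphson step to refine it. Far cheaper than
   1.0f / sqrtf(x) on the hardware of the day. */
float fast_inv_sqrt(float x)
{
    float half = 0.5f * x;
    int32_t i;
    memcpy(&i, &x, sizeof i);        /* reinterpret the float's bits */
    i = 0x5f3759df - (i >> 1);       /* rough guess from the exponent bits */
    memcpy(&x, &i, sizeof x);
    x = x * (1.5f - half * x * x);   /* one Newton-Raphson iteration */
    return x;
}
```

On modern GPUs and CPUs the dedicated reciprocal-square-root instructions make this pointless, but back then it bought real frame time.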
Unfortunately and fortunately, PCs have thousands of video cards and processors to pick from. I'd imagine there's just about as much variety in computers in the world as there is in people (if not more). OpenGL, DirectX and the like sit in between, talking to the processors and video cards on your behalf. So the GPU in a PS4 (a semi-custom AMD part, roughly a mid-range desktop Radeon of its day) doesn't have to be driven the way the same class of chip is driven in a PC. The devs can cheat past the middleware and talk to the GPU itself. Some PC devs do this, but for the most part they stick with DirectX to communicate between their software and your hardware, and that extra layer slows things down considerably.
But DirectX and other middleware keep getting better, because they can absorb some of the cheats and shortcuts that devs figure out on consoles (and they certainly improve through their own research as well).
Not only that, but your drivers get better too. Drivers can be tuned to take advantage of techniques worked out for console games, and over the life of a graphics card, driver updates can often boost its performance by 10% or more.
Now, I admit not all of this research and development is due to consoles, but I would argue that if consoles didn't exist, then most publishers would say "You want better graphics? Buy a better card!" instead of putting in the blood, sweat, tears, engineering and creative brilliance that it takes to make a graphics card sing in resonance.
nightcraw1er.488: I remember the heady days of POKEing 8x8 squares of bits on the Speccy :o) Yes, alright, I remember the days before those too.
jamyskis: Is it sad that I could actually visualise how the UDG was going to look from the DATA line before he actually ran it? (Cue jokes about how "I don't even see the code anymore, all I see is blonde, brunette, redhead")
One thing that really pissed me off about Spectrum BASIC development was the way it had coordinates switched around: coordinates were always specified as (y,x) instead of (x,y). Didn't really make much of a difference when programming in Z80 assembler, but it was a spectacular pain in BASIC.
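For anyone who never did it: a UDG on the Spectrum was just eight bytes read from a DATA line, one byte per pixel row, and with enough practice you really could read the picture straight out of the numbers. A quick C sketch of that mental exercise (the byte values here are just an example pattern, not from any particular listing):

```c
#include <stdio.h>
#include <stdint.h>

/* A Spectrum UDG is 8 bytes, one per pixel row; each set bit is an
   "ink" pixel. These values are only an example pattern. */
static const uint8_t udg[8] = {
    0x3C, 0x42, 0xA5, 0x81, 0xA5, 0x99, 0x42, 0x3C
};

int main(void)
{
    for (int row = 0; row < 8; row++) {
        for (int bit = 7; bit >= 0; bit--)
            putchar((udg[row] >> bit) & 1 ? '#' : '.');
        putchar('\n');
    }
    return 0;
}
```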
I learned all of my processor coding on non-Intel chips. And then after that, I learned that Intel chips store their bytes the other way round (little-endian) from everything I was used to.
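You can see the byte order for yourself with a couple of lines of C (a minimal sketch; it just dumps the bytes of a 32-bit value in memory order):

```c
#include <stdio.h>
#include <stdint.h>

/* Print the bytes of a 32-bit value in memory order. On a
   little-endian machine (x86/Intel) this prints 78 56 34 12;
   on a big-endian machine it would print 12 34 56 78. */
int main(void)
{
    uint32_t value = 0x12345678;
    const unsigned char *bytes = (const unsigned char *)&value;
    for (size_t i = 0; i < sizeof value; i++)
        printf("%02X ", bytes[i]);
    putchar('\n');
    return 0;
}
```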
Yay for them, I guess. I'm using one right now. :)
By the way, all I see is a blond, brunette, redhead... (SPOILER ALERT: and I will murder you in your sleep so I can eat steak!)