byrc: You do remember the first Witcher, right? Crazy-ass load times, horrible sound glitches, stuttering, it was a mess. They fixed it. I bet you thought it ran like crud because they "future proofed it". So yeah, I get your point about how I can't assume my machine can run everything at max, but this isn't about that, it's about optimization and performance. I bet you're the same type of person who said the same thing when people had issues with DX11 on DA2. Guess what, that was BioWare's fault. Not our equipment.
Okay, sure, there probably are optimisations that can be made, but this isn't the same deal as with the Aurora engine... that was a clunky engine, shoehorned into rendering a game it wasn't really designed to cope with.
This is a dedicated, well-written engine. My point is, many people with an "above recommended spec" rig start ranting about how badly coded something is the moment a game is released with "Ultra" settings beyond their hardware's capability.
Everyone did with Crysis... and it's an insult to the programmers. Just because they provide you with the options to tweak, everyone seems to feel entitled to use them all.
What if they'd just capped everything at "Medium" and called that "Ultra"? It'd still be an amazing-looking game, everyone would be happy as Larry, running the game at 60fps with everything maxed out... but why not open up options for those who can make use of them?
Sure, there are sometimes performance-sapping bugs that need weeding out, and fixing them can often give a real performance boost. But in practice I find 90% of performance complaints come from people with capable-but-not-bleeding-edge computers, expecting to run on High or Ultra settings and having a moan when they can't. It's certainly a far more common phenomenon than game-breaking slowdown caused by bugs.