
Sidewinder: I'm seriously wondering how many people think that $1300 monitors are commonplace enough to be indignant about lack of support. I personally can't justify that when I put my system together for a paltry $900 (and I get 40+ frames on W2 ultra details).
Sure, let those with high end systems rot! How dare they create such a system... Let it be CD Projekt's official slogan: 'Quality games for 22" users!'
kendrix: @thachmai
I don't see why you're so fired up about such a minor issue. Not to mention that one HD6970 isn't enough for a 2560x1440 resolution; at least not if you want decent FPS with high or max settings. I should know, because I also have 1x HD6970, with a Phenom II X6 and 16 GB DDR3 (and have no problems playing the game). The only downside I see is the fact that TW2 only recognizes 8 GB of RAM (but I never expected it to use the whole amount in the first place).
I wouldn't play it at that resolution even if I had a 30" monitor; I would rather play it at a lower res with high visual settings, because it will look and run better.
Also, Blackbeard mentioned a temporary fix for this. The truth is, this issue applies to many games; most of them, however, also come with a bucketload of other issues on release day, unlike TW2, which is pretty stable. One other example is DA2, which for me always started in windowed mode (no matter what settings I applied). I had to open the video options and toggle windowed mode for the game to go to fullscreen (every time I launched the game). The issue still applies to this day. Was it kind of annoying? Sure. But it wasn't that much of a big deal either.
Oh, and you're pretty "quick" if you decide to waste time reinstalling the game before trying out other graphics settings (which would've taken considerably less time).
Sure. Not being able to launch a game is a "minor issue". I'm really curious what your "important issue" list looks like!
Post edited May 18, 2011 by thachmai
Funny how this was never considered a real problem when a high-end res was 1024x768 and games launched with only 640x480. Games are historically "behind the curve" in relation to resolutions because, honestly, people who have $1300 to blow on a monitor are maybe 1% of the gaming population. Get over yourselves. A large number of people have issues coming up with $300 for a decent video card, which is a far more vital component. Whether you make obscene amounts of cash or are a neckbearded momma's-basement dweller doesn't matter, although this kind of attitude definitely seems to be more commonplace among the neckbeards.


I've had my 30 inch Dell monitor for 4 years now, and I run my games at 1680x1050, 1920x1200 or 2560x1600 depending on what my box can handle. My desktop has always been at 2560x1600.

To not be able to launch a game because the desktop res is set too high makes no sense.
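For what it's worth, here is a purely hypothetical sketch in C (emphatically NOT CD Projekt's actual code) of how this class of bug tends to happen: if the engine sizes some startup surface from the current desktop mode but clamps it against a hard-coded maximum, any desktop above the cap fails before the game ever reaches its own settings. That would match the symptom, and it would also explain why lowering the desktop resolution first lets the game start.

```c
/* Purely hypothetical illustration -- NOT the actual Witcher 2 code.
 * A startup surface sized from the desktop mode, clamped by a
 * hard-coded maximum, fails for any desktop larger than the cap. */
#include <stdio.h>
#include <stdlib.h>

#define MAX_W 1920  /* hypothetical hard-coded assumption */
#define MAX_H 1200

static void *alloc_startup_surface(int desk_w, int desk_h)
{
    if (desk_w > MAX_W || desk_h > MAX_H)
        return NULL;  /* BUG: a 2560x1600 desktop bails out here */
    return malloc((size_t)desk_w * (size_t)desk_h * 4);  /* 32-bit RGBA */
}

int main(void)
{
    /* With the desktop at 2560x1600, the game never gets far enough
     * to apply the resolution the user actually picked. */
    if (alloc_startup_surface(2560, 1600) == NULL) {
        fprintf(stderr, "failed to create startup surface\n");
        return 1;
    }
    return 0;
}
```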
And btw, I have a middle of the road puter, with an ancient mobo, core2duo E6600, 6gb ram, HD6870; and am running the game very happily at 2560x1600, with ubersampling off and DOF off (cos I don't like it). Rest is at max. Playing the prologue and all is peachy.

Sidewinder, your comments make no sense: when games were "limited" to 640x480 (I presume you're talking about the good ol' 2D days, when scalability was not an option), my desktop res would usually be 1024x768 or 1152x864. Games would launch.

Secondly: I bought my expensive monitor years ago with the sweat of my brow. I work 50 to 60 hours a week, do night shifts, and put up with a whole load of crap at work, and I buy things I want. Thanks for letting me use them properly.
Lol at the OP blaming a vendor for a game's technical shortcomings.
@Sidewinder: Your argument is rather nonsensical and dizzying. The fact stands that there are some Witcher fans who actually own iMacs or other high-end monitors that can do 2560x1440. Every game I have played on my Apple display has worked PERFECTLY @ 2560x1440, INCLUDING The Witcher 1.

Fans have every right to inquire about an issue that was FINE in The Witcher 1 yet is bizarrely BROKEN in the sequel. And what does the price of component X or Y have to do with bugs in the game? If a bug ONLY affected, say, an AMD Radeon 6990 (which is expensive, and I'm sure many people can't afford one), should the bug just be ignored because a certain percentage of gamers don't own that exact card, lol?
Post edited May 18, 2011 by Ashok0
Couldn't agree more.
I've already responded to someone else on this thread for having the same attitude displayed by Sidewinder. Rather bizarre that a gamer gets criticized because they happen to have a high-end monitor. A large monitor with a good resolution makes a huge difference in improving the gaming experience.

And you are completely right about Witcher 1: I've had no problems at all running it on my monitor. Strange that this problem should crop up with Witcher 2.
It's one thing to point out a bug; it's another to go on a tirade about how horrific it is that it got missed, when this issue affects a very small subset of the PC gamer base. And I'm sorry for pointing out that it makes the people doing it look like they have more money than sense.

It worked in The Witcher 1 because they utilized an engine that already supported it. The Witcher 2 is a completely new engine, and issues like that are to be expected; those resolutions are used by a very small portion of the gaming community, and it's not surprising this got overlooked.

Post edited May 18, 2011 by Sidewinder
@Sidewinder: Sorry, but the only thing I get from your posts is that you are jealous that other people have money to spend on big monitors. I see no other reason for you to keep mentioning money in your posts. You even did it in this one by saying those complaining about this issue have 'more money than sense'.

I think it is safe to say that this problem will be fixed, though it does seem rather strange that this bug got through the testing process. After all, when I reduce my desktop resolution I am then able to run the game fine at 2560x1600, so the game itself can clearly handle that resolution.
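For anyone hitting this in the meantime, here is a minimal sketch of that workaround as a tiny Win32 launcher in C: it temporarily drops the desktop to a mode the game accepts, starts the game, and restores the original desktop when the game exits. This is an assumption-laden sketch, not an official fix: it is Windows-only, "witcher2.exe" is a placeholder for the full path to your install, and 1920x1200 is just an example mode.

```c
/* Minimal sketch of the workaround: temporarily lower the desktop
 * resolution, launch the game, then restore the original mode.
 * "witcher2.exe" is a placeholder -- point it at your install. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    STARTUPINFO si;
    PROCESS_INFORMATION pi;
    TCHAR cmd[] = TEXT("witcher2.exe");  /* placeholder path */

    ZeroMemory(&dm, sizeof dm);
    dm.dmSize       = sizeof dm;
    dm.dmPelsWidth  = 1920;              /* any mode the game accepts */
    dm.dmPelsHeight = 1200;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    /* CDS_FULLSCREEN makes the change temporary (not saved to registry). */
    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "could not switch desktop mode\n");
        return 1;
    }

    ZeroMemory(&si, sizeof si);
    si.cb = sizeof si;
    if (CreateProcess(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi)) {
        WaitForSingleObject(pi.hProcess, INFINITE);  /* wait until the game quits */
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }

    ChangeDisplaySettings(NULL, 0);  /* restore the original desktop mode */
    return 0;
}
```

Run the launcher instead of the game's own shortcut; because the mode change is temporary, the desktop snaps back to 2560x1600 automatically when you quit.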

Also, I've seen a lot of posts on this issue at various sites on the web. I think you are mistaken about this issue only affecting a very small portion of the gaming community.
I always find it funny when people too cheap or too poor to upgrade to the latest technology fling mud at the people who have upgraded, as if they shouldn't have. This technology (high-res monitors) has been out for years now; more people should have it. Having a 24-inch monitor or bigger makes gaming amazing.
No one says it's not great to game at high resolutions (I also do that). High-res monitors are relatively cheap nowadays as well (especially 24" or 27").
However, the person who opened the thread got all bent out of shape, missed the whole issue, then exaggerated it (at the beginning he actually believed that the game wouldn't run on that video card), and blamed GOG for something they were not responsible for (they don't test games on all possible hardware; that's the job of the QA team at the studio that developed the game).
And this issue isn't that big of a deal either. The game runs at 2560x1440; it just won't start in that resolution. Only a small workaround is necessary here, and anyone with half a brain can figure it out.
Is it a slight annoyance? Sure. Is it worth blowing out of proportion? Not really. And I'm pretty sure the issue will be resolved pretty soon.
This issue has been marked as "Resolved", but the insults don't seem to stop.


No matter how you twist it, not being able to launch a game is the worst possible bug. I'm pretty sure the bug itself is incredibly silly and easy to fix. But it's really annoying for those affected.

The attitude that it's OK for a PC game to fail to launch, and that the user has to use their "superior intelligence" to solve the problem themselves, is disturbing. I know some PC gamers have a superiority complex, but this tops it for me.


To those who think it's not OK to point out a serious bug in an otherwise excellent game: maybe you should go to work for CD Projekt! They obviously need your superior intelligence. After all, they failed to detect this nasty little bug in the first place.
Many problems get marked as solved when they're not actually solved, like Outcast, for example.

I have a desktop resolution of just 1280x1024. I have tried playing the game at everything from that down to 1024x768 and cannot get fullscreen. Prior to the 1.1 patch I did get fullscreen at 1280x1024, so I believe it's something to do with the 1.1 patch!
Post edited May 28, 2011 by UK_John