I got past the griffin using a GT 730 (slightly overclocked) and 4GB of RAM (everything on dirt-low settings, mind you). I do hope for better optimization though, because after that area the performance just tanks for me, and don't get me started on when it rains, ugh :(
avatar
baktanus: Whoa. I'm running it at high settings with 50-60 fps on my 750 Ti on a 1366x768 monitor. I would say this game is optimized. If I had a 780, I believe I could run it on ULTRA on a 1080p monitor.
Unfortunately not; I do have a 780.
I am running on medium @ 1080p and the game struggles to stay at 60 FPS. It usually sits between 45-60 FPS, though if I lock it to 30 frames it'll sit between 20-30 FPS.

For some reason the game seems to have issues maintaining a stable framerate on my setup. I'd love to see how it'd perform without the frame limit, but unfortunately I can't find that option.

Edit: Changing the settings to high doesn't seem to affect the framerate at all, though.
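If anyone wants to put actual numbers on framerate stability, one option is to export a frametime log (MSI Afterburner and Fraps can both record one) and crunch it with a few lines of Python. A minimal sketch, assuming a plain text file with one frame time in milliseconds per line (the file name and layout are placeholders for whatever your capture tool writes):

# sketch: summarize a frametime log -- one frame time in ms per line
# "frametimes.txt" and its layout are assumptions, adapt to your tool
with open("frametimes.txt") as f:
    times_ms = sorted(float(line) for line in f if line.strip())
avg_fps = 1000.0 / (sum(times_ms) / len(times_ms))
# "1% low": the FPS implied by the slowest 1% of frames
worst_1pct_ms = times_ms[int(len(times_ms) * 0.99)]
print(f"average: {avg_fps:.1f} fps, 1% low: {1000.0 / worst_1pct_ms:.1f} fps")

A large gap between the average and the 1% low is exactly that "unstable framerate" feeling, even when the average looks fine.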
Post edited May 19, 2015 by JakeTweet
avatar
JakeTweet: For some reason the game seems to have issues maintaining a stable framerate on my setup. I'd love to see how it'd perform without the frame limit, but unfortunately I can't find that option.

Edit: Changing the settings to high doesn't seem to affect the framerate at all, though.
The reason is that Nvidia didn't optimize the drivers for the older Kepler-based cards; here is an excerpt from a recent benchmark overview highlighting the issue:

"The GTX Titan and GTX 780 also perform quite poorly, falling behind their Radeon counterparts the 290X and 290 by nearly 50%. We see this trend with older graphics cards from Nvidia’s GTX 500 series and AMD’s HD 6000 series as well. Which indicates that Nvidia’s Game Ready drivers are only optimized for Maxwell based GTX 900 series products. And will need some serious optimization work to bring the Kepler based 600 and 700 series products up to par."

For more info here is the link to the full article: http://wccftech.com/witcher-3-initial-benchmarks/
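As for what "nearly 50%" means in practice: the article doesn't say which card is the baseline, but assuming the gap is measured against the Nvidia card, the arithmetic with made-up FPS figures (purely illustrative, not the article's numbers) looks like this:

# hypothetical numbers, just to show how such a percentage is computed
gtx_780_fps = 40.0
r9_290_fps = 58.0
gap = (r9_290_fps - gtx_780_fps) / gtx_780_fps
print(f"the 290 comes out {gap:.0%} ahead of the 780")  # prints 45%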

avatar
darthspudius: Man, this game is choking with my GTX 760 on Low. The only game I've had any problem with so far. So sad.
That's unfortunate. I hope Nvidia releases proper drivers for Kepler-based cards like the GTX 760 as well.
Post edited May 19, 2015 by stg83
The post I made on another thread:

With the settings and setup below I get 43-55 fps and the game looks great; however, it could do with a little more optimization (if there's room for it).

My setup:

Core i5 2400 CPU
EVGA Nvidia GTX 760
8GB Ram
Windows 7

Game settings - In Graphics: everything on Medium, textures set to High, HairWorks off, full screen 1680x1050, vsync off. The post-processing settings are listed below, with a config-file sketch after the list.

Post-processing:

Motion Blur - off
Blur - off
AA - on
Bloom - on
Sharpening - on
Ambient Occlusion - none
Depth of field - off
Chromatic Aberration - off
Vignetting - off
Light shafts - off
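If you'd rather flip these outside the game, they also live in a plain text file, user.settings, under Documents\The Witcher 3. Here's a minimal Python sketch for changing a value in it; the section and key names are my guesses for illustration only, so check your own file for the real names first and keep a backup:

# sketch: edit The Witcher 3's user.settings (an INI-style text file)
# WARNING: the section/key names below are illustrative guesses --
# verify them against your own file, and back the file up first
import configparser
from pathlib import Path
path = Path.home() / "Documents" / "The Witcher 3" / "user.settings"
cfg = configparser.ConfigParser()
cfg.optionxform = str  # keep the file's key capitalization intact
cfg.read(path)
cfg["Rendering"]["HairWorksLevel"] = "0"  # hypothetical key: HairWorks off
with open(path, "w") as f:
    cfg.write(f)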
Post edited May 19, 2015 by Ganni1987
avatar
stg83: The reason is that Nvidia didn't optimize the drivers for the older Kepler-based cards; here is an excerpt from a recent benchmark overview highlighting the issue:

"The GTX Titan and GTX 780 also perform quite poorly, falling behind their Radeon counterparts the 290X and 290 by nearly 50%. We see this trend with older graphics cards from Nvidia’s GTX 500 series and AMD’s HD 6000 series as well. Which indicates that Nvidia’s Game Ready drivers are only optimized for Maxwell based GTX 900 series products. And will need some serious optimization work to bring the Kepler based 600 and 700 series products up to par."

For more info here is the link to the full article: http://wccftech.com/witcher-3-initial-benchmarks/
Thank you for linking this. At least now I know the reason why I'm pissed.
I was wondering why the framerate was total crap (because really, in 2015, 40 fps is unacceptable crap) on my rig (which, btw, exactly matches the recommended requirements) while playing on medium settings. MEDIUM.
What a wonderful start!
avatar
lebedkirillseven: Hello, everybody!
I just want to quickly say that I am running a setup like this:

Intel Core i7 3770K
2x GeForce GTX 780 in SLI
16GB Corsair Vengeance RAM
The game is installed on an SSD

which is more than the recommended specs (a single GeForce GTX 770)

On 'ultra', the game runs at 20 fps on my machine, so I am playing with everything tuned down to 'low'.
And I mean, what the hell? Witcher 2 had bad optimisation, but not this bad.
I hope the developers will see this post and take note that they should maybe look into it.
And please refrain from posting messages like "Yo PC sucks, run quad-SLI Titans X".

Graphical downgrade is not a problem, the problem is when you have arms going out of your b****t.
I'm running it on a single GTX 960 and performance is quite good. Sounds like there's something wrong with SLI performance...? Does turning off HairWorks have a notable impact? Supposedly it doesn't work well on non-Maxwell GPUs.

Also, what resolution are you using? In theory your system should have plenty of power to run the game.
Post edited May 19, 2015 by CharlesGrey
avatar
JakeTweet: For some reason the game seems to have issues maintaining a stable framerate on my setup. I'd love to see how it'd perform without the frame limit, but unfortunately I can't find that option.

Edit: Changing the settings to high doesn't seem to affect the framerate at all, though.
avatar
stg83: The reason is that Nvidia didn't optimize the drivers for the older Kepler-based cards; here is an excerpt from a recent benchmark overview highlighting the issue:

"The GTX Titan and GTX 780 also perform quite poorly, falling behind their Radeon counterparts the 290X and 290 by nearly 50%. We see this trend with older graphics cards from Nvidia’s GTX 500 series and AMD’s HD 6000 series as well. Which indicates that Nvidia’s Game Ready drivers are only optimized for Maxwell based GTX 900 series products. And will need some serious optimization work to bring the Kepler based 600 and 700 series products up to par."

For more info here is the link to the full article: http://wccftech.com/witcher-3-initial-benchmarks/

avatar
darthspudius: Man, this game is choking with my GTX 760 on Low. The only game I've had any problem with so far. So sad.
avatar
stg83: That's unfortunate. I hope Nvidia releases proper drivers for Kepler-based cards like the GTX 760 as well.
I hope so too; this wasn't well thought out, by the looks of it.
Bad optimisation? No way! I thought I would be playing on medium or low settings, but no: I play it on ultra (only HairWorks is off, and the grass visibility range etc. and background characters are on high; I don't want to see my PC on fire :) ) and I get 60 fps everywhere.
My specs are:
Intel i7 4770K 3.5GHz
Gigabyte GTX 780 Ti OC (not a faulty/bad one)
16GB Kingston HyperX RAM
Game on HDD, not on SSD.

And why does everyone say the graphics are bad? This game is beautiful. Thanks CDP-R for doing a good job. :)

Edit: I changed background characters to ultra as a test, and still got 60 FPS. :)
Post edited May 19, 2015 by Shofixti1227
avatar
Potzato: People always assume that two cards (SLI, Crossfire...) are better than one in every possible scenario.
They don't realize that most of the time you just gain processing power (you can handle more/better effects) at the expense of PCIe bandwidth (less capacity to transfer textures). SLI/Crossfire is not the miracle solution GPU sellers would like you to believe it is.

Is it really worse on a single card?

(Oh, and there is a subforum for this kind of thread)
Yet I've looked at hundreds of benchmark tests and can't recall a single one where a machine with two cards didn't test faster for in-game framerates than an identical setup with one card.

Got any links?

ADDED: And yeah, I can imagine someone, somewhere, will find one or two examples, but I'm betting it's faster with two cards 95 to 99 percent of the time. Every game I've ever played, including a 2002 copy of NASCAR Racing 2002 (which was before SLI, wasn't it???), runs faster in SLI than without.
Post edited May 19, 2015 by OldFatGuy
Thanks for pointing that out, stg83. I hope Nvidia comes around and fixes this mess with the Kepler series; there's no way in hell a GTX Titan (Classic) should give only 2 frames more than a 960.

At least now I know the now 4-year-old Sandy Bridge CPUs can still pack a punch in today's high-end games.
So I did some messing around with the game last night, and it definitely has to be a problem with SLI. I have a single Gigabyte R9 280X OC and I could get 25-30 fps out of it with everything on ultra, sans HairWorks and motion blur. Of course it's not exactly butter smooth with these presets; I just wanted to see what's going on.

Right now I've set foliage distance and shadows to high from ultra, disabled AA, and managed to clock in 45-55 fps, depending on how much is going on. Still, I have to say this is pretty far from badly optimized if a single entry-level high-tier card can bring the pain like this :)

Oh right, the CPU is an i5-4570k, and I'm only using 8GB of DDR3 for RAM.
avatar
Potzato: People always assume that two cards (SLI, Crossfire...) are better than one in every possible scenario.
They don't realize that most of the time you just gain processing power (you can handle more/better effects) at the expense of PCIe bandwidth (less capacity to transfer textures). SLI/Crossfire is not the miracle solution GPU sellers would like you to believe it is.

Is it really worse on a single card?

(Oh, and there is a subforum for this kind of thread)
avatar
OldFatGuy: Yet I've looked at hundreds of benchmark tests and can't recall a single one where a machine with two cards didn't test faster for in-game framerates than an identical setup with one card.

Got any links?

ADDED: And yeah, I can imagine someone, somewhere, will find one or two examples, but I'm betting it's faster with two cards 95 to 99 percent of the time. Every game I've ever played, including a 2002 copy of NASCAR Racing 2002 (which was before SLI, wasn't it???), runs faster in SLI than without.
There are two things:
1) In practice, when you use two cards instead of one, you never get twice the performance, it's as simple as that. It's mostly due to bandwidth and synchronization.
2) Tests usually check the highest settings on a 'good motherboard', so the bandwidth/processing-power tradeoff is in favor of SLI. I don't have any links, but on the lowest settings (with big textures) you might see a loss of perf with SLI. Using SLI on a subpar motherboard can cause perf loss too.

Edit: to be clear, I didn't say SLI is a lie and worse than everything; I'm just saying there are limitations, and in my honest and very personal opinion it isn't worth the extra energy consumption.
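To put point 1 in numbers, here's a quick back-of-the-envelope scaling check (the FPS figures are made up purely for illustration):

# hypothetical single-card vs SLI figures, just to show that a second
# card buys you scaling well short of 2x
single_fps = 45.0
sli_fps = 72.0
scaling = sli_fps / single_fps   # 1.6x from adding the second card
per_card = scaling / 2.0         # ~80% effectiveness per card
print(f"scaling: {scaling:.2f}x, per-card efficiency: {per_card:.0%}")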

avatar
hunvagy: and it definitely has to be a problem with SLI.
Yes, I know, it's a bit of a shame, but the next driver/patch will get things sorted. As usual with high-profile games :-)
Post edited May 20, 2015 by Potzato
avatar
lebedkirillseven: Hello, everybody!
I just want to quickly say that I am running a setup like this:

Intel Core i7 3770K
2x GeForce GTX 780 in SLI
16GB Corsair Vengeance RAM
The game is installed on an SSD

which is more than the recommended specs (a single GeForce GTX 770)

On 'ultra', the game runs at 20 fps on my machine, so I am playing with everything tuned down to 'low'.
And I mean, what the hell? Witcher 2 had bad optimisation, but not this bad.
I hope the developers will see this post and take note that they should maybe look into it.
And please refrain from posting messages like "Yo PC sucks, run quad-SLI Titans X".

Graphical downgrade is not a problem, the problem is when you have arms going out of your b****t.
Did you download the drivers?
avatar
phaolo: Witcher 3 is badly optimized, it doesn't run on ultra on my toaster :\
you just have to use 2 toasters in an SLI connection
avatar
apehater: you just have to use 2 toasters in an SLI connection
Does this work with two Apple Watches too?