Since you guys (CDPR) ignore my more "private" messages, I'll ask right here. ;)

1. What exactly was Witcher 3's budget? Early rumours put it at about 15M USD for development and 20M for marketing; now it's 32M and 35M respectively.

2. Why did you state an i7 4790K, 16 GB of RAM and a GTX 980 as required for ultra settings, while your game uses under 20% of CPU power and around 3 GB of RAM? I don't have a GTX 980 but an R9 290X, and even a single GPU isn't loaded to a stable 100%, and its power consumption sits at about 60% of maximum. Moreover, Witcher 3 is the only game I have that performs worse with Crossfire enabled: 35 FPS on average vs 45 with a single GPU.

No need to send me back under the bridge; I'm not a hater, and I'm not implying a downgrade or bad optimization. I'm merely interested in getting answers to those two questions. /grin
P.S. The proper name for Witcher 3 should be "Witcher 3: Ciri Gone Wild". :p
P.P.S. Why so serious?
Damn, I didn't know the marketing budget was so high ><

I also wondered about the CPU requirement. I was a bit worried, but in the end it's not so CPU intensive.
(Besides, do we know how many threads the game uses? I can't find this information.)
RudyLis: Since you guys (CDPR) ignore my more "private" messages, I'll ask right here. ;)

1. What exactly was Witcher 3's budget? Early rumours put it at about 15M USD for development and 20M for marketing; now it's 32M and 35M respectively.

2. Why did you state an i7 4790K, 16 GB of RAM and a GTX 980 as required for ultra settings, while your game uses under 20% of CPU power and around 3 GB of RAM? I don't have a GTX 980 but an R9 290X, and even a single GPU isn't loaded to a stable 100%, and its power consumption sits at about 60% of maximum. Moreover, Witcher 3 is the only game I have that performs worse with Crossfire enabled: 35 FPS on average vs 45 with a single GPU.

No need to send me back under the bridge; I'm not a hater, and I'm not implying a downgrade or bad optimization. I'm merely interested in getting answers to those two questions. /grin
P.S. The proper name for Witcher 3 should be "Witcher 3: Ciri Gone Wild". :p
P.P.S. Why so serious?
Unless you are authorized to audit CDPR, I see no reason for them to give you an answer to your first question.

An answer to all or part of question 2 would be interesting. In the past, AMD/ATI performance has suffered in some games because the developers did not share the graphics code with AMD/ATI; could that be the case here?
RudyLis: Since you guys (CDPR) ignore my more "private" messages, I'll ask right here. ;)

1. What exactly was Witcher 3's budget? Early rumours put it at about 15M USD for development and 20M for marketing; now it's 32M and 35M respectively.

2. Why did you state an i7 4790K, 16 GB of RAM and a GTX 980 as required for ultra settings, while your game uses under 20% of CPU power and around 3 GB of RAM? I don't have a GTX 980 but an R9 290X, and even a single GPU isn't loaded to a stable 100%, and its power consumption sits at about 60% of maximum. Moreover, Witcher 3 is the only game I have that performs worse with Crossfire enabled: 35 FPS on average vs 45 with a single GPU.

No need to send me back under the bridge; I'm not a hater, and I'm not implying a downgrade or bad optimization. I'm merely interested in getting answers to those two questions. /grin
P.S. The proper name for Witcher 3 should be "Witcher 3: Ciri Gone Wild". :p
P.P.S. Why so serious?
Welcome to the world of games envisioned for PC but dumbed down technically for console sales. It was a necessary evil in order to sell as many units as they did, so we as PC owners have to live with it.

No, this is not a debate about which platform is better. If I were them, I would have done the same thing, because the sales numbers don't lie.
Glocon: Damn, I didn't know the marketing budget was so high ><
That surprises me as well. Well, not really: given the vast coverage and media noise Witcher 3 got, I was expecting something like that.
Glocon: I also wondered about the CPU requirement. I was a bit worried, but in the end it's not so CPU intensive.
(Besides, do we know how many threads the game uses? I can't find this information.)
Judging by my monitoring software, four.
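If anyone wants to check this themselves rather than trusting my monitoring tool, here's a minimal sketch using Python and the psutil library; the process name is an assumption, so adjust it to whatever your task manager actually shows. Keep in mind the OS thread count includes audio/driver helper threads, so it can read higher than the number of cores the game really loads.

    # Count the threads of a running game process with psutil.
    # "witcher3.exe" is an assumed process name - check your task manager for the real one.
    import psutil

    TARGET = "witcher3.exe"  # assumption: adjust to the actual executable name

    for proc in psutil.process_iter(["name", "num_threads"]):
        if proc.info["name"] and proc.info["name"].lower() == TARGET:
            print(f"{TARGET}: {proc.info['num_threads']} threads")
            break
    else:
        print(f"{TARGET} not found - is the game running?")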
JoeAboveAverage: Unless you are authorized to audit CDPR, I see no reason for them to give you an answer to your first question.
But of course. There were apparently official budget figures for Witcher 1 and 2, both around 5M USD, so even if the lower numbers are correct, it means the open world cost at least three times more for development alone. Plus marketing. I don't really blame CDPR for how they allocated their funds, but I do think some of the marketing money could have been used to make the game a little bit better. /grin Marketing may provide good sales, but it won't make any game better.
Also, if the lower numbers are right and the total cost of W3 is 35M, then at $10 per copy sold they should already have broken even, given the recent news of 4M copies sold.
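Just to spell that arithmetic out (using the rumoured 35M total and the $10-per-copy guess from this thread, neither of which is an official figure):

    # Rough break-even check using the figures quoted in this thread (rumoured, not official).
    total_cost = 35_000_000      # assumed total budget in USD (lower rumoured figure)
    revenue_per_copy = 10        # assumed net revenue per copy in USD
    copies_sold = 4_000_000      # reported sales at the time

    break_even_copies = total_cost / revenue_per_copy
    print(f"Break-even at {break_even_copies:,.0f} copies sold")
    print(f"Estimated net revenue so far: ${copies_sold * revenue_per_copy:,}")
    # -> Break-even at 3,500,000 copies sold; estimated net revenue so far: $40,000,000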
JoeAboveAverage: An answer to all or part of question 2 would be interesting. In the past, AMD/ATI performance has suffered in some games because the developers did not share the graphics code with AMD/ATI; could that be the case here?
That's the reason I asked this question. :) To be honest, my biggest surprise was the drop in Crossfire performance. I've seen small boosts, I've seen various artefacts (flickering is present in Witcher 3 and quite annoying) and microstuttering (which also appeared after 1.05), but I'd never seen an actual reduction in FPS. So W3 really is a unique game. /grin
As I don't have any Nvidia hardware around, I can't compare performance on my rig by swapping only the GPU, but excluding the freeze-fest that is HairWorks, all other settings have very little to no impact on FPS, 1-2 at most. HBAO, for example, doesn't affect FPS at all, if I'm to believe Fraps and the Steam FPS counter.
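For what it's worth, eyeballing an FPS overlay is rough; if you let Fraps write its benchmark frametimes log, you can average it properly. A minimal sketch, assuming the usual layout of a header row followed by "frame number, cumulative time in milliseconds"; the file name is just a placeholder:

    # Compute average FPS from a Fraps "frametimes" CSV.
    # Assumes a header row, then rows of "frame, cumulative time in ms".
    # "benchmark frametimes.csv" is a placeholder - use your actual log file.
    import csv

    def average_fps(path):
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)                                # skip the header row
            times_ms = [float(row[1]) for row in reader if len(row) >= 2]
        frames = len(times_ms) - 1                      # intervals between logged frames
        elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
        return frames / elapsed_s

    print(f"Average FPS: {average_fps('benchmark frametimes.csv'):.1f}")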
I'm not sure whether we have a GTA V-like case, where the "next-gen" console graphics settings were fine but going just a notch above them sent performance sideways, even though going from ultra to very high generally adds 5-10 FPS in Witcher 3. But it could be a case where the game was optimized for console hardware and everything beyond that was not. Given the lack of pre-release news on keyboard and mouse support, and the rather questionable post-release GUI and control settings, I really wouldn't be surprised. What can we do; the PC Master Race is no longer CDPR's favourite wife, we no longer bring in enough money. :D
RudyLis: Since you guys (CDPR) ignore my more "private" messages, I'll ask right here. ;)

1. What exactly was Witcher 3's budget? Early rumours put it at about 15M USD for development and 20M for marketing; now it's 32M and 35M respectively.
Rumors are rumors...a pity that some people really do believe everything they read, it seems...;) You admit you know your question is pure speculation & rumor--why would you expect them to take it seriously?
2. Why did you state an i7 4790K, 16 GB of RAM and a GTX 980 as required for ultra settings, while your game uses under 20% of CPU power and around 3 GB of RAM? I don't have a GTX 980 but an R9 290X, and even a single GPU isn't loaded to a stable 100%, and its power consumption sits at about 60% of maximum. Moreover, Witcher 3 is the only game I have that performs worse with Crossfire enabled: 35 FPS on average vs 45 with a single GPU.
Game runs buttery smooth for me @ 1920x1200 with a 2GB HD 7850 & an AMD FX-8320E...(too bad you guys with the more expensive hardware are having such a hard time with things, if you actually are, that is)...almost forgot, that's with all settings maxed out...Guru3D did a memory check on the game shortly after it shipped and discovered that @ 4K resolution and maxed settings the game didn't top even 2 gigs of VRAM consumption. This is a very well-optimized game.

With D3d12 developer support beginning to mature in the next year, expect to see a lot more of this--older, cheaper hardware is suddenly going to get "new life" because D3d12 is far, far more efficient and delivers a lot more performance-per-watt than D3d 11 & earlier. Similar changes are coming to OpenGL 5.x...
Dallaen: Welcome to the world of games envisioned for PC but dumbed down technically for console sales. It was a necessary evil in order to sell as many units as they did, so we as PC owners have to live with it.

No, this is not a debate about which platform is better. If I were them, I would have done the same thing, because the sales numbers don't lie.
Sales numbers do lie when there is... how should I put it, a "less honest intermediary" between you and your game. Or when you have to report to your stockholders and fake the reports.
Debates about platform superiority aren't needed, because PC is the best one - it's more versatile, and games are made on PCs in the first place. So it's a tool everyone uses, but few people pay it the respect this platform deserves, at least in the form of proper GUI/controls adaptation.
waltc: Rumors are rumors...a pity that some people really do believe everything they read, it seems...;) You admit you know your question is pure speculation & rumor--why would you expect them to take it seriously?
Does the text anywhere around my nickname say "Commander Shepard"? No, because I'm not him. I'm not saving the galaxy here; we're just talking. ;p
On a more serious note, I called those figures "rumours" because I don't remember exactly where I read them. There's enough data I have to remember for my work (no seqweet sqweewwel, just boring statistical numbers), so I prefer not to overload my memory.
IIRC the 5M USD budgets of W1/W2 were mentioned in several sources, including rather respectable ones (not Polygon, no /grin), and the W3 numbers were derived from, I think, some Polish source that mentioned a 200M zloty budget, which roughly equals 50M USD (around 54 at the current rate).
So if you expect me to name a particular famous Polish games journalist, Grzegorz Brzeczyszczykiewicz, I have to disappoint you: I can't. ;)
waltc: Game runs buttery smooth for me @ 1920x1200 with a 2GB HD 7850 & an AMD FX-8320E...(too bad you guys with the more expensive hardware are having such a hard time with things, if you actually are, that is)...almost forgot, that's with all settings maxed out...Guru3D did a memory check on the game shortly after it shipped and discovered that @ 4K resolution and maxed settings the game didn't top even 2 gigs of VRAM consumption. This is a very well-optimized game.
That's what I find hilarious! And trust me, I do have the problems I listed; I don't suffer from (or enjoy) attention deficit disorder, and I have no need to draw attention. To rephrase one famous movie quote, "I see fuckwit people", so I don't mind a certain degree of solitude. /grin Judging by the amount of free resources this game leaves on my rig, it feels like it's "suffocating from possibilities". :D So as much as I'd like to agree with your statement that this game is well-optimized, I sadly cannot.
waltc: With D3d12 developer support beginning to mature in the next year, expect to see a lot more of this--older, cheaper hardware is suddenly going to get "new life" because D3d12 is far, far more efficient and delivers a lot more performance-per-watt than D3d 11 & earlier. Similar changes are coming to OpenGL 5.x...
Yeah, and we will ride into this glorious future on white unicorns, shitting banana ice cream along the way. I'm not sure hardware manufacturers would suddenly approve of an instant loss of new hardware sales.
waltc: With D3d12 developer support beginning to mature in the next year, expect to see a lot more of this--older, cheaper hardware is suddenly going to get "new life" because D3d12 is far, far more efficient and delivers a lot more performance-per-watt than D3d 11 & earlier. Similar changes are coming to OpenGL 5.x...
Not entirely true, because older hardware is not DX 12 compatible.
waltc: With D3d12 developer support beginning to mature in the next year, expect to see a lot more of this--older, cheaper hardware is suddenly going to get "new life" because D3d12 is far, far more efficient and delivers a lot more performance-per-watt than D3d 11 & earlier. Similar changes are coming to OpenGL 5.x...
TBreaker20: Not entirely true, because older hardware is not DX 12 compatible.
Surprisingly, it is...at least with certain features of D3d12--which are designed to enhance the performance of existing hardware, not just new stuff. Delivering far more draw calls, for instance, is a feature of D3d12 that already runs on older hardware, and it's a huge performance improvement over same-hardware D3d11. Or asynchronous Crossfire memory addressing: two 4GB cards running together provide only 4GB of memory under D3d11, while under D3d12 they can provide 8GB of addressable VRAM. Not to mention that in D3d12 those same two Crossfire/SLI cards will be recognized by the API itself--which is brand-new to D3d support in general. It's not so much the older hardware that is of concern--it's how well the IHVs, nVidia and AMD, will address those new capabilities in their *drivers*--and we'll see, starting pretty soon, I would guess...
What you guys should consider is:

FOOD FOR THOUGHT... How come this game had such a small budget, yet it is two times bigger than GTA V + DESTINY together in ALL SENSES available?

Both together totalled 860 million dollars. Does that make you think? Haven't we swallowed enough lies from other companies?
Burn rate.
Actually, a better question would be: CDPR, did you know about Nvidia gimping the 700-series cards with Witcher 3 to force customers into buying the 900 series? After all, they collaborated with Nvidia when making the game.
RudyLis: Since you guys (CDPR) ignore my more "private" messages, I'll ask right here. ;)

1. What exactly was Witcher 3's budget? Early rumours put it at about 15M USD for development and 20M for marketing; now it's 32M and 35M respectively.

2. Why did you state an i7 4790K, 16 GB of RAM and a GTX 980 as required for ultra settings, while your game uses under 20% of CPU power and around 3 GB of RAM? I don't have a GTX 980 but an R9 290X, and even a single GPU isn't loaded to a stable 100%, and its power consumption sits at about 60% of maximum. Moreover, Witcher 3 is the only game I have that performs worse with Crossfire enabled: 35 FPS on average vs 45 with a single GPU.

No need to send me back under the bridge; I'm not a hater, and I'm not implying a downgrade or bad optimization. I'm merely interested in getting answers to those two questions. /grin
P.S. The proper name for Witcher 3 should be "Witcher 3: Ciri Gone Wild". :p
P.P.S. Why so serious?
1) They already said they'd release numbers in August.

2) AMD has already stated that the performance drop with Crossfire and AA enabled together is a bug. Turn off AA.

2a) Requirements: Game development started years ago. Like all devs, they shoot for what they think hardware will be capable of a couple of years from then, and adjust over and over as the actual release date gets closer. Those specs are pretty accurate. I load all 8 "cores" of my i7 to about 30-40% on average. What a shit system it would be if they were loaded to 100%. What are you even basing those 20% CPU and 3 GB of RAM numbers on, anyway? A few minutes into some benchmark? That's not an accurate test (a rough way to sample it properly is sketched at the end of this post). More and more assets get stored as you go, up to a point (there is garbage collection).

This is an Nvidia GameWorks title. AMD relies on having access to the engine's source code to do the majority of its optimizations there, and it being a GameWorks title does make that process more difficult. AMD's driver team has less funding to work with, and there's far less you can achieve in the driver than in the engine. It is what it is.
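On that "not an accurate test" point: a single snapshot a few minutes in tells you very little. Here's a minimal sketch of sampling per-core load and the game's memory use over a longer session with Python and psutil; the process name, the one-second interval and the ten-minute window are all assumptions you should adjust.

    # Sample per-core CPU load and the game's memory use over a play session.
    # "witcher3.exe", the 1-second interval and 10-minute duration are assumptions.
    import time
    import psutil

    TARGET = "witcher3.exe"
    DURATION_S = 600          # sample for ten minutes of actual play

    game = next(p for p in psutil.process_iter(["name"])
                if p.info["name"] and p.info["name"].lower() == TARGET)

    core_samples, rss_samples = [], []
    end = time.time() + DURATION_S
    while time.time() < end:
        core_samples.append(psutil.cpu_percent(interval=1.0, percpu=True))
        rss_samples.append(game.memory_info().rss)

    per_core_avg = [sum(c) / len(core_samples) for c in zip(*core_samples)]
    print("Average load per core (%):", [round(v, 1) for v in per_core_avg])
    print("Peak RAM use (GB):", round(max(rss_samples) / 2**30, 2))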
stevezy: Actually, a better question would be: CDPR, did you know about Nvidia gimping the 700-series cards with Witcher 3 to force customers into buying the 900 series? After all, they collaborated with Nvidia when making the game.
In my opinion, that's just ridiculous and stems from a lack of information. The game uses a lot of tessellation. Everywhere: from bricks and rocks jutting out of the sides of buildings and walls, to its use in GameWorks effects.

The 900 series is roughly 3x better at tessellation. Of course, if you try to use heavy tessellation on the 700 series, there will be an obvious drop in performance. It's math.
CannedPlayer: What you guys should consider is:

FOOD FOR THOUGHT... How come this game had such a small budget, yet it is two times bigger than GTA V + DESTINY together in ALL SENSES available?

Both together totalled 860 million dollars. Does that make you think? Haven't we swallowed enough lies from other companies?
The other companies have way more mouths to feed who apparently do less work per day; maybe they're overpaid, maybe they pay a shit ton of money to lease their office space (Rockstar's is in New York City: location, location, location).
TBreaker20: Not entirely true, because older hardware is not DX 12 compatible.
waltc: Surprisingly, it is...at least with certain features of D3d12--which are designed to enhance the performance of existing hardware, not just new stuff. Delivering far more draw calls, for instance, is a feature of D3d12 that already runs on older hardware, and it's a huge performance improvement over same-hardware D3d11. Or asynchronous Crossfire memory addressing: two 4GB cards running together provide only 4GB of memory under D3d11, while under D3d12 they can provide 8GB of addressable VRAM. Not to mention that in D3d12 those same two Crossfire/SLI cards will be recognized by the API itself--which is brand-new to D3d support in general. It's not so much the older hardware that is of concern--it's how well the IHVs, nVidia and AMD, will address those new capabilities in their *drivers*--and we'll see, starting pretty soon, I would guess...
With DX12, I would expect to see some really cool things. For instance, if you have an AMD APU, it might be tasked with rendering only your character, or maybe the small units on a battlefield, while the graphics card handles the rest. We're going to see some really cool stuff down the road.

I really want to get one of their new AMD APUs and build a nice little system around it.
waltc: With D3d12 developer support beginning to mature in the next year, expect to see a lot more of this--older, cheaper hardware is suddenly going to get "new life" because D3d12 is far, far more efficient and delivers a lot more performance-per-watt than D3d 11 & earlier. Similar changes are coming to OpenGL 5.x...
TBreaker20: Not entirely true, because older hardware is not DX 12 compatible.
Stuff that comes out today is old, hehe, jk. Even the GTX 580 from 2011 is DX12 compatible, though not all features may be supported. We'll have to wait and see for the details.
Post edited June 16, 2015 by Clonazepam
Glocon: Damn, I didn't know the marketing budget was so high ><
A good amount of TV advertising in several countries, in addition to plenty of online advertising... that costs a bit.
Glocon: I also wondered about the CPU requirement. I was a bit worried, but in the end it's not so CPU intensive.
(Besides, do we know how many threads the game uses? I can't find this information.)
Maybe they overspecified to be on the safe side. It's hard to test every in-game situation, including battles with dozen(s) of opponents in different environments and so on.