Here's why you shouldn't put any faith in DX12:
1. It's created and owned by M$.
2. Which means it only receives updates to keep it sharp whenever said company releases a new OS.
3. Which means DX will get paid updates every year as M$ pulls an annual churn-out on Windows.
4. This is shit, and we'd have to shell out $$$ every year to get the latest DX and so comply with M$'s tablet-clone UI fetishes.

MANTLE IS DA FUTUR!
Random_Coffee: Well, I need a better graphics card and a better power supply to run this, so I might go for the Xbox One version. If I do decide to buy better components, will my AMD A8-5600K at 3.6 GHz run this well? I thought the GPU took most of the load, so an 8-core AMD processor at 4 GHz sounds slightly overkill.
Depends on whether you're using the iGPU of your APU. If yes, then the chances of playing are slim. I don't know what the equivalent of the A8's iGPU is, but if you do, compare it with the GTX 660 in the Anandtech bench, and if there is only a small difference it'll probably run :D
The CPU decodes the zipped/compressed data for textures, cutscene videos, audio etc., and is being integrated more and more to work with the GPU these days.
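To give a rough idea of what that CPU-side work looks like, here is a minimal sketch of decompressing a packed asset before it gets handed to the GPU. It uses zlib purely as a stand-in codec, and the function name and sizes are my own illustration, not anything taken from an actual engine:

// Illustrative sketch only: zlib stands in for whatever codec a real engine uses.
// Build with something like: g++ asset_demo.cpp -lz
#include <cstdio>
#include <vector>
#include <zlib.h>

// Decompress a zlib-compressed asset blob on the CPU; the result is what
// would then be uploaded to the GPU (e.g. as texture data).
std::vector<unsigned char> decompress_asset(const std::vector<unsigned char>& packed,
                                            size_t unpacked_size)
{
    std::vector<unsigned char> out(unpacked_size);
    uLongf out_len = static_cast<uLongf>(out.size());
    int rc = uncompress(out.data(), &out_len, packed.data(),
                        static_cast<uLong>(packed.size()));
    if (rc != Z_OK) {                  // e.g. Z_BUF_ERROR, Z_MEM_ERROR
        std::fprintf(stderr, "asset decompression failed: %d\n", rc);
        out.clear();
    } else {
        out.resize(out_len);           // shrink to the actual decompressed size
    }
    return out;
}

The point is just that this kind of unpacking burns CPU cycles for every texture, video and audio chunk a game streams in, independent of how fast the GPU is.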
Post edited January 10, 2015 by Shadowstalker16
RadonGOG: Good image quality is really important to me as well, way more important than good graphics. I'd appreciate it if CDP would add an option to limit the game to 30 FPS, simply because The Witcher II was playable and even enjoyable at 30 FPS (which doesn't apply to many games at all).
That is the worst thing you can do to a game.
Shadowstalker16: Here's why you shouldn't put any faith in DX12:
1. It's created and owned by M$.
2. Which means it only receives updates to keep it sharp whenever said company releases a new OS.
3. Which means DX will get paid updates every year as M$ pulls an annual churn-out on Windows.
4. This is shit, and we'd have to shell out $$$ every year to get the latest DX and so comply with M$'s tablet-clone UI fetishes.

MANTLE IS DA FUTUR!
Not sure if cereal...

Let me guess, you are into game development, right?
Post edited January 10, 2015 by Elenarie
Random_Coffee: Well, I need a better graphics card and a better power supply to run this, so I might go for the Xbox One version. If I do decide to buy better components, will my AMD A8-5600K at 3.6 GHz run this well? I thought the GPU took most of the load, so an 8-core AMD processor at 4 GHz sounds slightly overkill.
Shadowstalker16: Depends on whether you're using the iGPU of your APU. If yes, then the chances of playing are slim. I don't know what the equivalent of the A8's iGPU is, but if you do, compare it with the GTX 660 in the Anandtech bench, and if there is only a small difference it'll probably run :D
The CPU decodes the zipped/compressed data for textures, cutscene videos, audio etc., and is being integrated more and more to work with the GPU these days.
I am not using the iGPU of the APU, I am using a Radeon 7850 with 1 GB of VRAM.

I didn't know that the CPU did that much of the work in games. Well, I will probably have to replace the CPU as well then to run The Witcher 3 at good settings. The Xbox probably has slightly better specs than my PC, so the Xbox version seems to be my best bet.
RadonGOG: Good image quality is really important to me as well, way more important than good graphics. I'd appreciate it if CDP would add an option to limit the game to 30 FPS, simply because The Witcher II was playable and even enjoyable at 30 FPS (which doesn't apply to many games at all).
Elenarie: That is the worst thing you can do to a game.
Could you please explain why? It's way better than having a VSynced framerate that bounces between 31 and 40. Of course there is a tech that wipes away this problem (Adaptive Sync), but nearly nobody owns a G-SYNC monitor right now! (And absolutely nobody owns a FreeSync monitor; they've only just been announced!)

Remember: I'm not talking about forcing that limiter on for everybody, I just want a perfectly compatible in-menu OPTION for this!
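For illustration, the cap I am asking for is conceptually just a sleep-to-target loop around the frame. Here is a rough sketch; the update_and_render() call, the fixed ~33.3 ms budget and the timing method are placeholders of mine, not how any real engine necessarily implements it:

// Rough sketch of an optional 30 FPS cap: do the frame's work, then sleep off
// whatever is left of the ~33.3 ms budget. Real engines use more precise pacing,
// but the idea is the same.
#include <chrono>
#include <thread>

void run_capped_loop(bool cap_to_30fps, int frames_to_run)
{
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(33333);   // ~1/30 s
    auto next_deadline = clock::now() + frame_budget;

    for (int i = 0; i < frames_to_run; ++i) {
        // update_and_render();   // placeholder for the actual game work

        if (cap_to_30fps) {
            std::this_thread::sleep_until(next_deadline);  // burn off spare frame time
            next_deadline += frame_budget;                 // keep a steady cadence
        }
    }
}

The appeal of a cap like this is a consistent frame time instead of a VSynced rate bouncing around between 31 and 40.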
Shadowstalker16: Depends on whether you're using the iGPU of your APU. If yes, then the chances of playing are slim. I don't know what the equivalent of the A8's iGPU is, but if you do, compare it with the GTX 660 in the Anandtech bench, and if there is only a small difference it'll probably run :D
The CPU decodes the zipped/compressed data for textures, cutscene videos, audio etc., and is being integrated more and more to work with the GPU these days.
Random_Coffee: I am not using the iGPU of the APU, I am using a Radeon 7850 with 1 GB of VRAM.

I didn't know that the CPU did that much of the work in games. Well, I will probably have to replace the CPU as well then to run The Witcher 3 at good settings. The Xbox probably has slightly better specs than my PC, so the Xbox version seems to be my best bet.
Do you already own an XBONE? If not, it wouldn't be wise to buy one when you could just buy some stronger hardware for your PC instead!
Post edited January 10, 2015 by RadonGOG
RadonGOG: Could you please explain why? It's way better than having a VSynced framerate that bounces between 31 and 40. Of course there is a tech that wipes away this problem (Adaptive Sync), but nearly nobody owns a G-SYNC monitor right now! (And absolutely nobody owns a FreeSync monitor; they've only just been announced!)

Remember: I'm not talking about forcing that limiter on for everybody, I just want a perfectly compatible in-menu OPTION for this!
Each frame missed is a bump in unresponsiveness and lagginess. It is a game, not a movie; the last thing you want is for the game to feel unresponsive. At 30 FPS each frame lasts about 33 ms versus roughly 17 ms at 60 FPS, so the worst-case wait between an input and the next displayed frame roughly doubles.

Everything else is just eye candy.
What I always wonder about with games like The Witcher 3 that are supposed to get a Linux version is why they are using DirectX. Wouldn't OpenGL be a better option, since it's made for cross-platform development whereas DirectX is a Windows-only thing, and then there wouldn't be a need for porting? I'm just a layperson in this, so bear with me.
Post edited January 10, 2015 by Matruchus
Matruchus: What I always wonder about with games like The Witcher 3 that are supposed to get a Linux version is why they are using DirectX. Wouldn't OpenGL be a better option, since it's made for cross-platform development whereas DirectX is a Windows-only thing, and then there wouldn't be a need for porting? I'm just a layperson in this, so bear with me.
Who told you that TW3 is getting a Linux version? Because it appeared for a brief time on a banner at the game's pre-order launch?

It has never been confirmed, so don't get your hopes up. I shudder to think what requirements this game would have on Linux if a wrapper job were ever attempted.
Matruchus: What I always wonder about with games like The Witcher 3 that are supposed to get a Linux version is why they are using DirectX. Wouldn't OpenGL be a better option, since it's made for cross-platform development whereas DirectX is a Windows-only thing, and then there wouldn't be a need for porting? I'm just a layperson in this, so bear with me.
silviucc: Who told you that TW3 is getting a Linux version? Because it appeared for a brief time on a banner at the game's pre-order launch?

It has never been confirmed, so don't get your hopes up. I shudder to think what requirements this game would have on Linux if a wrapper job were ever attempted.
My bad. I saw the announcement months ago on gamingonlinux. I rechecked the site just now and you are right.
It will be released for Windows only, so there won't be a version for Mac either. I've been following some devs that gave their games to Aspyr and Feral for their ports, and they're doing a good job; maybe CDPR should follow? Can't say I like the way The Witcher 2 was handled, but that may have been because it was their first attempt at Linux, so there may be hope yet.

I would very much like a native Witcher 3 on Linux.
Shadowstalker16: Depends on whether you're using the iGPU of your APU. If yes, then the chances of playing are slim. I don't know what the equivalent of the A8's iGPU is, but if you do, compare it with the GTX 660 in the Anandtech bench, and if there is only a small difference it'll probably run :D
The CPU decodes the zipped/compressed data for textures, cutscene videos, audio etc., and is being integrated more and more to work with the GPU these days.
Random_Coffee: I am not using the iGPU of the APU, I am using a Radeon 7850 with 1 GB of VRAM.

I didn't know that the CPU did that much of the work in games. Well, I will probably have to replace the CPU as well then to run The Witcher 3 at good settings. The Xbox probably has slightly better specs than my PC, so the Xbox version seems to be my best bet.
Lookie here: http://www.anandtech.com/bench/product/1076?vs=1039 Your HD 7850 and the GTX 660 are quite close in performance. I think you'd be able to play with that GPU. I have the R7 260X, which is slightly weaker than the 7850, and I'm confident I can run it at 768p medium.
Your CPU: http://www.anandtech.com/bench/product/676?vs=698 also seems quite alright, if only lacking in the multi-core performance department. It is more powerful than the minimum recommended: http://www.anandtech.com/bench/product/80?vs=676 but the Intel comparison throws up more confusion: http://www.anandtech.com/bench/product/288?vs=676
Assuming you trust CDPR enough to believe the Phenom is the minimum CPU, I'd say you have little to worry about. But before anything detailed, what resolution are you playing on?
RadonGOG: Good image quality is really important to me as well, way more important than good graphics. I'd appreciate it if CDP would add an option to limit the game to 30 FPS, simply because The Witcher II was playable and even enjoyable at 30 FPS (which doesn't apply to many games at all).
Elenarie: That is the worst thing you can do to a game.
Shadowstalker16: Here's why you shouldn't put any faith in DX12:
1. It's created and owned by M$.
2. Which means it only receives updates to keep it sharp whenever said company releases a new OS.
3. Which means DX will get paid updates every year as M$ pulls an annual churn-out on Windows.
4. This is shit, and we'd have to shell out $$$ every year to get the latest DX and so comply with M$'s tablet-clone UI fetishes.

MANTLE IS DA FUTUR!
Elenarie: Not sure if cereal...

Let me guess, you are into game development, right?
More into game critique :D
Post edited January 10, 2015 by Shadowstalker16
stg83: So if there is one game that in my opinion really justifies upgrading your PC, it is this one, and besides, if you do meet or exceed the required specs, I am positive that you would have no problems with any big games for at least another 5 years. It is an investment for the future IMO, but that is of course for the people who can spare that kind of money; otherwise folks can just wait a few years until the hardware and the game are much more affordable.
I don't disagree with you, but it is something that devs should think about, because high sys reqs do exclude certain people from playing and enjoying a game. I do agree that it is necessary to push technology forward, but only if it is worth it. Shadow of Mordor also had high sys reqs and I don't think that game was worth it in the end (though it was a good game). Crysis, on the other hand, was worth it, and yes, I also think TW3 is gonna be worth it, but it is still unfortunate that some people will not get to experience it for years to come.

I should add that I don't think my rig (i5 3500K, 8 GB RAM, GTX 760 OC) will have any issues running it, but I'm more sad on behalf of those who can't afford a high-end gaming rig.
Shadowstalker16: But before anything detailed, what resolution are you playing on?
I'm playing on 1680x1050.
Shadowstalker16: But before anything detailed, what resolution are you playing on?
Random_Coffee: I'm playing on 1680x1050.
Then I think the game should run just about as well as I'm expecting it to on mine at 768p: around ~40 fps at medium-low. It's hard to tell when the CPU and GPU recommendations are so wildly different for AMD and Intel+Nvidia. Don't forget those specs are probably for 1080p, and you have a setup that almost meets them, so I bet it'll run fine at lower resolutions. If you've OC'd anything, you can expect even better.
Post edited January 10, 2015 by Shadowstalker16
jepsen1977: *snip*
I certainly agree on the point that devs should think about optimizing the game for a wide variety of configurations, from low- to high-end PCs. Indeed, the PC requirements of games like The Evil Within and Shadow of Mordor were definitely ridiculous, along with those of a lot of other recent titles pretending to be next gen without any visible justification.
Post edited January 10, 2015 by stg83
jepsen1977: Lots of people will not be able to play this game and not all can afford to upgrade their PC.
stg83: That is one of the main reasons why consoles are still so popular and why many people prefer to play their games there. :)

I see many people saying that they don't want to upgrade their PC for just one game. But this is inevitable as the medium of video games continues to progress while getting bigger and (arguably) better in scope as well as ambition. The Witcher 3 is certainly one of those games that strives to be a benchmark for the new generation of action RPGs, and no, I am not going to count Dragon Age: Inquisition in that regard. Only time will tell whether The Witcher 3 will live up to the hype and expectations, but judging by the pedigree of CDPR, along with the time they are taking to make it the best possible experience, I am fairly excited as well as optimistic about it.

So if there is one game that in my opinion really justifies upgrading your PC, it is this one, and besides, if you do meet or exceed the required specs, I am positive that you would have no problems with any big games for at least another 5 years. It is an investment for the future IMO, but that is of course for the people who can spare that kind of money; otherwise folks can just wait a few years until the hardware and the game are much more affordable.
Indeed, many people are saying they won't upgrade their PC for one game. That's because everyone is an individual with their own unique financial situation and set of priorities in life, including regarding PC upgrades, and against that is weighed how strong their desire is for any particular entertainment product. Upgrading a PC is indeed inevitable if one wants to have a functional computer over time, because electro-mechanical wear and tear and/or electronic failure will eventually render peripherals or an entire system inoperable, and if one wants to continue having a functional system for any purpose (gaming, web browsing, etc.) then one will eventually have to upgrade or otherwise get new components (gifts, hand-me-downs, whatever). That definitely is inevitable.

Also, as new games come out over time, many will push the limits of the technology and thus demand more capable hardware. Those who wish to play these bleeding-edge games will have to update their hardware if their desire to play the game(s), and their ability to pay for/obtain new hardware, exceed their contentment to live on with what they have at the moment.

Everyone has their own situation and their own prerogative, however, and naturally people will individually choose what works best for themselves; they're "right" insofar as they themselves are concerned with their own situation.

I myself used a 2.8 GHz P4 Northwood and other similar systems (I have a bunch here) for over 10 years. Some of those systems were from 2002, some from 2004, and over time I got a few hard disk upgrades, some hand-me-down hardware and whatnot. It wasn't that I couldn't afford new hardware; it's that I am a frugal person who prefers to make do with what I have at the moment, either until it completely breaks down or until it no longer sufficiently meets my needs and a line is crossed that very strongly encourages me to spend new money. As an unintentional side benefit, it is also less harmful to the environment than throwing a new PC into landfill every 12-18 months. ;)

In Feb 2013 I spec'd out and built a new PC: reasonably priced, reasonably screaming, for gaming and other things. I wasn't intending to win any "my PC is better than yours" contests or anything like that, nor to break any world records for the highest amount spent on a PC gaming rig, but rather to be practical for my personal needs of gaming, development, various computer engineering projects, a number of virtual machines, etc. What I am usually looking for in a system is the best price-versus-performance point for each component in the system - the peak of the bell curve, if you will. From there, I may upgrade or downgrade a component depending on how important it is to me in the overall system for my perceived needs. I built this PC fully intending that it will still be running 10 years after it was built, still serving some purpose on my LAN, even if I manage to end up building a new primary desktop before 10 years' time (probably).

At the time I built the system, the AMD/ATI Radeon 7850 was the most highly recommended card on just about every GPU review website out there in the price-to-performance, a.k.a. "best value for the buck", category, and it was within my budget of $200 or less for a GPU, so that's what I bought. The card retained that title until 3 months ago or so (I'd have to hunt it down in the Steam stats again to get the exact timeframe, but I'm close enough). I'm incredibly happy with the system I built and find it is still incredibly powerful for anything I throw at it. Out of the 700-800 or so games I own, of those I've actually played on it I have only encountered about 3 that are sluggish with all 3D effects enabled at the highest resolution (my monitor is 2560x1600). For the games that are a little sluggish, I either drop the res down to 1920x1200, which is about half the pixels, and/or drop some of the 3D effects that make the least visual difference, and toy around until I get something acceptable. To date I've successfully got every game, new and old, to work very well this way without having to go super low-res or disable all the effects.

I've literally got thousands, if not tens of thousands, of hours of gameplay in my existing game collection, and all of the games I currently own will run smoothly, or at least I perceive they will, as per the above. Many of these games, such as Skyrim and Mount & Blade, are highly addictive and have lots of replay value. I could literally play the games I own right now for 10 years without ever buying or obtaining another game and most likely have a blast without ever playing a new "next gen" game - although that doesn't mean I don't want to play new games either - of course I do. :)

So yes, it is inevitable that a day will come when a game won't run on my existing hardware, and it is inevitable that some day I will have to upgrade components or build a new PC. Everyone decides how and when they will do that based on their own personal needs, desires and motivations, of course, and people differ greatly in all of those regards (and many others), so it stands to reason that people will decide to upgrade or not upgrade their PC with very different, and perfectly reasonable, rationales for their own situation.

A new game coming out, no matter how exciting it appears to be for me, is not my highest priority in life personally, and no single game is a factor that alone will convince me to spend hundreds of dollars on new hardware. That doesn't mean that one game won't ever make me do that either, but there are no rules at all here other than what the individual decides for themselves, even if it makes no sense to someone else.

I personally plan on buying a new video card when this one either goes completely dead on me or shows some sign of failure such as bad VRAM, a dead GPU fan that can't be replaced, or similar; or when some rather long period of time such as 5+ years passes and a new $200 GPU is out that is like 16 times faster or something and incredibly worth it to me in the price-versus-value category, in combination with there being a stockpile of games or other software I've accumulated on a wishlist over time that my existing PC was unable to handle.

Do I want to play The Witcher 3? Bigtime! Enough to spend $300-1000 or more on video hardware to push a 30" display at 2560x1600? Not a chance! I'm not upset about it or angry in any way. I'm disappointed, but that's natural for anyone anticipating the game in this situation, and it's also expected to happen at some point with some game. I'd just as soon it were a game coming out in another 2 years or something, but if it's TW3 then it is what it is. No biggie really. :)

In the meantime I'll be playing Skyrim, the Mount & Blade series, The Witcher 1 and 2, and a multitude of other awesome games on my 2-year-old "archaic" hardware, enjoying the crap out of them, and waiting out the long haul until I get to play TW3 myself. To be honest, I have incredible faith and trust in CDPR, and while the video requirements are disappointing, if that's what they are and it needs to be that way, then this game is most likely even more insanely awesome than previously considered, and ultimately that's a good thing. I'll just have to wait longer to play it, but that's OK too. Those with $600 video cards will get to enjoy the game to the fullest and poke fun at those of us with 2-year-old computers playing Skyrim. ;oP

Having said that, if there is a demo of the game released or some other way of testing it out on my hardware, or if I can find in the forums here or elsewhere someone playing the game on a 7850 with an identical CPU etc. setup and with an acceptable (to me personally) resolution, then I'll be first in line to rush out and buy it right away anyway.

On a final note... a better investment in the future is leaving my money in my investments to grow over the next 5 years rather than spending it on hardware that will be obsolete in another 2 years and need to be replaced again at that time. :) Different strokes and all that jazz... ;)

</book> ;o)