Lin545: Basically...

These are the in-game CPU ratings (table below).

This is the in-game GPU rating (table below).

Those are golden.

There are a lot of guys on the internet suggesting people pay $200-300 to get 10% more fps...

If buying new, you can get an FX-6350 from AMD, or anything from Sandy Bridge and up from Intel, although the Intel route costs more.

Or take the used path: overclock your existing CPU (a Phenom II X4 is pretty good), or get a Westmere-based Intel CPU (i7 9xx/Xeon 56xx; see Wikipedia). Those sell for $50-80 (they were $1000 just a few years ago) and can be overclocked to match current CPU performance.

The same applies to GPUs: some of them are relatively old but still very powerful.

I would say in your config, the GPU is the bottleneck.
SimonZephyr: Thanks, but your links are Russian. I didn't find any English option, so I would say that I'm pretty out of luck on knowing what to pick. I think I'll wait a couple of years for the electronics to come down in price, and then let my patience pay a return on its investment.
Just drink some vodka and you will understand everything, or play some hours with randoms in CS:GO or Dota 2 :P
SimonZephyr: Hi guys,

I'm a noob at upgrading my PC. When my first gaming PC (my current PC, from 2010) was built, its components were picked by a store clerk at Canada Computers and it was assembled by a friend of mine. Its specs from Performance Information and Tools are attached as a dxdiag.

If not, how many Canadian dollars and what parts would I have to buy to run the game at max settings at 60 fps in 4K resolution?
I watched a video around summer last year of a guy who had a current-generation high-end PC with 4 nVidia 980Ti cards in SLI and a 5k display, and he was unable to get 30FPS in The Witcher 3 with it. That's well over $6000 of hardware. He did state that the game encountered a performance drop at 5k compared to 4k, although I can't remember exactly what the reason was. So the game would be faster at 4k than 5k by more than the pixel difference, but it would still likely be tough to get a consistent 60FPS out of it with that many pixels. Basically you need to figure out what CPU+GPU are required to get 60FPS out of 1920x1080, measure CPU/GPU usage, and then quadruple that. The game is more GPU-heavy than CPU-heavy, so money is better put into the GPU.

I've got an AMD FX8350 with a Radeon HD7850. I get about 26-30 FPS with various settings tweaked to give the best performance. My GPU is slightly under-spec, but regardless, the CPU usage during the busiest parts of the game is only 15%, and that includes the dozens of background tasks running. The GPU is the clear bottleneck. If I were to upgrade the GPU then the CPU usage would most likely go up as well, as the GPU would do more in a given time frame and need more data fed to it, but the CPU should be more than adequate since it is the recommended CPU for this game.

I'd say that to get 60FPS at 4k for this game you'd need to buy at least a new $200-300 CPU, plus a new motherboard and RAM to match it, and possibly a new PSU as well, and you'd need several high-end current-generation GPUs, with nVidia seeming to have the lead. I'd expect to have to buy 2-4 $700-900 GPUs to push that many pixels consistently at 60FPS, unless someone out there has an actual demo system on youtube or somewhere getting 60FPS on a lesser system.

I myself have a 2560x1600 display, which has just under twice the pixels of full-HD at 1920x1080, and if I run the game at native resolution I end up with around 20FPS unless I lower certain quality settings further. 4k displays have roughly double the pixels of what I have right now, so I imagine I'd get around 5-10FPS on a 4k display.

I have also tried the game at 1920x1200 x3 using Eyefinity, and I end up with around 10-12FPS with that. That is slightly more than 3/4 the pixels of a 4k display in terms of pounding the crap out of the GPU.
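If anyone wants to check the arithmetic behind those ratios (and the "quadruple 1080p" rule of thumb above), here's a quick Python sketch; the scale factors are raw pixel-count divisions only, so real frame rates won't track them exactly:

# Raw pixel-count comparison for the resolutions discussed above.
# FPS does not scale perfectly linearly with pixel count, so treat
# these factors as rough guides only.
resolutions = {
    "FHD (1920x1080)": 1920 * 1080,
    "16:10 (2560x1600)": 2560 * 1600,
    "Eyefinity (3 x 1920x1200)": 3 * 1920 * 1200,
    "4k UHD (3840x2160)": 3840 * 2160,
    "5k (5120x2880)": 5120 * 2880,
}
fhd = resolutions["FHD (1920x1080)"]
uhd = resolutions["4k UHD (3840x2160)"]
for name, px in resolutions.items():
    print(f"{name:28s} {px:>10,} px   {px / fhd:.2f}x FHD   {px / uhd:.2f}x 4k")

Running that confirms the numbers in this thread: 4k is exactly 4.00x FHD (hence quadrupling the 1080p requirements), 2560x1600 is 1.98x FHD and about half of 4k, and the triple-1200p Eyefinity setup is about 0.83x of a 4k display.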

60FPS gaming at 4k isn't going to be cheap, that's for sure. Nor is VR gaming at 90FPS, which is coming this summer... :)
Shadowstalker16: I stopped at 60fps@4K.

You need 4 GTX 980s and an i7 that won't bottleneck them.
This! 60fps with max settings in Witcher 3 @ 4k resolution? Good luck with that one... I'd wait two more years, until "normal" high-end hardware (= one very expensive graphics card, instead of two or more extremely expensive cards) is able to do this. You could buy a used car with the money you'd need to build this computer O.O
SimonZephyr: Thanks, but your links are Russian. I didn't find any English option, so I would say that I'm pretty out of luck on knowing what to pick. I think I'll wait a couple of years for the electronics to come down in price, and then let my patience pay a return on its investment.
I pointed at the table, which is at the bottom of the article. The table is in English.

The site is the official Russian branch of Tom's Hardware Guide.

Besides, there are free translation services like Google Translate or Bing, but you'd only need one if you want to read the article and not just the table.
Post edited February 15, 2016 by Lin545
SimonZephyr: Hi guys, I'm a noob at upgrading my PC. [...] If not, how many Canadian dollars and what parts would I have to buy to run the game at max settings at 60 fps in 4K resolution?
skeletonbow: I watched a video around summer last year of a guy who had a current-generation high-end PC with 4 nVidia 980Ti cards in SLI and a 5k display, and he was unable to get 30FPS in The Witcher 3 with it. [...]
Gaming is expensive :(
Do nVidia cards even work on the Gigabyte motherboard? Don't motherboards discriminate against other brands or something?
skeletonbow: I watched a video around summer last year of a guy who had a current-generation high-end PC with 4 nVidia 980Ti cards in SLI and a 5k display, and he was unable to get 30FPS in The Witcher 3 with it. [...]
SimonZephyr: Gaming is expensive :(
Do nVidia cards even work on the Gigabyte motherboard? Don't motherboards discriminate against other brands or something?
All GPUs attach to and work with all boards that have a PCIe x16 slot. And no, GPUs are just reference models from the AMD/Nvidia factories, shipped out to Sapphire, Zotac, or other brands, who put their own heatsink and fan on them and maybe overclock them slightly. Similarly, mobos are assembled from OEM parts from other electronics makers. The branding is just on the finished product; the parts constituting it may be from 5 different manufacturers or more.
Just don't confuse brands with socket types on mobos. Intel sockets don't fit AMD chips and vice versa, and not every Intel socket fits every Intel processor; the same goes for AMD.
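To make that concrete, here's a tiny Python sketch; the socket pairings below are the common ones for the CPUs mentioned in this thread (simplified, so always check the exact CPU support list for a given board):

# Socket <-> CPU-family pairings for the chips mentioned in this thread.
# Simplified; always verify against the motherboard's CPU support list.
socket_families = {
    "AM3":     ["Phenom II X4"],
    "AM3+":    ["FX-6350", "FX-8350"],
    "LGA1366": ["Core i7 9xx (Nehalem/Westmere)", "Xeon 56xx"],
    "LGA1155": ["Sandy Bridge / Ivy Bridge Core i3/i5/i7"],
}

def fits(socket: str, cpu: str) -> bool:
    """True if the CPU family is on the socket's (simplified) support list."""
    return any(cpu in fam for fam in socket_families.get(socket, []))

print(fits("AM3+", "FX-8350"))      # True
print(fits("LGA1155", "FX-8350"))   # False -- Intel socket, AMD chip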
skeletonbow: He did state that the game encountered a performance drop at 5k compared to 4k, although I can't remember exactly what the reason was.
Well, 4k is about 8.3 million pixels, and 5k is about 14.7 million, so you have to push almost twice as much data per frame for 5k. That's almost surely why.

EDIT: Incidentally, not that you asked or even care, 30fps at 5k resolution and 32-bit color depth means you have to be pushing raw pixel data at the rather impressive rate of about 14 gigabits per second to your monitor, and closer to 20 gigabits on the wire once link encoding and blanking overhead are added. It's no wonder they're expensive to buy and require robust hardware.
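For anyone who wants the back-of-the-envelope version, here's the same sum in Python (assuming 32 bits per pixel, and a DisplayPort-1.2-style 8b/10b encoded link for the overhead figure):

# Display bandwidth for 5k @ 30fps @ 32-bit color, raw pixel data only.
width, height, bpp, fps = 5120, 2880, 32, 30
raw_bps = width * height * bpp * fps
print(f"raw pixel data:       {raw_bps / 1e9:.1f} Gbit/s")         # ~14.2
# 8b/10b line coding (as on DisplayPort 1.2) adds 25% on the wire,
# before counting blanking intervals.
print(f"with 8b/10b overhead: {raw_bps * 1.25 / 1e9:.1f} Gbit/s")  # ~17.7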
Post edited February 15, 2016 by OneFiercePuppy
SimonZephyr: Gaming is expensive :(
True, but your standards are also high; believe it or not, the game is still enjoyable at 30fps at 1080p resolution.
You have a very nice computer despite its age. But Witcher 3 at max settings in 4K at 60fps is asking a lot - you won't attain that unless you spend an absolute fortune on the high end of current technology.

Your computer will probably struggle with Witcher 3 as it is, but following vsr's advice on upgrades should get you a good experience for not too much money. Of course, since you've had a good run with your computer already, it might be worth holding back on an upgrade and putting your money toward a whole new computer instead. Both are reasonable options.

You might have to give up the idea of having 4K, 60fps and max settings all at the same time though. That's quad-SLI territory.
SimonZephyr: Do nVidia cards even work on the Gigabyte motherboard?
Yes. Most graphics cards require a single PCIe x16 slot, which most motherboards provide. The problem comes in if you're trying to put multiple graphics cards in the same motherboard. In that case, you'll need a motherboard with "Crossfire" support if you want multiple AMD cards, or "SLI" support for multiple nVidia cards.

The main concern is the power supply. Does it supply enough power for your whole computer (bearing in mind that some graphics cards are serious power hogs)? And does the power supply unit have enough connectors to plug into your card?
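If you want to ballpark the power question, here's a rough sketch in Python; the wattages are illustrative only, so look up the actual TDPs of your own parts before buying anything:

# Rough PSU sizing check. TDP numbers are examples only.
cpu_tdp_w = 125    # e.g. an AMD FX-8350
gpu_tdp_w = 250    # e.g. one high-end graphics card
other_w = 100      # motherboard, RAM, drives, fans (rough allowance)
load_w = cpu_tdp_w + gpu_tdp_w + other_w
# ~40% headroom keeps the PSU in its efficient range and covers spikes.
print(f"estimated load: {load_w} W, suggested PSU: {load_w * 1.4:.0f} W+")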

SimonZephyr: Don't motherboards discriminate against other brands or something?
Thankfully they do not. You can even put an AMD card into a motherboard with an nVidia chipset and expect them to behave.
Post edited February 15, 2016 by Barefoot_Monkey
OneFiercePuppy: Well, 4k is about 8.3 million pixels, and 5k is about 14.7 million, so you have to push almost twice as much data per frame for 5k. That's almost surely why. [...]
You know, I never really did the math on that before, but I just did to confirm it, and indeed you're correct on the pixel count. I of all people should have realized that or done the math before, but for some reason it didn't seem like that big a jump. The two dimensions are where the numbers get ya. :) That indeed almost doubles the required GPU capability and puts other burdens on the system. It was kind of disappointing to see on the guy's demo system though, because it's always sweet to see new hardware, especially on the high end, but sad to see it perform so terribly on such an expensive amount of gear.

I'm pretty happy with my 2.5k display here though, and can't imagine myself buying a 4k display any time soon. I'd potentially like to get one some day, but they do not make 16:10 aspect ratio displays at 4k presently, it isn't clear if anyone will in the future, and I'm kinda partial to 16:10, so I think I'll ride it out on the current display. :) It's better than what most people out there are using, and games are usually optimized for FHD anyway. The GPU requirements are roughly double those of FHD and half those of 4k, but FHD still looks great on this display, plus I keep my 16:10.

5k and beyond are going to be tough sells for people IMHO, and most games won't be playable on them due to undersized non-scalable HUDs, tiny fonts, and impossibly small mouse cursors that don't scale to the high DPI, which is really problematic for many games. I'd kill to get to see The Witcher 3 on a 4k though, that'd be impressive as hell. :)
skeletonbow: they do not make 16:10 aspect ratio displays at 4k presently
You can still get them, though they're uncommon and expensive.
skeletonbow: I'd kill to get to see The Witcher 3 on a 4k though, that'd be impressive as hell. :)
It's beautiful ^_^ I upgraded to a 4k monitor and a Titan X video card shortly after Witcher 3 came out and the game is positively stunning at 4k with everything at max. It looks almost like its own sequel, compared to the console version.
skeletonbow: they do not make 16:10 aspect ratio displays at 4k presently
OneFiercePuppy: You can still get them, though they're uncommon and expensive.
Technically the IBM T221 is a 4k display that is 16:10, but it is extremely old technology (15 years old), far behind the current mainstream 4k displays coming out. It's ancient high-end specialty hardware, and I'd specifically exclude it from a modern discussion of 4k displays for that reason. Almost nobody would find it suitable for what they'd expect from a modern 4k display, not to mention it's really small too.

There are no modern 4k displays I'm aware of that are 16:10, but I hope the professional market demands them and they actually happen, as it'd be very disappointing to lose vertical real estate just to gain higher resolution. Fortunately it's not a decision I'll have to make for many years to come, so there's a lot of breathing room left. :)


skeletonbow: I'd kill to get to see The Witcher 3 on a 4k though, that'd be impressive as hell. :)
OneFiercePuppy: It's beautiful ^_^ I upgraded to a 4k monitor and a Titan X video card shortly after Witcher 3 came out and the game is positively stunning at 4k with everything at max. It looks almost like its own sequel, compared to the console version.
Awesome, what frame rate do you get, and what specific card did you buy? I saw a video of The Witcher 3 at 4k resolution and it was impressive, but like all video it's compressed, and the detail is nowhere near what it would be on a live 4k display. Also, do you play with Hairworks enabled or disabled?
All this W3 talk makes me wonder what hardware we will need to run Cyberpunk 2077 at 1920x1080 with 30 fps.
skeletonbow: Awesome, what frame rate do you get, and what specific card did you buy? [...]
>.> http://www.gog.com/forum/general/is_my_pc_upgradeable_to_play_witcher_3_at_max_setting/post8

EDIT: Yeah, fair point on the IBM monitor. I knew it wasn't an ideal gaming monitor, but when looking up the detailed specifications, it became obvious that it isn't even an *acceptable* gaming monitor. Live and learn. =)
Post edited February 15, 2016 by OneFiercePuppy
OneFiercePuppy: Yeah, fair point on the IBM monitor. I knew it wasn't an ideal gaming monitor, but when looking up the detailed specifications, it became obvious that it isn't even an *acceptable* gaming monitor. Live and learn. =)
Hehehe, yeah, it wasn't acceptable 2700 years ago when it came out. LOL. If I'm not mistaken, those monitors were made for medical imaging and stuff like that. Tumours could probably sprout and metastasize in-between screen refreshes. :)
Gonen32: All this W3 talk makes me wonder what hardware we will need to run Cyberpunk 2077 on the Oculus Rift at 2160x1200 with 90 fps.
There, fixed it for ya. :)
Post edited February 16, 2016 by skeletonbow