(Sorry for the late response.)

real.geizterfahr: ...
Stop for a second.

You agree that the FX-6350 doesn't bottleneck a 970 in The Division, right?

Edit (to make sure it's clear): Given that an FX-6350 gets a minimum of 73 FPS when tested with a 980 Ti, and that a 970 gets 61 FPS when tested with a Core-i7, do you agree that in a system with an FX-6350 and a 970, the GPU will be the bottleneck? (For this particular game with these particular settings.)

If you do, then I will understand the rest of the argument as you being anal retentive, and that's fine. I can be that sometimes.

If you don't, then I will try my best to explain the idea of that benchmark.
Post edited June 08, 2016 by ET3D
I'd suggest you wait for the benchmarks and buy an AMD RX 480. At $200 you'll get very good performance, if the leaked benchmarks are anything to go by.
ET3D: Stop for a second.

You agree that the FX-6350 doesn't bottleneck a 970 in The Division, right?
Yes, I agree. The Division is pretty easy on the CPU.

ET3D: Edit (to make sure it's clear): Given that an FX-6350 gets a minimum of 73 FPS when tested with a 980 Ti, and that a 970 gets 61 FPS when tested with a Core-i7, do you agree that in a system with an FX-6350 and a 970, the GPU will be the bottleneck? (For this particular game with these particular settings.)
I agree partially. GPU and CPU are a perfect match for this game (they don't bottleneck each other).

Two additional notes here (I think #2 is what you're getting wrong when you're talking about bottlenecks):
1. The Division isn't a good example, because it's exceptionally easy on the CPU. Almost every other game out there will ask more from the CPU. Just have a look at any other benchmark. It's not the norm that a 980 Ti only loses 6.5% of its maximum performance when it's paired with an FX 6350. Here's how much performance the top cards lose in other games (I left out Overwatch with the ridiculously overpowered GTX 1080 that loses 40%, Fallout 4 with its AMD CPU problems and Dark Souls 3 with its 60 fps cap):

Rise of the Tomb Raider - 8.5%
Doom - 31%
Battlefront - 20%
Batman Arkham Knight - 25%
Witcher 3 - 22%
Project Cars - 26%
GTA V - 18%
Battlefield Hardline - 12%
I stopped when I saw that Homeworld was next, because that game is all about your CPU.

Together with The Division (the 9 most recent tests, minus the three special cases I mentioned above), that's an average of 18.77% that the 980 Ti loses when paired with a FX 6350. As I said: Battlefront is a very good example. The Division is the worst example to show that the CPU isn't a bottleneck.
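For completeness, the average quoted above can be rechecked from the listed numbers (the figures are the ones quoted in this thread, not re-measured; standard rounding gives 18.78%, so the 18.77% above is the same value truncated):

```python
# Per-game performance loss of the 980 Ti next to an FX 6350, as listed
# in this post (The Division's 6.5% included, the three special cases left out).
losses = {
    "Rise of the Tomb Raider": 8.5,
    "Doom": 31.0,
    "Battlefront": 20.0,
    "Batman Arkham Knight": 25.0,
    "Witcher 3": 22.0,
    "Project Cars": 26.0,
    "GTA V": 18.0,
    "Battlefield Hardline": 12.0,
    "The Division": 6.5,
}

average = sum(losses.values()) / len(losses)
print(f"average loss: {average:.2f}%")  # -> average loss: 18.78%
```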

2. Just because the game gets fewer fps with a 970 + i7 6700k than it does with a 980 Ti + FX 6350, it doesn't automatically mean that a 970 will reach the same fps with the FX 6350 as it does with the i7! That's a weird conclusion, and it's not how CPU bottlenecks work.

Look at the Overwatch Benchmark, for example. Do you REALLY think that a 970 would deliver those 140 fps with an FX 6350 (it does with the i7), when a 1080 only manages 143 fps with the FX 6350?

Or, even funnier, the Dark Souls 3 benchmark. 970 + i7 manages 60 fps. 980 Ti + FX 6350 manages 57 fps. So... 970 + FX 6350 = 57 fps, too? Doom... 970 + i7 = 104 fps, 980 Ti + FX 6350 = 98 fps. 970 + FX 6350 = 98 fps? CPU bottlenecks don't work like that! A 970 paired with an FX 6350 will always get fewer FPS than a 980 Ti paired with an FX 6350.

ET3D: If you don't, then I will try my best to explain the idea of that benchmark.
This'd be a good start, because I'm beginning to think we're talking about two entirely different things.

Again: An FX 6350 won't bottleneck a GTX 970 in The Division, because this is one of the most GPU-bound games I've seen in a long time (in my eyes, even the 6.5% performance loss of the 980 Ti hardly qualifies as a CPU bottleneck!). But the FX 6350 will start to bottleneck the 970 in almost every other game out there. Not by much, but it'll start. That's why I said that anything more powerful than a 970 would be a waste of money.
real.geizterfahr: 2. Just because the game gets fewer fps with a 970 + i7 6700k than it does with a 980 Ti + FX 6350, it doesn't automatically mean that a 970 will reach the same fps with the FX 6350 as it does with the i7! That's a weird conclusion, and it's not how CPU bottlenecks work.

Look at the Overwatch Benchmark, for example. Do you REALLY think that a 970 would deliver those 140 fps with an FX 6350 (it does with the i7), when a 1080 only manages 143 fps with the FX 6350?
Yes, or close to it. Now, between brands that would be different (AMD and NVIDIA's drivers have different overheads), but for two NVIDIA cards I'd assume that the CPU overhead is similar and therefore the CPU can push the same number of frames.

If you have proof otherwise, I'd be happy to see it.
real.geizterfahr: But the FX 6350 will start to bottleneck the 970 in almost every other game out there.
I assume you mean to qualify this by 'if you run games at 1080p'.
Post edited June 09, 2016 by ET3D
real.geizterfahr: Look at the Overwatch Benchmark, for example. Do you REALLY think that a 970 would deliver those 140 fps with an FX 6350 (it does with the i7), when a 1080 only manages 143 fps with the FX 6350?
ET3D: Yes, or close to it. Now, between brands that would be different (AMD and NVIDIA's drivers have different overheads), but for two NVIDIA cards I'd assume that the CPU overhead is similar and therefore the CPU can push the same number of frames.
Now I see where the problem lies. You definitely have the wrong idea of what a CPU bottleneck is. A GTX 1080 will always deliver many more frames than a GTX 970, even if you pair them with an antique Core 2 Duo. Why? Because as soon as the CPU has done its job, the GPU will show its full performance. And a faster GPU is faster than a slower GPU.

ET3D: If you have proof otherwise, I'd be happy to see it.
Easy. I just threw "CPU bottleneck GPU" into Google (I didn't even type all of it - I just took the suggestion after typing "CPU bot") and I didn't have to look beyond the first result to find a decent explanation - including some example benchmarks!

Have a look at this article from PC Gamer. It's pretty good at explaining what a CPU bottleneck is. But the fun part (the benchmarks) comes on the second page. They took an i5 4670k, limited it to 2 cores @1.6GHz and ran benchmarks with two different GPUs and 12 different games. The first GPU was a GTX 460 SE, the second GPU a GTX 980. And what happened?

I'll quote just one (the most extreme) example, because it's the only one where CPU load is basically 100% (99+). Call of Duty: Advanced Warfare had a CPU load of 99.72% and a GPU load of 93.62% when tested with the 460. It ran at an average of 34.75 fps. According to your logic, a 980 shouldn't deliver any more frames (because we hit "the CPU's frame limit"). Well... When they swapped the 460 for the 980, the game ran at 70.03 fps oO CPU load was 99.2% and GPU "load" was 29.97%. According to your idea of a CPU bottleneck, this shouldn't be possible.

Faster GPU is faster. Just not by as much as it could be (182.7 fps with 4 cores @3.4GHz, for completeness' sake).
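To put a number on how far off the "fixed CPU frame limit" idea is here: taking the quoted figures at face value, the GPU swap alone roughly doubled the frame rate even though the CPU was pegged near 100% in both runs (a quick sanity check on the quoted numbers, not new data):

```python
# CoD: Advanced Warfare on the deliberately crippled i5 4670k
# (2 cores @ 1.6GHz), fps and load numbers as quoted from the PC Gamer test.
fps_gtx460se = 34.75   # CPU load 99.72%, GPU load 93.62%
fps_gtx980 = 70.03     # CPU load 99.20%, GPU load 29.97%

speedup = fps_gtx980 / fps_gtx460se
print(f"speedup from the GPU swap: {speedup:.2f}x")  # -> 2.02x
```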
real.geizterfahr: Have a look at this article from PC Gamer.
That article does have the problem of using average FPS instead of minimums. It also has the problem of not showing how the 460 SE scales with different CPUs, since that's pretty much what I was trying to argue. I will grant you that it's complex, and yes, you have a point, I was oversimplifying it, but I still think it's not that complex, not at the 'these CPUs show a 5-10% difference in minimum frame rate' level.

Edit: Thinking about it some more, if we embrace this complexity, doesn't it nullify the idea that there's no point in buying a higher end card? If neither CPU nor GPU are totally bottlenecked, which will be the case if you advocate 'matching CPU and GPU', then increasing the power of either will increase the frame rate. Therefore a 980 might be a better choice than a 970 for an optimal rate, especially if the 6350 reduces the 970 below 60 fps.
Post edited June 09, 2016 by ET3D
i dont have an answer for crackpot, i just wanted to say hi and long time no see :) <3
Crewdroog: i dont have an answer for crackpot, i just wanted to say hi and long time no see :) <3
's alright, I already got more answers than I ever asked for.
Crewdroog: i dont have an answer for crackpot, i just wanted to say hi and long time no see :) <3
Crackpot.756: 's alright, I already got more answers than I ever asked for.
so this hindered more than helped? hahaha
Crewdroog: so this hindered more than helped? hahaha
Nonono, everyone was really helpful, maybe a little overeager in some cases. I know what to do now, which is to wait, which is great because I'm good at that, as it means doing nothing, basically.
Crewdroog: so this hindered more than helped? hahaha
Crackpot.756: Nonono, everyone was really helpful, maybe a little overeager in some cases. I know what to do now, which is to wait, which is great because I'm good at that, as it means doing nothing, basically.
isn't it the best?
ET3D: That article does have the problem of using average FPS instead of minimums.
Which I think is a good thing, because minimum fps doesn't say that much to me.

ET3D: It also has the problem of not showing how the 460 SE scales with different CPUs, since that's pretty much what I was trying to argue.
Since GPU load was above 90% in 9 out of those 12 games (more than 97% in 6 games!), I think it's a pretty safe thing to say that the 460 SE did all it could.

ET3D: I will grant you that it's complex, and yes, you have a point, I was oversimplifying it, but I still think it's not that complex, not at the 'these CPUs show a 5-10% difference in minimum frame rate' level.
No, you can never say "It'll have X% less performance than it could have". I just took an average based on various CPU benchmarks I've seen (and I've seen a lot this year, since I have an old i5 2500k and need a new GPU).

Just look at the two weirdos from this article. Unreal Tournament ran at 48 fps on the 460 and at 43 on the 980 (wtf???). And Civilization V went completely nuts. With the 460 it used 82% CPU and 73% GPU. When they stuffed a 980 into the system, the game suddenly realized it wasn't using all of the CPU and started to use 93% of it. That way it managed to squeeze two more frames out of the CPU (does Civ V even need a GPU? ;P ).

ET3D: Edit: Thinking about it some more, if we embrace this complexity, doesn't it nullify the idea that there's no point in buying a higher end card? If neither CPU nor GPU are totally bottlenecked, which will be the case if you advocate 'matching CPU and GPU', then increasing the power of either will increase the frame rate. Therefore a 980 might be a better choice than a 970 for an optimal rate, especially if the 6350 reduces the 970 below 60 fps.
This depends on how much money you want to "waste" (and on whether you want to play Civilization V and similar games). In crackpot's case, you could throw $600 out of the window to get a GTX 1070 for a (seemingly) massive fps boost. Or you could get yourself an FX 9590 for $200 (Don't do it!!! Same socket as her current CPU and more powerful than the 6350, but it'll draw 220W!) and the upcoming RX 480 for $200. This'd be much cheaper than getting a GTX 1070 and it'd probably give you the same fps in most games. So... matching CPU and GPU still makes a lot of sense at those prices.

If you don't upgrade your CPU, you could "ignore" the bottleneck that you'd get with a RX 480. It'll still give you more fps than the next smaller card... But the RX 470 (or whatever it'll be called) will probably be good enough for smooth 1080p gaming with medium to high details. And if it is $50 less than the 480, this is the first $50 you can put aside for a new computer at some point in the future! Because you won't get away with just another GPU upgrade in one and a half or two years.
Post edited June 10, 2016 by real.geizterfahr
ET3D: That article does have the problem of using average FPS instead of minimums.
real.geizterfahr: Which I think is a good thing, because minimum fps doesn't say that much to me.
Why? Sure, percentiles are nicer, but averages are very misleading. You could get an average of 60 fps with serious stuttering or 60 fps that's smooth.

real.geizterfahr: If you don't upgrade your CPU, you could "ignore" the bottleneck that you'd get with a RX 480. It'll still give you more fps than the next smaller card... But the RX 470 (or whatever it'll be called) will probably be good enough for smooth 1080p gaming with medium to high details. And if it is $50 less than the 480, this is the first $50 you can put aside for a new computer at some point in the future! Because you won't get away with just another GPU upgrade in one and a half or two years.
But that has nothing to do with the CPU. Even with a Core i7, an RX 470 will give you nice 1080p gaming and you'd put aside the $50 for a future upgrade. The 6350 being more of a bottleneck doesn't change this by much. The 480 will still give better frame rates, options for higher resolutions or more effects, etc.

These $50 don't really make much of a difference, and if you end up upgrading the CPU later thanks to them, you'd have lost $150, because you're suddenly stuck with a GPU that's a serious bottleneck and need a new one. And if you don't upgrade the CPU, the GPU will still become the major bottleneck in a few years, because GPUs advance more quickly and so games take more advantage of them than of the CPU. So either way these $50 cost you in the long run.

It's like this regardless of the CPU: performance for money grows at the low end, then drops at the high end. Up to the mid range, paying less is always a compromise, and not worth making unless you're strapped for cash or have low requirements. The high end isn't worth getting unless you have lots of money and serious demands.
I did not see anyone talking about Polaris. What are the expectations regarding that chip?
real.geizterfahr: Which I think is a good thing, because minimum fps doesn't say that much to me.
ET3D: Why? Sure, percentiles are nicer, but averages are very misleading. You could get an average of 60 fps with serious stuttering or 60 fps that's smooth.
Or you could get an average of 60 fps with occasional drops. I played GTA V with an HD 6870. Settings were low-medium, with Population Density + Variety and Distance Scaling maxed out (I hate empty streets where the same two or three people always pop up 5 meters in front of you). The game ran at 40+ fps, but sometimes it had weird drops to 5-10 fps (I tried changing the settings, but the drops still came). These drops were very rare and didn't last long. Minimum fps would say "don't even try it" in this case. But the game was perfectly playable.

But you have a point. A game that'd jump between 1 and 100 fps every few seconds would have an average of around 50 fps. Sounds good, but it would be a horrible, unplayable mess. Average alone means shit, but minimum can be misleading too.
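The two failure modes can be sketched with made-up frame-rate samples (numbers invented for illustration, loosely modeled on the GTA V and 1-100 fps cases above):

```python
# Per-second fps samples for two hypothetical minute-long runs.
rare_drops = [42] * 58 + [7, 7]    # smooth, with two one-second dips
oscillation = [1, 100] * 30        # violently alternating frame rates

def summarize(samples):
    """Return (average fps, minimum fps) for a list of per-second samples."""
    return sum(samples) / len(samples), min(samples)

for name, run in [("rare drops", rare_drops), ("oscillation", oscillation)]:
    avg, low = summarize(run)
    print(f"{name}: avg {avg:.1f} fps, min {low} fps")
# rare drops:  avg 40.8 fps, min 7 fps -> the minimum looks scary, plays fine
# oscillation: avg 50.5 fps, min 1 fps -> the average looks fine, unplayable
```

Neither summary number alone separates the playable run from the unplayable one; that's why percentile-style metrics (1% lows) caught on later.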

ET3D: But that has nothing to do with the CPU. Even with a Core i7, an RX 470 will give you nice 1080p gaming and you'd put aside the $50 for a future upgrade.
Why would you want to put aside money to upgrade an i7? An i7 is an i7 and probably won't need an upgrade for another 4 or 5 years.

ET3D: The 6350 being more of a bottleneck doesn't change this by much.
It will. With a CPU like that you know you can't upgrade easily in the future. An FX 6350 has an AM3+ socket and... well... What do you want to upgrade to on this socket? An FX 9590 that eats 220W? Hell no, don't do it! And everything else wouldn't be that big of an upgrade and would be a huge waste of money. So you'd need a new mainboard too. And that's the point where most people just get a new PC.

ET3D: These $50 don't really make much of a difference, and if you end up upgrading the CPU later thanks to them, you'd have lost $150, because you're suddenly stuck with a GPU that's a serious bottleneck and need a new one.
As I said: You won't upgrade anything on an AM3+ socket. Especially not crackpot, since her PSU won't be able to power one of those ridiculous 220W CPUs. The GPU upgrade she's looking for will be the last upgrade she'll ever do on her current PC (except maybe an HDD or DVD drive). A new GPU in two years would be bottlenecked too heavily by her CPU. And a CPU upgrade... Nope. So we're talking about "life-sustaining measures" here.

ET3D: And if you don't upgrade the CPU, the GPU will still become the major bottleneck in a few years, because GPUs advance more quickly and so games take more advantage of them than of the CPU. So either way these $50 cost you in the long run.
Nope, they won't cost you. In two years, an FX 6350 won't be able to run any strategy, open-world or AI-heavy game properly anymore, no matter what GPU you put in your case. And your GPU will scream for an upgrade that your CPU can't keep up with. If you start to put aside $50 every month now, you'll have $1200 for a new PC then. For that money you'd almost get a pretty high-end PC with an i7 4790k (what a CPU!) and a GTX 980 today (ignoring the upcoming GPUs). $460 for PSU, HDD, mainboard, case, OS, etc. is a pretty tight budget. The saved $50 could make the difference here :P Especially since not everyone can put aside $50 for a PC every month.
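The savings arithmetic above, spelled out (assuming the two-year replacement window mentioned earlier in the thread):

```python
monthly = 50   # the $50/month set aside starting now
months = 24    # "one and a half or two years" -> the two-year case

budget = monthly * months
print(f"${budget}")  # -> $1200
```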

A PC with an i7 4790k would probably live for six or seven years from today (with the occasional GPU upgrade, of course), because it's one of the fastest processors out there.

ET3D: It's like this regardless of the CPU: performance for money grows at the low end, then drops at the high end. Up to the mid range, paying less is always a compromise, and not worth making unless you're strapped for cash or have low requirements. The high end isn't worth getting unless you have lots of money and serious demands.
This depends... I'd say it's worth getting a good CPU, because upgrading them is a pain in the ass due to different sockets. You can change everything else if needed, but you don't want to change your mainboard (which crackpot would have to do for a CPU upgrade). I always build PCs around the CPU. They seem a bit more expensive, but they live longer than "I pay $100 less for a smaller CPU" PCs.
Gede: I did no see anyone talking about Polaris. What are the expectations regarding that chip?
Current expectations for the RX 480 (the biggest Polaris card) put it somewhere between a GTX 970 (worst case) and an R9 Nano (best case).
The vast majority says it'll land somewhere around a GTX 980.
Post edited June 14, 2016 by real.geizterfahr