real.geizterfahr: [An FX6350 with]....
Total War Rome II (37 fps with GTX Titan)
Project Cars (46 fps with GTX Titan X)
While I don't agree with calling the FX6350 a "slow processor" in absolute terms, pairing it with a GTX Titan makes no sense. That CPU is mid-range, albeit with 6 cores, and will bottleneck anything from a 980 onwards. As you rightly say, the best matches for this CPU are a 970 or an R9 380.

Now, my Linux machine is an FX 6350 (with an HD7870, a lot of RAM and HDD space). It's primarily not a gaming machine, and it's a pretty good rig. Of course it's obsolete, since I built it a while ago. Of course it won't get faster. But that statement is true for any config; even a brand new i5 will be obsolete in 2 years and will never be faster than it was when purchased :-p. But it will not force Win 10 down my throat.

[If anyone wonders: for me, a "slow CPU" is anything that was designed to be entry level, like A4 APUs, E-series CPUs, Pentium Gs and Celerons. I would tend to see 5 segments in this market, with the FX6350 in the 3rd, alongside most i3's and the lower i5's. The other FX chips would be there too, except for the thermal abominations, which are more heating appliances than CPUs.]
ET3D: What do you base this assumption on?
real.geizterfahr: The fact that it's already limiting heavily in CPU-intensive games!? Just have a look at some of them (average fps)...
Total War Rome II (37 fps with GTX Titan)
Skyrim (~30 fps with GTX 580)
StarCraft 2 (45 fps - 1024 x 768 - Medium Graphics, Ultra CPU Settings)
Project Cars (46 fps with GTX Titan X)
Well, doesn't that prove my theory? You show that the CPU was a limiting factor in some games at release. You don't show that modern games are significantly heavier on it. Therefore it stands to reason that it will continue to provide this level of performance, and that anyone who was okay with that level at release has absolutely no reason to upgrade.

real.geizterfahr: If you want to spend an additional $50 on a theoretically (fictional numbers) 15% faster card that in reality will only be 8% faster because of your limiting CPU -> Go for it and be happy.
No, I want to pair it with a GPU that will be 15% faster now and 50% faster in two years, so that in two years my CPU, still just as powerful for games as it is now, will be able to run games at a decent speed instead of forcing me to upgrade.
TechSpot just posted a new article which helps illustrate nicely the difference between the necessity of upgrading GPU's vs. CPU's.

Here's the performance of Rise of the Tomb Raider with several generations of high end GeForce cards. For comparison, [url=http://www.techspot.com/review/1128-rise-of-the-tomb-raider-benchmarks/page5.html]here's the performance with various CPU's[/url].

While the oldest CPU on the CPU test is 'only' about 5.5 years old (the Core i5-2500K), none of them have a problem with the game, and I'm sure that my Phenom II X6 1090T (a 6 year old CPU) won't have a problem with it either. The 6 year old GeForce 480 on the other hand gets 17 fps at 1080p and the 580 gets 20 fps. The 4 year old 680 gets a respectable 40 fps.

So a 5.5 year old ~$200 CPU still has no problem but a $500 GPU from that time is pretty useless. There hasn't been much of a reason to upgrade CPU's for quite a few years, while GPU's need to be replaced quite often.
ET3D: You show that the CPU was a limiting factor in some games at release. You don't show that modern games are significantly heavier on it. Therefore it stands to reason that it will continue to provide this level of performance,
Sorry, but that's bullshit and you know it. I'm out of this now...
ET3D: TechSpot just posted a new article which helps illustrate nicely the difference between the necessity of upgrading GPU's vs. CPU's.
[...]
Rise of the Tomb Raider
Sorry, but that's a bad joke, isn't it? Have another look at the list of the latest CPU benchmarks from TechSpot, where I posted the performance (= fps) difference between a new i7 and the old FX 6350.

real.geizterfahr: Rise of the Tomb Raider - 8.5%
Doom - 31%
Battlefront - 20%
Batman Arkham Knight - 25%
Witcher 3 - 22%
Project Cars - 26%
GTA V - 18%
Battlefield Hardline - 12%
You took the game that's easiest on the CPU, the one where an old CPU loses by far the least performance, to prove your point that CPUs aren't important. That's the same as "proving" that the CPU is THE MOST IMPORTANT component for gaming by taking Fallout 4 as an example, where an i7 6700 gets 113 fps while a FX 6350 gets only 58 fps. Seriously, that's laughable -.- If you want to believe what you say: do it. I can't help you there...
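And so nobody has to trust my percentages blindly, here's a minimal Python sketch (my own illustration, not from either article) of how a loss figure like the ones in the list above is computed from two fps numbers, using the Fallout 4 results I just quoted:

[code]
# Back-of-the-envelope check of the Fallout 4 numbers quoted above.
i7_6700_fps = 113  # i7 6700
fx_6350_fps = 58   # FX 6350

ratio = fx_6350_fps / i7_6700_fps
print(f"FX 6350 delivers {ratio:.0%} of the i7's fps")        # ~51%
print(f"Performance lost with the old CPU: {1 - ratio:.0%}")  # ~49%
[/code]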

________________
edit: Funnily enough, PC Games Hardware (a German magazine about gaming with a focus on hardware) updated their CPU benchmarks (they test demanding scenes in current games), so they just showed up on their main page again. The benchmark results are the average of the following games:

Anno 2205
Assassin's Creed Syndicate
Crysis 3
Dragon Age Inquisition
F1 2015
Far Cry 4
Starcraft 2 Legacy of the Void
The Witcher 3

And these were the settings they chose for all eight games: 1280 x 720, no AA/AF, minimal or no post-processing (e.g. ambient occlusion or bloom).

If you choose "Gaming" from the left drop-down menu (because we don't want to know the performance in "Applications") and kick out the six fastest Intel CPUs (with 6-10 physical cores and 12-20 threads), you'll get the result that the FX 6300 only delivers 52% of the performance of the i7 6700k (the top CPU with 4 cores/8 threads). And by the way: they chose to halve Starcraft 2's weight in the gaming benchmark, because it's way more CPU demanding than the other games.
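If anyone wants to reproduce such an index, here's a rough Python sketch of a weighted average with Starcraft 2 at half weight, the way PCGH describes it. The fps values below are completely made up for illustration; only the halved weight comes from their article:

[code]
# Illustrative weighted fps average; Starcraft 2 counts at half weight.
# All fps values below are invented for illustration only.
results = {
    "Anno 2205": 62, "Assassin's Creed Syndicate": 55, "Crysis 3": 70,
    "Dragon Age Inquisition": 66, "F1 2015": 90, "Far Cry 4": 75,
    "Starcraft 2 LotV": 48, "The Witcher 3": 58,
}
weights = {game: 1.0 for game in results}
weights["Starcraft 2 LotV"] = 0.5  # halved, as PCGH did

index = sum(results[g] * weights[g] for g in results) / sum(weights.values())
print(f"Weighted average: {index:.1f} fps")
[/code]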

But now I'm really out of this, because it just doesn't make any sense.
Post edited June 20, 2016 by real.geizterfahr
real.geizterfahr: where an i7 6700 gets 113 fps, while a FX 6350 gets only 58 fps
You understand how silly your argument sounds. You keep arguing percentages when the absolute numbers show how perfectly adequate the FX 6350 is for gaming.

You have not shown one bit of evidence that the FX 6350 has lost any of its gaming power over the years. In fact you keep referring to Starcraft 2, a game that's older than the CPU itself, as the ultimate example of its slowness. On the other hand the TechSpot article I linked to clearly showed how badly modern games run on yesteryear's GPU's.
Post edited June 20, 2016 by ET3D
ET3D: You understand how silly your argument sounds. You keep arguing percentages when the absolute numbers show how perfectly adequate the FX 6350 is for gaming.
This is adequate for gaming?

edit:
You don't get what I'm saying... The percentages are meant to show how much this CPU will hold back a high end GPU. Instead of getting a $700+ GTX 1080 that'll only deliver 50-75% of its performance (depending on the game) but still manages 60 fps, you could get an i7 and a GTX 980 Ti (which is dropping towards $400 right now) instead - or wait for the RX 480 (which should manage 60 fps in most games too). This'll cost you the same (or even be cheaper), deliver 60 fps in every game (not just in selected ones) AND the i7 won't stand in your way when you want to upgrade your GPU again in two years.

You have to look at percentages and absolute fps at the same time. Of course Far Cry 4 will be playable at 65 fps. But that was only 75% of the possible performance of the GTX 980 (non-Ti!!!). In Arkham Knight you already needed a more powerful GTX 980 Ti to stay above 60 fps (68). Would you want to upgrade again, now to a 1080, to keep your 60 fps in the new Battlefield? Going from high end GPU to high end GPU, spending $500 (GTX 980) to $700 (1080) every seven or eight months, just to stay above 60 fps? And all this when a GTX 780 (paired with a good CPU!) would already be enough to get 60 fps out of Far Cry 4 and Arkham Knight (look at the benchmarks)?
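To make "percentages and absolute fps at the same time" concrete, here's the arithmetic in two lines of Python, using the Far Cry 4 figures above (the 75% being the share of the 980's potential that actually arrives):

[code]
# Far Cry 4 figures from above: GTX 980 behind the FX 6350.
observed_fps = 65  # what you actually get
gpu_share = 0.75   # share of the 980's potential delivered

implied_potential = observed_fps / gpu_share
print(f"What the 980 could do behind a faster CPU: ~{implied_potential:.0f} fps")  # ~87
[/code]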

You're taking phrases out of context, you're cherry-picking benchmarks (The Division and RotTR, the two games that are easiest on the CPU this year!) and you're dismissing games that are older than the CPU itself (StarCraft 2) to prove that a FX 6350 is "a perfectly adequate gaming CPU". Have fun spending your money on a new high end GPU twice a year, while other people get the same 60 fps with a three or four year old CPU (any 3rd gen i7 or newer) and an almost two year old mid range GPU (970 - 71 fps in Far Cry 4, 66 in Arkham Knight) or a more than three year old high end GPU (780 - 60 fps in Far Cry 4, 60 in Arkham Knight).
Post edited June 20, 2016 by real.geizterfahr
Yes!

You really don't get it. Someone who bought an FX-6350 did it knowing up front they'd be getting 30+ FPS, not 60+ FPS, and that they would still get 60+ FPS in some games. That was true when it was released, and it's still true now. The CPU's positioning hasn't changed, and based on that there's no reason to assume that it will change in the near future. (I'm deliberately ignoring Zen for the moment.) There has been no need to replace it to keep the same level of performance, and therefore there's no need to replace it now.

Any GPU bought, however, whatever its level, has become much more of a bottleneck over that time, and therefore requires regular upgrades, assuming the user wants to keep the same frame rate (30+ FPS with 60+ FPS in some games) with the same quality level (i.e., without having to dial down quality significantly). The choice of GPU is then based on longevity (how often one wants to upgrade) and desired visual level.

I've created a simple graph in Paint to illustrate the difference in performance over time between a high end (blue) and mid range (green) card. For the first, the CPU is a serious bottleneck, so it doesn't reach its maximum potential. Over time games become more demanding, and for a while the CPU is still the bottleneck, so the card still provides the maximum FPS the CPU allows (it might drop a little; I didn't attempt to make this accurate). After a while games will become too much for the GPU and it will start becoming the bottleneck, dropping in performance to a point where it needs to be replaced.

With the mid range card, one that's a match for the CPU when bought, it will quickly become less adequate because it doesn't have the spare power for newer games. Over time it will reach a point where its performance is inadequate for new games and it will need to be replaced. Assuming the new card is again one matching the CPU (with the mid range now about as powerful as the high end of yesteryear), it will behave similarly.

It's possible that the single high end card is more expensive than the two mid range cards, but over time it provided better frame rates, so that extra may be worth it.
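In case the attachment doesn't load, here's the same idea as a toy calculation in Python. Every number below is invented purely for illustration; the shape of the output, not the values, is the point:

[code]
# Toy model of the attached graph: delivered fps is the minimum of the
# fps the CPU allows and the fps the GPU manages as games get heavier.
# All numbers are invented for illustration.
cpu_cap = 60                                  # fps ceiling set by the CPU
gpu_raw = {"high end": 120, "mid range": 60}  # GPU fps at release
demand_growth = 1.3                           # games ~30% heavier per year

for tier, raw in gpu_raw.items():
    print(tier)
    for year in range(5):
        gpu_fps = raw / (demand_growth ** year)  # GPU side drops over time
        print(f"  year {year}: {min(cpu_cap, gpu_fps):.0f} fps")
[/code]

The high end card sits at the CPU's ceiling for the first few years and only then starts to fall; the mid range card starts falling immediately. That's exactly the shape in the graph.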
Attachments:
fps.png (3 Kb)
ET3D: ...
You're still ignoring the fact that you HAVE to get a current high end GPU to reach 60 fps with a weak CPU. With a four year old i7 you'd only need a more than three year old high end GPU (GTX 780) or an almost two year old mid range GPU (GTX 970) to get 60 fps in ALL games. But yes, you're absolutely right. The FX 6350 is perfectly adequate for selected games if you always buy the newest $500+ GPU every 6, 7 or 8 months (780, 780 Ti, 980, 980 Ti, 1080, ...). And that's exactly what
real.geizterfahr: It's pretty much a waste of money
means. Getting the newest high end GPUs to overcome your weak CPU is effin stupid. The RX 480 8 GB is $229 and will be powerful enough to run new games in maximum details for at least two years - if you don't have a FX 6350 (even the far more powerful 980 Ti can't manage that anymore with that CPU). Some weird self-drawn graph won't change this.

Answer whatever you want... I'm out of this now. You can't argue with someone who thinks a FX 6350 is perfectly adequate for gaming because a 4K and VR capable, top notch high end GPU gets 60 fps out of it at 1080p in selected games.
Post edited June 22, 2016 by real.geizterfahr
real.geizterfahr: The RX 480 8 GB is $229 and will be powerful enough to run new games in maximum details for at least two years
See the flexibility of your mind? Now you're going for a $230 GPU instead of a $150 one. Great to see you coming around.
real.geizterfahr: The RX 480 8 GB is $229 and will be powerful enough to run new games in maximum details for at least two years
ET3D: See the flexibility of your mind? Now you're going for a $230 GPU instead of a $150 one. Great to see you coming around.
See what I meant with
real.geizterfahr: You're taking phrases out of context
?

The phrase you quoted goes:
real.geizterfahr: The RX 480 8 GB is $229 and will be powerful enough to run new games in maximum details for at least two years - if you don't have a FX 6350
And just to remind you: I always went for an RX 480 in my case and recommended the next smaller model in Crackpot's case:
real.geizterfahr: But that's not what Crackpot wants. She doesn't plan to change the PSU (which would be necessary for a FX-9590, since it needs 220W under load!!!), get a new $200+ CPU and a $400+ GPU. She just wants a feasible, economical GPU upgrade. The RX 480 would be a good choice, but it'll already be limited by the CPU (~15%), if AMD's 980 comparison is correct. And it'll cost around €230 (estimated price for Germany, including taxes), which is still a lot of money (even if it's ridiculously cheap for such a beefy GPU). That's why I'd probably choose AMD's GPU that'll be just one step below the RX 480. Maybe it'll still be limited by the CPU, but probably just by ~5%, which is barely noticeable.

In my case, the RX 480 pretty much hits the sweet spot of what my i5 2500k can handle. Everything above that card would start to get limited by my CPU (see the Battlefront benchmark: a 980 Ti would be limited by 10% by my CPU, compared to a system with an i7 6700k).
You see? We're going in circles here. So... Whatever you answer now: I already replied to it in the next post after the one I just quoted.

ps. *click*
Post edited June 24, 2016 by real.geizterfahr
The 480 seems to be what people who don't want to spend too much are aiming for. And it has power comparable to a 980? And only 150 watts of power draw? Damn great if you ask me.
UnrealQuakie: The 480 seems to be what people who don't want to spend too much are aiming for. And it has power comparable to a 980? And only 150 watts of power draw? Damn great if you ask me.
Theoretically it can draw up to 150W, but it'll only need ~110W (different sources leaked this number). Some people (who got cards from AMD for testing) overclocked the reference design to 1400MHz. One (LinusTechTips) even hinted that he managed 1500MHz. The core clock of the reference design is 1266MHz, so those 110W of power consumption without overclocking do sound realistic.
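Just for scale, here's the headroom those leaked clocks imply (simple arithmetic on the numbers above, sketched in Python):

[code]
# Overclock headroom relative to the 1266MHz reference clock.
reference_mhz = 1266
for oc_mhz in (1400, 1500):
    gain = oc_mhz / reference_mhz - 1
    print(f"{oc_mhz} MHz = +{gain:.1%} over reference")
# -> 1400 MHz = +10.6%, 1500 MHz = +18.5%
[/code]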
real.geizterfahr: ...
You switch context, then.


I still think you got what I meant:

CPU's last for a long time. The FX-6300 is still in the same performance category it was in a few years ago. For someone who's okay with that level, there's no particular point at which it needs to be replaced. It's possible to wait for an inflection point in the market.

For a GPU, it's worth getting something somewhat more powerful to last longer. I don't really advocate getting the top of the line GPU for a low end CPU, I just advocate getting something over what you think is the current best match, on the assumption that the CPU won't be replaced.


You wouldn't have stayed with an i5-2500K if you didn't think like I do regarding CPU's. The only reason I see that you don't see eye to eye with me is that you consider the FX-6300 not good enough, and that's fine; it's just your personal opinion. What I pointed out is that the FX-6300 is about as adequate today as it was at release (when it got sub-30 FPS in Shogun 2, for example), so for someone who was willing to live with it until now there's no pressing need to move on.

And as I keep qualifying, I think that Zen could change things, if it delivers. In two years or so the low end / mid range CPU landscape could be quite a bit more powerful than it is now, and that would change what's worth getting and what game devs aim at.
So I have an older-model, lower-clocked i5. I doubt it's one of the super fancy chipsets. I game in 1080p and I'm content with that.

I've been considering GPU upgrades for a while now and I keep looking at the GTX 960. Is this a solid choice? I just play the stuff I get here, so for the moment the highest end stuff I'd be playing would be No Man's Sky. Maybe Witcher 3 at some point?

Is this card where I should be? Is it "too much card" for my CPU? Too little card? (Assuming I'm going to stick with my current setup for the near / medium term, I'm going to be "forcing" any new games I get here through the GPU - should I aim higher to allow me to "keep up" with new GOG releases?) Is there a better nVidia option out there in terms of price / performance?

Why are 780s so much more expensive - what am I missing? Finally, for my setup, is 4 GB where I should be? Or was 4 GB only useful for people who were using these cards to push tons of textures, something my current rig isn't going to need to do anyway? Or, because I'm going to be "force running" games with the GPU rather than relying on min / maxing my CPU / GPU tandem, should I definitely get the 4 GB?

This is all up in the air, just sort of spitballing hypothetical stuff here LOL.