johnnygoging: aren't most games single-thread biased? even the ones that are multithreaded are usually only wide enough to use about 4 cores?
and don't most games lean more on floating-point than integer throughput?
benchmarks have time and again shown Piledriver to be a not-insignificant hurdle in modern games.
Lin545: Ok, watch this, this is the Witcher 3 multicore load:
https://youtu.be/Rutk9ErhKG4?t=3m30s Then rewind to the earlier part of the video and you'll notice Bulldozer's single-thread performance falling behind.
Any modern game that's locked to a single thread really deserves to rot, because even Intel often bottlenecks on it and there is no room to distribute the load.
What kind of application you run really plays no role, it's about where and how the load is distributed... Bulldozer is essentially Nehalem (i7 9xx, X56xx) but with a longer pipeline to allow higher clocks, at the price of higher energy consumption and single-thread performance.
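To make the "where and how the load is distributed" point concrete, here's a minimal C++ sketch (the entity count and the update function are made-up placeholders, not anything measured from the video) of splitting one frame's work across however many cores are available instead of leaving it all on one thread:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical per-entity work; stands in for AI/physics/animation updates.
static void update_entity(float& state) { state = state * 1.0001f + 0.5f; }

int main() {
    std::vector<float> entities(1000000, 1.0f);  // made-up entity count
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    // Split this frame's entity updates evenly across worker threads,
    // so the load lands on every core instead of just one.
    std::vector<std::thread> pool;
    const std::size_t chunk = entities.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end   = (w + 1 == workers) ? entities.size() : begin + chunk;
        pool.emplace_back([&entities, begin, end] {
            for (std::size_t i = begin; i < end; ++i) update_entity(entities[i]);
        });
    }
    for (auto& t : pool) t.join();

    std::printf("updated %zu entities on %u threads\n", entities.size(), workers);
}
```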
ECC is a really good option, especially at this price. A single bit flip in the filesystem driver's data will cause corruption; CPU caches and GPU memory are all checksummed, only regular unbuffered RAM is not.
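For illustration of the bit-flip point, a tiny sketch (toy checksum, hypothetical block contents) showing how one flipped bit in an unprotected buffer silently changes what would be written back:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Toy additive checksum; real filesystems use CRCs or stronger hashes,
// but the point is the same: one flipped bit changes the result.
static uint32_t checksum(const std::vector<uint8_t>& buf) {
    uint32_t sum = 0;
    for (uint8_t b : buf) sum = sum * 31 + b;
    return sum;
}

int main() {
    std::vector<uint8_t> block(4096, 0xAB);  // pretend this is a filesystem block in RAM
    uint32_t before = checksum(block);

    block[1234] ^= 0x08;                     // a single cosmic-ray-style bit flip

    uint32_t after = checksum(block);
    std::printf("checksum before: %08x  after: %08x  corrupted: %s\n",
                (unsigned)before, (unsigned)after, before == after ? "no" : "yes");
}
```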
not sure why you're making the case so hard for Piledriver. a newer chip, on a newer platform, with a shit-ton more performance potential when overclocked, and with four fewer logical cores, performs roughly the same as an older chip, on an older platform, with four extra logical cores, in this specific use case. at around that time, those two chips could be had for about the same price.
why would anyone want to buy the 8350? why would anyone, a year later?
and while Piledriver might match the i5 there, largely thanks to CD Projekt Red's well-executed threading model, try any Blizzard game.
saying the application has no impact is completely false. the application is the source of the majority of the variance; how it's built entirely determines how well it will run.
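as a back-of-the-envelope way to see why the application dominates, here's a little Amdahl's-law sketch (the parallel fractions are illustrative guesses, not measurements of any real engine): if the engine only parallelizes a small share of the frame, extra cores barely move the needle, which is exactly the 8-core-FX-vs-4-core-i5 situation.

```cpp
#include <cstdio>
#include <initializer_list>

// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
// where p is the fraction of frame work the engine actually parallelizes.
static double speedup(double p, int cores) { return 1.0 / ((1.0 - p) + p / cores); }

int main() {
    // p values are illustrative guesses, not measurements of any real engine.
    for (double p : {0.3, 0.6, 0.9}) {
        std::printf("parallel fraction %.0f%%: 4 cores -> %.2fx, 8 cores -> %.2fx\n",
                    p * 100, speedup(p, 4), speedup(p, 8));
    }
}
```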
and yes, the FX pipeline is long, but it isn't exceptionally long, and it carries the same high branch-misprediction penalties the Pentium 4 did. besides that, Intel's chips since Nehalem have had a better front-end to the cores, and their decoders cut down the work they have to do by fusing some ops before the decode stage. AMD hadn't done this as of Piledriver, maybe because they couldn't or didn't have the time or something. the result is that two integer cores share a decoder, with less single-thread potential than Intel's.
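to see what those misprediction penalties actually cost, here's the classic sorted-vs-unsorted demo (the array size and threshold are arbitrary): the same loop gets much slower when the branch is unpredictable, and the longer the pipeline, the more each flush hurts. note an aggressive optimizer may turn the branch into a conditional move, which would hide the effect.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

// Sums elements >= 128; on random data the branch is taken ~50% of the time,
// so mispredictions (and the pipeline flushes they trigger) dominate the runtime.
static long long sum_over_threshold(const std::vector<int>& v) {
    long long sum = 0;
    for (int x : v)
        if (x >= 128) sum += x;
    return sum;
}

int main() {
    std::vector<int> data(1 << 24);
    std::mt19937 rng(42);
    for (int& x : data) x = rng() % 256;

    auto time_it = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        long long s = sum_over_threshold(data);
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%s: %lld ms (sum %lld)\n", label, (long long)ms, s);
    };

    time_it("random (mispredict-heavy)");
    std::sort(data.begin(), data.end());  // sorted data makes the branch predictable
    time_it("sorted (predictable)");
}
```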
coming away from the technicals for a second, I've seen enough benchmarks to know that, depending on the game engine, you're gonna go from negligibly less performance to not-insignificantly less performance with FX. you're also gonna use more juice, and have no native platform support for PCIe Gen3, M.2, SATA Express, or USB 3 (boards usually add a third-party controller).
So what I'm saying is: yes, they're good chips, I know they're good chips, but they have issues. As time goes on, those issues only get bigger. And you shouldn't get them if you don't know what you're doing.