
StingingVelvet: I'm sure it was overkill but I just wanted to be sure. That's probably why I was like "IT'S NOT POWERRRRR" lol, cause I knew I spent too much making sure it wasn't. I'll look around for that cable since you think it might be different, but I did buy a Corsair replacement. And yeah I used three discrete cables for the three slots.

I'll look into undervolting, I've seen a lot of people mention that on Reddit and such.
Not sure whether the one you bought separately is different from the 2 you had included (iirc not all Corsair cables have inline caps), but they should be marked as PCIe/GPU cables, plus they are pretty stiff and have a black heat-shrink sleeve under which the capacitors sit. Sometimes the sleeve has an uneven shape or visible bumps because of the caps inside. They are rather useful for GPUs with high transient loads because they dampen those short spikes/ripples, which could otherwise lead to crashes.

Undervolting is a simple thing that's been around for a long time, but since each GPU is different, it takes some trial and error to find the sweet spot for your sample, your build and your needs. I don't have the same card, but there are probably some baseline clocks/voltages/power targets out there on the web/forums, something to give you a head start at least. With a top-of-the-line card you probably don't need to touch the power target at all; set the clocks at stock value (or at the advertised OC clocks the card came with) as a baseline, and undervolt from there.

If you want a simple one-click undervolt, try the automatic one. It's nowhere close to the ideal outcome, but still better than out of the box; note the difference between the stock core voltage and the auto-undervolt voltage. Sometimes the auto result is pretty underwhelming though, like -10 mV. My old Vega was the king of undervolting: -148 mV on the core for the max power state while maintaining the same power target, and boost clocks actually went up under full load because it ran cooler.
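For a rough sense of why that helps, here is a back-of-the-envelope sketch (the 1200 mV stock voltage below is made up, not a measurement): at a fixed clock, dynamic power scales roughly with voltage squared, so even a modest undervolt trims heat, which is what gives boost clocks room to climb on a thermally limited card.

```python
# Back-of-the-envelope sketch: at a fixed clock, dynamic power scales roughly
# with V^2, so an undervolt cuts power and heat noticeably. Numbers are illustrative.

def relative_power(v_new_mv: float, v_stock_mv: float) -> float:
    """Approximate power ratio at the same clock, assuming P ~ V^2."""
    return (v_new_mv / v_stock_mv) ** 2

stock_mv = 1200  # hypothetical stock core voltage in millivolts
for offset_mv in (-50, -100, -148):
    ratio = relative_power(stock_mv + offset_mv, stock_mv)
    print(f"{offset_mv:+d} mV -> roughly {(1 - ratio) * 100:.0f}% less dynamic power at the same clock")
```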
Hmmm... more performance because of less voltage? I guess AMD must be stupid not to use this trick... if it actually works.

I can't speak for the GPU. Regarding Zen 4, it is well known that a lower power target results in a rather minor performance loss but much better efficiency.

The Zen 4 3D parts are a bit different because they are already "power optimized" and run cooler, so it may not help much anymore.

The main reason the power target is generally very high nowadays is that there is hard competition going on and every bit of performance counts. Most customers will never need that last bit, so it is usually worth going down.
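As a crude illustration of why capping power costs so little performance (assuming power grows roughly with f·V² and that voltage roughly tracks frequency near the top of the curve; real chips and workloads differ):

```python
# Crude model: near the top of the voltage/frequency curve, voltage rises
# roughly with frequency, so power grows roughly with f^3 (P ~ f * V^2, V ~ f).
# Only meant to show the shape of the trade-off, not real chip behavior.

def clock_kept(power_fraction: float) -> float:
    """Fraction of the clock you keep when capping power, under P ~ f^3."""
    return power_fraction ** (1 / 3)

for cap in (0.9, 0.75, 0.6):
    print(f"power capped to {cap:.0%} -> roughly {clock_kept(cap):.0%} of the clock remains")
```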

Weird situation, because I remember the time when Intel CPUs were clocked way too low, with high efficiency... simply because there was no competition. Nowadays we are not OCing anymore... rather "clocking it down"... weird.

There are some forums/sites that I think will soon have to be renamed "underclocking forums".

Still, I have to say that the 7900 XTX can overclock pretty well, according to some reviews. But it will need over 400 W, and cooling is the key. I think Sapphire is cherry-picking on some cards, albeit with interesting results.

In Death Stranding at 4K, the 420 W+ overclocked 7900 XTX was able to beat the 4090, which is crazy. So definitely no "less voltage means more FPS"... except in a fairy tale. Yes, some 7900 XTX cards really do OC with fantastic results, but you may hear the card sucking in power... louder than any suckling could manage.

https://www.youtube.com/watch?v=zWtdWczjFdo
Post edited March 11, 2023 by Xeshra
Xeshra: If you had an RM850x (850 W) instead (apparently the old one), it could be insufficient because it only has 4x 8-pin: 2x8 for the CPU and 2x8 for the GPU, which may not be enough on many boards. In general 850 W is not safe (the minimum is not the same as safe or good) for a 7900 XTX, but if you have 1000 W now, it is certainly "good to use"!
I had a 750 with my old 3070, so I knew I needed something new. The 1000 was only like $40 more than the 850 iirc.
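For a rough sense of the headroom involved, here is a ballpark sketch with purely illustrative figures: a reference ~355 W 7900 XTX, a ~230 W package-power Zen 4 CPU and a guessed allowance for the rest of the system; partner OC cards and transient spikes push the GPU number higher, which is why the extra margin matters.

```python
# Ballpark PSU headroom check with rough, assumed figures (not measurements).

gpu_w, cpu_w, rest_w = 355, 230, 75   # assumptions for this kind of build
sustained = gpu_w + cpu_w + rest_w

for psu_w in (850, 1000):
    spare = psu_w - sustained
    print(f"{psu_w} W PSU: ~{sustained} W sustained -> {spare} W spare "
          f"({sustained / psu_w:.0%} load) before counting GPU transient spikes")
```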


Spectrum_Legacy: Not sure whether the one you bought separately is different from the 2 you had included (iirc not all Corsair cables have inline caps), but they should be marked as PCIe/GPU cables, plus they are pretty stiff and have a black heat-shrink sleeve under which the capacitors sit. Sometimes the sleeve has an uneven shape or visible bumps because of the caps inside. They are rather useful for GPUs with high transient loads because they dampen those short spikes/ripples, which could otherwise lead to crashes.
I found the OG so it doesn't matter now, but thanks for the advice. They look identical but I swapped them anyway. Somehow the third one got into my old PSU's box, I guess while I was switching parts. I'll return the cable to Best Buy after work on Monday.
Post edited March 11, 2023 by StingingVelvet
Xeshra: There are some forums/sites that I think will soon have to be renamed "underclocking forums".
Don't forget about the 'why ultra settings' movement, and 244 Hz is the new flat Earth society ;)
Zimerius: Don't forget about the 'why ultra settings' movement, and 244 Hz is the new flat Earth society ;)
I'll use ultra when I can obviously, but 90fps is kind of the new 60fps for me, and I'd sacrifice ultra and ray tracing for it. It's just so much smoother. That's one reason I decided to get a more powerful card this time around.
Well, the difference from 30 to 60 Hz is insane, the difference between 60 and 120 Hz is still good, and above 120 Hz the gain is, in my mind, not so big anymore, although the picture may clearly still improve in moving-picture sharpness. However, how much you gain totally depends on the game. Nowadays I surely recommend 120 Hz, but going above that is rarely critical.
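The frame-time arithmetic behind that diminishing-returns feeling, as a quick sketch:

```python
# Each doubling of the refresh rate halves the frame time, but the absolute
# gain shrinks with every step.

previous_ms = None
for hz in (30, 60, 120, 240):
    frame_ms = 1000 / hz
    note = f" ({previous_ms - frame_ms:.1f} ms faster than the step before)" if previous_ms else ""
    print(f"{hz:>3} Hz -> {frame_ms:.1f} ms per frame{note}")
    previous_ms = frame_ms
```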

A wrong statement: that you have to add some "fake blurring" because your eyes will always see blurred pictures when moving. This is wrong because your brain is actually the culprit here, not the eyes. Additionally, your brain will simply see as much sharpness as it is capable of, and it has many tricks for doing so (for example, it has a small spot of increased sharpness/movement detection, while only a little bit outside of it detection may decrease a lot... the brain is special in many ways, not easy to simulate at all). There is no need to "fake" the blurring in any way, and the brain can in any case still "see" much more sharpness than some scientists may have in mind. Just never fake anything; let the brain do its job with all the picture quality possible.

Although my old plasma was not the culprit when it comes to "moving-picture sharpness": when I run X-Blade at 400 FPS, which is possible even on a 60 Hz TV, the picture is just so much sharper... almost no blurring anymore. The TV has some "lag", but this lag is actually useful, so the missing pictures in between do not create a "lack of information" as long as the GPU is pushing the TV with tons of new frames. Interesting finding, but yeah... even on a 60 Hz TV, more frames can be useful for a sharper picture. Still, the old plasma days are now truly over... this year I will buy a new OLED of the newest spec... for real 120 Hz and even better picture quality.

I dunno how many FPS I actually need for crisp "moving-picture sharpness"; it totally depends on the game as well. I will find out soon enough, and indeed, this is nowadays an important reason to have good processors. If you have spare performance then... sure... you can use RT and max settings, but more important is simply smooth FPS, and I think the time has finally come to step up the FPS for some more smoothness. No need to overkill, but 120 FPS is nowadays simply the sweet spot for many fast-paced games. This is the first time it is even possible with 4K games, because in the past the hardware was simply too weak. A 3090 Ti and up is necessary in my view... the other cards can not really tackle such a demand in "modern games".

Sure, there is not much use in sacrificing huge GPU power for something you may barely notice or not notice at all, for example RT quality or other settings you may not even perceive. Settings should be reasonable and should give you a better playing experience. If there is a visible lack of quality, then push the required setting... if no difference can be detected... then why waste performance on it? That's the point of strong hardware: to boost the stuff that is actually noticeable to you, gaining the best possible experience.

The reason the specs in the past were much lower is simply that we were technically limited, so we had to fake a lot of things, and scientists tried to find good explanations for why we do not need more than that. The brain compensates for "the lack", and that's why they came to such conclusions. Wrong, because the brain actually carries a big load by "smoothing out" all the fakery it gets... which can ultimately cause fatigue or even worse. Most people may not be affected, or only over time, but with increased picture quality the fatigue decreases as well. I still love the good old games, but with improved tech where possible. It is a huge amount of work, but this is the way to bring them over to the new era: improving the original purely technically, without any changes made in an artistic way.
Post edited March 11, 2023 by Xeshra
Xeshra: Nowadays I surely recommend 120 Hz, but going above that is rarely critical.
I've been using a 4K/120 Hz screen for half a year now and am still struggling to see what the benefit is, even after getting a 4090 to max it out. This is after playing on 60 Hz screens for many years, and I do play fast action games as well as RTS and RPGs and so on. Personally, I like the large 4K screen a lot more than the 120 Hz refresh; it's barely different from a 60 Hz screen, imo.

And for the OP, high-end graphics does take some effort to enjoy fully, but it gets easier as time passes. I've always chased max visual quality, and even had triple and quadruple SLI back in the day. These days it's easy, you just get the top dog and that's it. I still remember my triple SLI system weighing well over 30 kg with dual watercooling loops, lol, now that was a headache to build and maintain. Now a single 4090 is sufficient to max out pretty much anything on the market, and as power hungry as it is, it doesn't hold a candle, in terms of the maintenance required, to the SLI systems of yesterday.
Post edited March 11, 2023 by anzial
Zimerius: Don't forget about the 'why ultra settings' movement, and 244 Hz is the new flat Earth society ;)
StingingVelvet: I'll use ultra when I can obviously, but 90fps is kind of the new 60fps for me, and I'd sacrifice ultra and ray tracing for it. It's just so much smoother. That's one reason I decided to get a more powerful card this time around.
A while back I tried out different hertz settings for my 144 Hz monitor, and suddenly Warhammer III was this wreck of a game. This lasted for a couple of days before I managed to figure out that 60 fps at 60 Hz delivers a different experience than 60 fps at 100 Hz or 144 Hz does. At least in Warhammer 3.
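One plausible explanation, sketched below under the assumption of a fixed refresh rate without VRR: 60 fps does not divide evenly into 100 Hz or 144 Hz, so frames stay on screen for an uneven number of refresh cycles, which reads as stutter even though the frame rate is the same.

```python
# At 60 fps on a 144 Hz panel there are 2.4 refresh cycles per frame, so frames
# alternate between being shown for 2 and 3 cycles -> uneven frame pacing.

def persistence_pattern(fps: int, hz: int, frames: int = 8) -> list:
    """How many refresh cycles each of the first few frames stays on screen."""
    pattern = []
    for i in range(frames):
        start = round(i * hz / fps)
        end = round((i + 1) * hz / fps)
        pattern.append(end - start)
    return pattern

for hz in (60, 100, 144):
    print(f"60 fps on a {hz} Hz panel: {persistence_pattern(60, hz)}")
```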
Post edited March 11, 2023 by Zimerius
I don't know your previous TV, but the most critical thing is still the overall quality of the TV/monitor. If that is lacking, you can throw any FPS you like at it... there will not be much improvement.

Clearly, the very last plasma from Panasonic was highly underrated, but for me, to this day... still a hidden beast, even for gaming. However, after close to 10 years it is finally showing its age, and now I want to step up to 4K and 120 FPS, but that would be useless if the general picture quality were still lacking. So I will get the best OLED TV too... in order to truly see some gain.

More important is very accurate color reproduction, and many monitors are lacking in this respect... even very expensive "high Hz" monitors. So... color reproduction comes first... it should be close to perfect... all the other stuff is a bonus. 4K first and then 120 Hz second, in my mind. 4K is important because above 50 inches the pixels start to become too large, so more resolution is required, which also allows for increased detail in general. 120 FPS is, I guess, mostly linked to the movement of pictures and to input, especially in FPS shooters; that is where it starts to get useful. If your picture is moving very fast, you need a lot of "changing information", and if you lack frames, there is not enough information to achieve a sharp and crisp picture. Your brain may be able to compensate, so you may barely notice it... or your TV may even be able to interpolate... still, it surely increases the quality, and to many people it is noticeable in moving pictures. A static picture does not matter at all, because the information is not changing... so in theory it works with only 1 FPS, as long as the display does not "turn off" that single frame... else you would simply see a black picture in between... all the time... and your brain would be unable to compensate for such a huge "lack of information" every single second.

Sure, that is not how it works in practice: if there is only 1 FPS, the monitor will still run at 60 Hz and just show you that single frame the whole time... and once more frames come in, it simply starts to get blurry, because in between there is almost no information. Sure, your GPU may then use some interpolation... but those tricks (created by AI) are far from perfect and come with side effects, such as lag.

For static pictures, LCD was a supreme technology because, in theory, it has a "continuous" light source, which is perfect for a static picture. A plasma, however, is a pulsating technology, bad for static pictures but good for moving pictures. OLED is almost the same, with the exception that it can actually reproduce both static and moving pictures in the most accurate way possible. The only shortcoming is simply the rather limited brightness, a trait it shares with plasma. The only real advantage an LCD (or QLED) has is its high brightness for HDR. For me that is too much light... so I can not even appreciate it; better to have the best black levels and the fastest response time. The color reproduction of LCD is always inferior, and blue light "emissions" are still an issue, even today; it simply can not fully compete with OLED or even the very old plasma. Usually LCDs are used for high Hz PC monitors... because they are tuned in a way that allows huge Hz performance. To me... clearly useless as long as the "general quality" is lacking. I am not a competitive gamer who needs every bit of reaction time... I am a cruiser-gamer who enjoys impressive picture quality, and I enjoy exploring the game world.
Post edited March 12, 2023 by Xeshra
Xeshra: The Zen 4 3D parts are a bit different because they are already "power optimized" and run cooler, so it may not help much anymore.
My 5800X3D really benefits from undervolting - I'm using Kombo Strike 3 (which I understand is pretty much a -30 Curve Optimizer undervolt) and I get higher boost clocks and lower temps in games (Far Cry 5 tops out at 78 rather than 88 degrees on a top-flow cooler). Minimal gaming impact when it comes to FPS, but lower power draw, and I'm much less worried about temperature.

For a chip that is designed to run super hot under load, it's a good result.
The 5800X3D designed to run "super hot"? Not sure... probably the hottest AMD chip is the 7950X, I assume. Almost impossible to cool down, even with a micro liquid cooler. However, this CPU can be run in a 120 W or even lower mode with only minor performance loss. I did not do it because, in general, according to AMD, high temperatures are not really an issue, and under gaming conditions, as long as I stay below 90 C, AMD at least does not see any issue.

For the next 2 years I have a warranty... if the CPU really blows up from too much heat while inside AMD's specs... then I will simply RMA it and demand a 7950X3D. Actually, I do not like that only 1 die has the 3D cache. Still... it runs cooler and performs even better. Actually, it would not even be worth it to RMA, because Zen 5 will be released by then and, after a BIOS update, it should run fine in the current system. No risk, no fun, I guess.

Although I have never had a CPU die on me... that would be a first.

So far there have been a lot of burned boards from users, but the 7950X is pretty reliable. Luckily I got the "best supported" AMD board... the X670E Master from MSI, and so far it is a pretty foolproof board.

Intel CPUs nowadays are even hotter... the hottest CPUs ever made.
Post edited March 12, 2023 by Xeshra
Xeshra: The 5800X3D designed to run "super hot"? Not sure... probably the hottest AMD chip is the 7950X, I assume. Almost impossible to cool down, even with a micro liquid cooler. However, this CPU can be run in a 120 W or even lower mode with only minor performance loss. I did not do it because, in general, according to AMD, high temperatures are not really an issue, and under gaming conditions, as long as I stay below 90 C, AMD at least does not see any issue.
Generally under a gaming load, the 5800X3D will run close to its thermal max of 90 degrees unless it has a really good cooling solution. This is basically because the cache is layered on top of the chip.

Something that is expected to run close to thermal max is my definition of "designed to run super hot" - others may have a different definition.
Post edited March 12, 2023 by pds41
I did a test run after roughly 5 minutes of game time at 120 FPS (so I was putting some demand on the CPU, up to 50% on every single core). The GPU was at a bit more than half of its maximum load (wattage is the best indicator: full load is about 430-440 W, so 280 W, roughly 65%, is a bit above average), and the fan was at around 1300 RPM, which is still barely audible.

The CPU fans are not monitored yet, but they are tuned to run even slower than the GPU fans. In theory the 3 CPU fans can spin very fast, but they only ramp up that far once the temperature hits 90 C, which happens very rarely and only for a few seconds, when the CPU is at almost full load. Usually this should be prevented because the system starts to become less responsive, but not even 8 cores are fully utilized in most games, so the CPU is never anywhere near full load.

The trick? Well, it has an awesome 360 mm liquid cooler from Alpenföhn:
https://www.alpenfoehn.de/produkte/wasserkuehlung/gletscherwasser-360-high-speed

https://ibb.co/tL5wC2v

GPU and CPU are both at almost the same temperature. There may be more demanding games, but at 120 FPS with max settings this game is not light on the hardware. The GPU load will, however, increase to almost full load in 4K... in that case the 3 fans will simply spin faster... nothing more will happen at all.

74 C for the GPU and 69 C for the CPU cores (the CPU numbers will not change in 4K because they are FPS-bound). According to AMD, 95 C is "a max allowed 24/7 temperature" for this CPU, so I see no issue at all (why undervolt or underclock?). I think even under full load I have never seen more than about 80 C. The GPU temperature limit is set to 83 C (which is standard); at that point either the fan goes to full speed or the GPU clocks down. 1965 MHz is a high clock on this GPU, so no throttling yet.

Your CPU seems hot to me... indeed, I'm not sure why; I will investigate. Well, designed to run up to 90 C... high, but as expected, lower than the 7950X.

Cyberpunk, full settings (with RT) at 120 FPS; the GPU can still tackle it (4K will be lower than 120 FPS, but in that case I will reduce RT and maybe a few other settings I barely notice). GPU 78 C, 86% load = 350 W, fan not at full speed yet, so still no throttling. CPU 75 C, which is unexpected but still a good value. In 4K not even the 4090 can handle full settings at 120 FPS, so this is not doable today without dirty tricks; some other day.

https://ibb.co/60Ck8VX
Post edited March 12, 2023 by Xeshra
Xeshra: The reason the specs in the past were much lower is simply that we were technically limited, so we had to fake a lot of things, and scientists tried to find good explanations for why we do not need more than that. The brain compensates for "the lack", and that's why they came to such conclusions. Wrong, because the brain actually carries a big load by "smoothing out" all the fakery it gets... which can ultimately cause fatigue or even worse. Most people may not be affected, or only over time, but with increased picture quality the fatigue decreases as well. I still love the good old games, but with improved tech where possible. It is a huge amount of work, but this is the way to bring them over to the new era: improving the original purely technically, without any changes made in an artistic way.
I'd argue that this only really applies when the graphics are reasonably realistic. For games with 2D graphics that are not even close to realistic, I suspect the brain doesn't try to make it realistic and instead interprets it as a semi-abstract representation.

I note that the games most likely to give me headaches for visual related reasons are earlier games with "realistic" 3D graphics, like Daggerfall. (Note that camera movement doesn't help matters; that's one reason why I would rather not use mouselook on this sort of game.) But give me Super Mario Bros. 1-3 (or World), and my brain has no trouble keeping up; in fact, I think it's easier to tell what's going on in these games.
Well I'm having crashing problems again. Hmmmm.

It's hard to google these issues because you get a ton of dumb standard advice articles looking for clicks or to sell you bloatware, and then you get a bunch of people talking about how AMD DRIVERZ SUX, and there's very little actual help.

I've done everything I can think of. I made it 80 minutes last time I played, which is better than 10. I dunno.

Someone suggested "dirty power and old houses" somewhere and I do have an old house.