
avatar
Xeshra: However, it is not wise getting a GPU the system simply is unable to handle... and if so, there is some emergency methods required and proper parts.
I'm sure most people skimmed my rant, so I'm not surprised this keeps being said, but for the record my system was plenty good enough: an 850-watt Corsair PSU with a Gold rating, for example. The only problem was that I didn't have three PCIe cables; the PSU only came with two, and my previous card only needed two.
avatar
StingingVelvet: VENT VENT VENT I AM ANNOYED
Just wait until you get your next electricity bill, with that portable furnace GPU keeping your house/flat/parent's basement warm/hot/scalding.

All this talk about 4K gaming being a "recent thing" does make me smile though, having been playing games at 2560x1600 on a 30-inch monitor since 2011 (OK, not quite 3840x2160, at just under half the pixel count, but not a huge distance away either). And I'm still using the same system (albeit after a few upgrades) with only a couple of hardware hiccups: I was running SLI with a pair of GTX 580s, but one has developed memory issues again, so I'll need to try reflowing it (again).
avatar
dtgreene: One thing: When the computer is mostly idle, or doing basic web browsing (not using anything like WebGL), how much power do these GPUs use?
avatar
Dark_art_: Surprisingly, higher-end GPUs don't actually use a lot of power at idle, provided they're reasonably recent, say 2017 or later.

Rant warning, not even related to the OP: motherboards usually make a bigger difference than GPUs. I don't have comparison data for the AMD X570 chipset aberration, but the X470 chipset with all the bells and whistles uses 10-20W more at idle than a standard B350 (the actual value depends on BIOS settings; CPU undervolted in both cases, with similar memory speeds) with the same components and the same overall performance.
While Ryzen CPUs are very efficient under load compared to Intel, when taking all the PC parts into account they still struggle to match Intel's efficiency in high C-states (idle etc.) and light loads. I've seen plenty of Intel desktops (mostly OEM) idling at 15-20W, even old stuff. I can't recall a single Ryzen-based desktop idling below 25W (talking about typical "gaming" components like i5s and R5s).
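That 10-20W idle difference adds up over a year of uptime. A rough back-of-the-envelope sketch; the electricity rate is an assumed example value, not a figure from this thread:

```python
def annual_cost(extra_watts: float, hours_per_day: float = 24.0,
                price_per_kwh: float = 0.30) -> float:
    """Yearly cost of a constant extra power draw.

    price_per_kwh (0.30, in your local currency) is an assumed
    example rate; plug in your own tariff.
    """
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# A motherboard drawing 15 W more at idle, on around the clock:
print(f"{annual_cost(15):.2f} per year")  # 39.42 per year at 0.30/kWh
```

For a machine that only runs a few hours a day, scale `hours_per_day` down accordingly.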

On the desktop GPU side, AMD was and still is a power hog with multi-monitor setups and video decoding. To be fair, though, the Steam Deck's integrated GPU is remarkably efficient: the i5 in the Surface Pro 7 needs twice as much power to reach similar performance.

Integrated graphics are not always better for power consumption, unfortunately. Intel was stuck on 14nm CPUs for a long time, and the integrated graphics suffered. A good example is laptops with dual graphics (a low-power Intel UHD plus a high-performance Nvidia GPU): in many situations, using the Nvidia GPU gives better overall power consumption than the Intel part alone, even for watching videos...
Do you have any actual figures? Also, any sources? Specifically, has anyone done any extensive testing on power consumption?
avatar
dtgreene: Do you have any actual figures? Also, any sources? Specifically, has anyone done any extensive testing on power consumption?
I think hardware-focused YouTube channels like Hardware Unboxed and Gamers Nexus routinely test both peak and idle power consumption of GPUs. Outside of occasional firmware issues with specific models/manufacturers, things are pretty reasonable. Even desktop GPUs default to adaptive power settings and lower their clocks when not in use, a system we have laptop dGPUs to thank for.
avatar
dtgreene: Do you have any actual figures? Also, any sources? Specifically, has anyone done any extensive testing on power consumption?
TechPowerUp is probably the best site for that, testing idle, video playback, multi-monitor, gaming, and VSync-on separately (efficiency figures are on another page too). Example: the RX 7900 XT. For other GPUs, click "Reviews" at the top, select Category: Graphics Cards, and you can filter them there.
I can definitely relate to the clearance issues. When I bought my first three-fan card (Vega 56), it didn't fit in my case. So I drilled out the rivets holding the hard drive bays in place, and took them out (I don't have mechanical drives anymore anyway). Now I have a 6700XT in there which is even longer. So much space! I paid $40 for this case 10 years ago, and it still serves me well. I've used a cheap 500W power supply all along too.
People: don't cheap out on the PSU, it's just not worth it! For a PSU, the price generally reflects the quality, and more quality means more stability.

Just many weird stories here, to be honest... it's kind of confusing guessing what's actually going on.

For a modern GPU such as the crazy-powerful 7900 XTX (this one, really?) you really should use a beefy PSU: no less than 850 W, and ideally above 1000 W. It may look crazy, but I truly recommend it!

Maybe the system under full 3D load will draw around 500 to 600 W; however, there are also power spikes, and under optimal circumstances your PSU should be rated at 2 times the 3D-load figure... this would be perfect for your system. Never cheap out on a PSU, because a really good PSU from Seasonic or Corsair can last more than 10 years; it will be worth every single cent.
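The "double your 3D load" rule of thumb above can be sketched as a tiny helper. The list of standard retail wattages is my own assumption for illustration, not from this post:

```python
# Common retail PSU wattages (assumed list, for illustration only)
STANDARD_SIZES_W = [550, 650, 750, 850, 1000, 1200, 1300, 1600]

def recommended_psu(peak_3d_load_w: float, headroom: float = 2.0) -> int:
    """Smallest standard PSU satisfying the 2x-headroom rule of thumb."""
    target = peak_3d_load_w * headroom
    for size in STANDARD_SIZES_W:
        if size >= target:
            return size
    return STANDARD_SIZES_W[-1]  # nothing bigger available

print(recommended_psu(550))  # 1200 (W) for a ~550 W full-load system
```

The 2x factor is deliberately conservative; it exists to absorb millisecond-scale transient spikes that sit well above the sustained load.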

I hope no one was sizing things off the idle specs... I'm just asking because you never know. At least it's always funny hearing the stories in today's world; it will never be boring, not a single day.

Sure, my 3090 Ti draws power comparable to an American jet engine, I assume, but the PSU is a 1300 W Titanium unit from Seasonic, basically the Rolls-Royce of power supplies, and it handles this power-hungry beast without any issues. Get a PSU like this one and there is simply nothing that can cause you power trouble anymore. A hassle-free experience, with no time wasted on stability issues... that should be worth a lot of bucks!
Post edited March 08, 2023 by Xeshra
I hate GPUs, as they tend to be the least standard, most supplier-dictated hardware component of most PCs. I try to give these crooks as little of my money as I can.

I guess I'm fortunate enough to mostly play games with low GPU requirements. Got an aging GeForce GTX 1070 for gaming and, short of an impending collapse in the supply chain forcing me towards a desperate upgrade, I plan to ride that thing for as long as I can.

I personally get off on beefy second-hand servers (seriously, I've got a mini server farm here), but it's more of a geeky ops thing than a geeky gamer thing.

Enterprise-grade server machines are works of art (many of them put regular consumer-grade PCs to shame when it comes to the ease of swapping components). However, getting even slightly more familiar with them has made me acutely aware of how badly server manufacturers are scr*wing over the environment by forcing obsolescence: they essentially stop making compatible parts for aging servers, and at some point you have to throw the whole thing away and buy a brand-new one because you can no longer find dependable working parts. Which, imho, is crazy, as these things are so modular that with a bit of know-how you could keep one running for decades by simply swapping broken components, if replacements were still available.
Post edited March 08, 2023 by Magnitus
avatar
StingingVelvet: the card is a bit too long and hits my middle front fan.
Most 3060TI cards fit in my case easily (Fractal Design Define R5), so it was quite a surprise when my MSI 3060Ti Gaming Z Trio arrived. It wasn't easy but I got it installed, with 0 mm to spare*.
At least it's a very cool and quiet card!

*I could remove the upper HDD cage, but I don't want to - there are HDDs in it.
avatar
dtgreene: Do you have any actual figures? Also, any sources? Specifically, has anyone done any extensive testing on power consumption?
As stated, TechPowerUp is a good source for desktop parts, as is Notebookcheck for laptop parts. Both are enough to get a general idea, but the thing is, you have to know how to filter the information (as with pretty much any source), since sites are not always right, complete, or in sync with my findings, or they test in a completely different way. Extensive and accurate power measurement is very time-intensive and not always possible without destructive methods to hook up a scope. There is simply no external way to measure power spikes, and popular outlets won't do it unless there's drama, like the RTX 4090 power connector.

Most of my previous post is actually my own findings, as I always have computers hooked to power meters, including laptops (when the battery is fully charged, they usually respond well to being metered with a mains power meter, with some exceptions). When USB-powered, like Raspberry Pis, they also go through a USB power meter.
Indeed true: power spikes are difficult to measure, because the PSU has to smooth them out, and they mostly become hidden from a usual power meter. The fluctuations happen mostly internally in the PSU, and the capacitors (inside a PSU they are huge; never touch them, it can be deadly) and other parts have to act far faster than any power meter can measure. Even internal diagnostic tools are not truly accurate; it is simply a very hard thing to screen for... The matter is critical, though, because if the PSU cannot handle those power spikes, the system may crash all of a sudden.
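The smoothing effect described above is easy to show numerically: a meter that averages over, say, one second reports a figure nowhere near a millisecond-scale transient. The numbers below are made up purely for illustration:

```python
# Simulated power trace: 1000 samples at 1 ms resolution.
# Baseline 450 W with a single 5 ms transient spike to 900 W.
samples = [450.0] * 1000
for i in range(500, 505):
    samples[i] = 900.0

true_peak = max(samples)
# A slow wall meter effectively averages over its whole window:
meter_reading = sum(samples) / len(samples)

print(true_peak)                 # 900.0
print(round(meter_reading, 2))   # 452.25 -- the spike all but vanishes
```

So a wall meter showing 450 W tells you nothing about whether the rail briefly hit twice that, which is exactly why PSU headroom matters.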
Post edited March 08, 2023 by Xeshra
I'm still planning on getting a 4080 in the future, once I'm OK with the price. Should eventually be a decent upgrade over my 1080Ti. Also planning on staying on a 750W power supply. It "should" be just enough for a 4080.
avatar
Dark_art_: Most of my previous post is actually my own findings as I always have computers hooked to power meters, including laptops (when the battery is fully charged they usually respond well to being metered with a mains power meter, with some exceptions). When USB powered, like Raspberrys, they also go through a USB power meter.
Does this work for smartphones?
avatar
Dark_art_: Most of my previous post is actually my own findings as I always have computers hooked to power meters, including laptops (when the battery is fully charged they usually respond well to being metered with a mains power meter, with some exceptions). When USB powered, like Raspberrys, they also go through a USB power meter.
avatar
dtgreene: Does this work for smartphones?
Most of them, yes, though some have weird charging circuits that skew the readings. As Xeshra pointed out, external power measurement smooths out the reading a lot; to make an analogy with software, it's the same as reading a sensor 1000 times and averaging the result. It works best with continuous loads.
And of course, there are plenty of converters and general losses along the way, so exact values may be a little off, but as a diagnosis and setup tool, power meters are awesome.

Edit: A good example of a weird charging circuit is my old Surface 3 tablet; that thing only pulls power from the battery, and the charging circuit is either on or off.
Note that in some cases, say a smartphone with a 5V 2A (10W) charger, you cannot measure more than 10W with an external power meter, even if the smartphone needs more.
Post edited March 08, 2023 by Dark_art_
My first high-transient GPU was a Vega 56 nano board, rated at 210W, but it would spike to 360W. I would never trust a PSU with third-party cables to support a monster like the 7900 XTX or any similar card. Something would melt sooner or later, and in that case a very good 1000W PSU is the way to go to be on the safe side. Forget the 80+ whatever sticker as a guide for quality/stability though; it's mostly just marketing. I've had an 80+ Gold unit acting like junk, crashing intermittently at power/load shifts, and an 80+ Silver taking it balls to the wall for a decade.

If the case performs better with the side panel(s) open, then there is an airflow issue, or the case volume is really small combined with low airflow. Sad you had a bad experience with it, but if you ever get AMD, you have to undervolt the GPU manually. If you want a plug-and-play, "it just works while supported" kind of deal, get Nvidia, and ideally something with less wattage.
avatar
dtgreene: One thing: When the computer is mostly idle, or doing basic web browsing (not using anything like WebGL), how much power do these GPUs use?
Nothing to be worried about, unless you power your desktop from a battery or depend on solar charging. With a single display, it averages 4W for HBM-equipped cards and ~10W for current GDDR cards; the worst-case scenario is ~20W with a multi-display config. I don't remember when GPUs began to park into eco mode so nicely, but the 2012 Kepler GTX 600 series already parked very well into low power states (I think it had 8-9 of them, depending on load).
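For anyone who wants to check their own Nvidia card's idle draw rather than trust review figures, `nvidia-smi` reports it directly. A minimal sketch; only the parsing is exercised here, and the actual figures will vary per card and driver:

```python
import subprocess

def parse_power_draw(csv_output: str) -> list[float]:
    """Parse the output of
    nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits
    which is one wattage figure per line, one line per GPU."""
    return [float(line) for line in csv_output.strip().splitlines()
            if line.strip()]

def read_gpu_power() -> list[float]:
    """Query the current power draw of all Nvidia GPUs, in watts."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_power_draw(out)

# Example of the text nvidia-smi prints for a two-GPU system:
print(parse_power_draw("9.84\n21.03\n"))  # [9.84, 21.03]
```

Run `read_gpu_power()` with the desktop idle (no 3D apps, no video playback) to see the parked-state draw the figures above describe.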