Sooo, back to the original question: how do I check power consumption and performance? Just by reading some specs?

I play on a laptop (a couple of laptops, actually), so I guess I don't have to provide the wattage of each and every component, just the consumption of the whole unit? To get accurate readings, I guess I could put a kilowatt meter between the wall and the power supply so it shows how much electricity the laptop actually uses, while running Geekbench or similar, which would then give an accurate performance reading?

Then we could accurately rate different PCs in this thread, and the winner gets a trip around the world (at his/her own expense, of course).
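
Something like this could turn the two readings into one comparable number; a minimal sketch, where the wall-meter wattage and the benchmark score are values you would fill in yourself (the systems and figures below are invented for illustration):

# Rough performance-per-watt comparison from a wall-power meter reading
# and a benchmark score. All inputs are hypothetical example values.

def points_per_watt(benchmark_score: float, wall_watts: float) -> float:
    """Benchmark points delivered per watt drawn at the wall."""
    return benchmark_score / wall_watts

# Two hypothetical machines, each measured at the wall while running
# the benchmark (averaged over the run, not the momentary peak).
systems = {
    "laptop":  {"score": 9500.0,  "watts": 65.0},
    "desktop": {"score": 21000.0, "watts": 426.0},
}

for name, s in systems.items():
    print(f"{name}: {points_per_watt(s['score'], s['watts']):.1f} points/W")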
Abishia: your graphics card consumes as much power as my whole system
(240W)

also, this topic ain't about random PC lists but about squeezing the most computing power out of juicy joules
Anything between 12W and 240W. I'm not playing the most demanding games 24/7.

Adding up the numbers says nothing about real-life power usage.

https://versus.com/en/inno3d-geforce-rtx-4060-ti-twin-x2-16gb -> 165W
Post edited October 28, 2023 by teceem
Zimerius: what???? How can you be so careless in this day and age? The ice caps are melting?!?!!
timppu: Depends on how your electricity is produced. (Yes yes, I know you are being sarcastic, but I'll respond anyway.)

Here where I live, 86% of the electric power consumed in 2021 was produced carbon-free (including e.g. wind, water and nuclear power), and 53% of the total was renewable energy (i.e. not including nuclear power).

As for the part that is produced by burning something (e.g. unrecyclable community waste; we don't fill our landfills here with it, we burn it for heat and electricity, even waste from other European countries), that electricity is mostly a by-product of the heating plants. So we don't burn coal, oil, gas or even unrecyclable community waste just to produce electricity.

I've had an electric car for two years now so I've done my part. If my math is correct, I've already saved 456 pandas and 5604 hedgehogs by driving an EV, except those couple of hedgehogs I ran over. I haven't saved any Sumatran gorillas because I hate the f*ckers, obnoxious hairy beasts.

(If you are thinking "Wait, but there are no gorillas in Sumatra?"... now you know why.)
Why do people tie energy consumption directly to emissions? I couldn't care less about that; what I do care about is my wallet.

I have to pay €0.41 per kWh, so reducing consumption is money directly in my pocket.
Compared to my system, the dude with his 426W computer uses 147W more per hour (that's about 7 hours per kWh), so if both systems run 14 hours a day I save almost 1 euro, meaning after a full year I've already saved roughly 365 euros just by using a less power-hungry computer. In theory, pushed forward, my computer is free in 4 years.

So keeping an eye on consumption matters.
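
For anyone who wants to redo that arithmetic with their own numbers, here is a minimal sketch; the wattage difference, tariff and daily hours are simply the figures quoted in the post above, not measurements:

# Electricity-cost savings from running a lower-power PC.
# All constants are the illustrative figures from the post above.

PRICE_PER_KWH = 0.41   # euros per kWh
WATTS_SAVED   = 147.0  # difference between the two systems, in watts
HOURS_PER_DAY = 14.0   # assumed daily runtime

kwh_saved_per_day = WATTS_SAVED * HOURS_PER_DAY / 1000.0
daily_savings     = kwh_saved_per_day * PRICE_PER_KWH

print(f"{kwh_saved_per_day:.2f} kWh/day saved")
print(f"€{daily_savings:.2f}/day, €{daily_savings * 365:.0f}/year")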

Abishia: your graphics card consumes as much power as my whole system
(240W)

also, this topic ain't about random PC lists but about squeezing the most computing power out of juicy joules
teceem: Anything between 12W and 240W. I'm not playing the most demanding games 24/7.

Adding up the numbers says nothing about real-life power usage.

https://versus.com/en/inno3d-geforce-rtx-4060-ti-twin-x2-16gb -> 165W
110W, according to this source:
https://www.tweaktown.com/reviews/10461/inno3d-geforce-rtx-4060-twin-x2/index.html

Energy consumption is always about the average;
peak watts last only microseconds.
Post edited October 28, 2023 by Abishia
Zimerius: what???? How can you be so careless in this day and age? The ice caps are melting?!?!!
timppu: Depends on how your electricity is produced. (Yes yes, I know you are being sarcastic, but I'll respond anyway.)

Here where I live, 86% of the electric power consumed in 2021 was produced carbon-free (including e.g. wind, water and nuclear power), and 53% of the total was renewable energy (i.e. not including nuclear power).

As for the part that is produced by burning something (e.g. unrecyclable community waste; we don't fill our landfills here with it, we burn it for heat and electricity, even waste from other European countries), that electricity is mostly a by-product of the heating plants. So we don't burn coal, oil, gas or even unrecyclable community waste just to produce electricity.

I've had an electric car for two years now so I've done my part. If my math is correct, I've already saved 456 pandas and 5604 hedgehogs by driving an EV, except those couple of hedgehogs I ran over. I haven't saved any Sumatran gorillas because I hate the f*ckers, obnoxious hairy beasts.

(If you are thinking "Wait, but there are no gorillas in Sumatra?"... now you know why.)
Yea, yea, yea... you guys in the north are the poster boys of renewability... all of Europe, and especially us Dutch, should take an example from the northerners.

But...

Also think about heat.
Post edited October 28, 2023 by Zimerius
my epeen is bigger than your epeen
Themken: All I will say is I think it is silly how we live in a world where graphics cards are now allowed to draw up to 600W and PC processors in excess of 300W.
As long as you do not need to compensate for it with the amount of food you eat, it is probably still not "too much". What rather worries me is the increasing number of people "overestimating" their ability to maintain certain luxury goods, which may lead to a financial crisis in some more critical spots (an addict is obviously unable to see clearly). But as long as it is truly affordable without taking a hit in critical spots, all will be good.

Anyway, people should stick to the rules of the topic... I know you enjoy being "creative", but ultimately everyone has different interests, and many people simply enjoy their "well-tuned system" enough to want to tell others about it... why not.

Ah okay... me too:

CPU: 7800X3D
GPU: MSI 3090 Ti Gaming X Trio
RAM: 64 GB DDR5 @ 2x 5600/36 (Hynix M)
MB: MSI X670E Ace
OS SSD: Kingston Fury 2 TB
Game SSD: Samsung 990 Pro 2 TB
Game SSD 2: Kingston Fury 4 TB

I can't tell the exact power consumption of any part... I think it would take a miracle to know it exactly. Especially because some parts, for example a "bursting" SSD, can draw maybe 5 W or even more during peak data transfer, but as soon as the SSD is done, consumption drops to close to none.

The other discrepancy is "full load" vs. "typical 3D load using FPS sync" (in my case currently 60 FPS). Of course the difference can be big, but usually I try not to use all the resources the GPU has, and the CPU is never at full load during 3D rendering anyway... except when I load a demanding game... which is a rare "burst condition".

So ultimately it is nearly impossible to "track down every single part's real consumption". What can clearly be said: SSDs and HDDs, and even the RAM, use almost no power... so nothing I could "show off" in a positive way... there's simply not much sense in tracking it.

What really matters are the processors, and to some extent the motherboard can draw a good amount of power too (probably more than all the drives added together, including RAM). Remember: even a single USB-C port can sometimes deliver up to 100 W, so the MB needs to be able to "hand out a lot of watts under certain circumstances".

Well, we can now do some fun maths here: the load that is THEORETICALLY possible if my system is pushed to the maximum (by any means... abusive and never realistic, yet not damaging in the short term).

Okay, let's try to guess the highest power drain possible:

GPU: 455 W (highest ever measured)
CPU: 100 W (not very sure, but I was never able to detect more than this)

The motherboard is special, because in this case I will attach a fast-charging power bank to every single free USB-C port (mouse and keyboard are already taken)... and those 10 Gbit ports can in theory provide up to 100 W each. Now, I am not sure how much the system could handle if every port is "abused" at once... but a high-end MB has excessive capabilities, so a lot of abuse is possible. I will reduce the output to 60 W each, which should be possible without overload. So my maths would be: basic consumption 100 W + 9x 60 W on every 10 Gbit USB-C port = 540 W, so finally we have 640 W (I know, I am dirty rich, so I have to charge 9 power banks at once...).

Motherboard: 640 W
Drives (SSD, HDD): 50 W
RAM: 10 W

Finally = 1255 W, the theoretical maximum load possible. Practically, I cannot tell... no lab here.

Does it work? In theory, yes: the PSU is a 1300 W Seasonic Titanium (the Rolls-Royce in IT terms), which can safely provide up to 1500 W before it may shut itself down.

However, the realistic 3D load using a 60 FPS sync is more like this:

GPU: 300 W
CPU: 50 W
Motherboard: 100 W (all the coolers included)
Drives (SSD, HDD): 20 W
RAM: 5 W

= in real terms 475 W

So you see, people, it is not that crazy as long as the system is used in a "fine manner"... not even on hardware that could theoretically make some wall cables burn.

And performance, well... I think it ranks in 3rd place (behind the 4090 and 7900 XTX; no other hardware should outpace it).
Post edited October 28, 2023 by Xeshra
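
As a quick cross-check of those two budgets, here is a minimal sketch that simply re-adds the per-component figures from the post and compares them with the PSU rating (the numbers are the post's own estimates, not measurements):

# Re-adding the per-component estimates above (theoretical max vs.
# realistic 60-FPS-synced load) and checking PSU headroom.

PSU_RATED_W = 1300  # the Seasonic Titanium's nameplate rating

theoretical = {"GPU": 455, "CPU": 100, "Motherboard": 640,
               "Drives": 50, "RAM": 10}
realistic   = {"GPU": 300, "CPU": 50,  "Motherboard": 100,
               "Drives": 20, "RAM": 5}

for label, budget in (("theoretical max", theoretical),
                      ("realistic 60 FPS", realistic)):
    total = sum(budget.values())
    print(f"{label}: {total} W "
          f"({100 * total / PSU_RATED_W:.0f}% of the {PSU_RATED_W} W PSU)")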
amok: my epeen is bigger than your epeen
Or someone builds an even better system with even less wattage drain.
This forum is targeted at gamers; did you expect to find topics about cars or something?
Xeshra:
Please go ahead and use this calculator:

https://outervision.com/power-supply-calculator


amok: my epeen is bigger than your epeen
Abishia: Or someone builds an even better system with even less wattage drain.
This forum is targeted at gamers; did you expect to find topics about cars or something?
That is nigh on impossible! :-p

With the 7800X3D you already have almost the 'best' gaming CPU out there.....
Post edited October 28, 2023 by Zimerius
Themken: All I will say is I think it is silly how we live in a world where graphics cards are now allowed to draw up to 600W and PC processors in excess of 300W.
Xeshra: As long as you do not need to compensate for it with the amount of food you eat, it is probably still not "too much". What rather worries me is the increasing number of people "overestimating" their ability to maintain certain luxury goods, which may lead to a financial crisis in some more critical spots (an addict is obviously unable to see clearly). But as long as it is truly affordable without taking a hit in critical spots, all will be good.

Anyway, people should stick to the rules of the topic... I know you enjoy being "creative", but ultimately everyone has different interests, and many people simply enjoy their "well-tuned system" enough to want to tell others about it... why not.

Ah okay... me too:

CPU: 7800X3D
GPU: MSI 3090 Ti Gaming X Trio
RAM: 64 GB DDR5 @ 2x 5600/36 (Hynix M)
MB: MSI X670E Ace
OS SSD: Kingston Fury 2 TB
Game SSD: Samsung 990 Pro 2 TB
Game SSD 2: Kingston Fury 4 TB

I can't tell the exact power consumption of any part... I think it would take a miracle to know it exactly. Especially because some parts, for example a "bursting" SSD, can draw maybe 5 W or even more during peak data transfer, but as soon as the SSD is done, consumption drops to close to none.

The other discrepancy is "full load" vs. "typical 3D load using FPS sync" (in my case currently 60 FPS). Of course the difference can be big, but usually I try not to use all the resources the GPU has, and the CPU is never at full load during 3D rendering anyway... except when I load a demanding game... which is a rare "burst condition".

So ultimately it is nearly impossible to "track down every single part's real consumption". What can clearly be said: SSDs and HDDs, and even the RAM, use almost no power... so nothing I could "show off" in a positive way... there's simply not much sense in tracking it.

What really matters are the processors, and to some extent the motherboard can draw a good amount of power too (probably more than all the drives added together, including RAM). Remember: even a single USB-C port can sometimes deliver up to 100 W, so the MB needs to be able to "hand out a lot of watts under certain circumstances".

Well, we can now do some fun maths here: the load that is THEORETICALLY possible if my system is pushed to the maximum (by any means... abusive and never realistic, yet not damaging in the short term).

Okay, let's try to guess the highest power drain possible:

GPU: 455 W (highest ever measured)
CPU: 100 W (not very sure, but I was never able to detect more than this)

The motherboard is special, because in this case I will attach a fast-charging power bank to every single free USB-C port (mouse and keyboard are already taken)... and those 10 Gbit ports can in theory provide up to 100 W each. Now, I am not sure how much the system could handle if every port is "abused" at once... but a high-end MB has excessive capabilities, so a lot of abuse is possible. I will reduce the output to 60 W each, which should be possible without overload. So my maths would be: basic consumption 100 W + 9x 60 W on every 10 Gbit USB-C port = 540 W, so finally we have 640 W (I know, I am dirty rich, so I have to charge 9 power banks at once...).

Motherboard: 640 W
Drives (SSD, HDD): 50 W
RAM: 10 W

Finally = 1255 W, the theoretical maximum load possible. Practically, I cannot tell... no lab here.

Does it work? In theory, yes: the PSU is a 1300 W Seasonic Titanium (the Rolls-Royce in IT terms), which can safely provide up to 1500 W before it may shut itself down.

However, the realistic 3D load using a 60 FPS sync is more like this:

GPU: 300 W
CPU: 50 W
Motherboard: 100 W (all the coolers included)
Drives (SSD, HDD): 20 W
RAM: 5 W

= in real terms 475 W

So you see, people, it is not that crazy as long as the system is used in a "fine manner"... not even on hardware that could theoretically make some wall cables burn.

And performance, well... I think it ranks in 3rd place (behind the 4090 and 7900 XTX; no other hardware should outpace it).
Your graphics card alone already uses 406W to 450W:
https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/29.html

The CPU consumes 78W; I've got the same one.
I use 25W for the mainboard (no specs available), so I have to do some guesswork on this.
Total drain is around 520W.

Your PSU loss is also not accounted for; considering you're under half load, you lose efficiency (around 7%).
Your computer would consume around 556W. Still, that's not bad if it's in 3rd place in the charts.
Nah, the Titanium has about 5% loss at around 500 W, even below half load. At the perfect spot it has around 96% efficiency, at the 600-700 W mark (550 would be very close). I do not expect it to need more than 500 W, especially because the MB will probably need less than the 100 W; I was pretty "high" with that guess. However, an X670E chipset is not low-power... two chipsets at once need some juice. Of course, it is currently top-notch when it comes to capabilities.

Even if the GPU becomes a bit more hungry and hits the 350 W mark, it barely matters, because at that point the system drives itself toward the "sweet spot", which brings efficiency very close to "only" 4% loss. The sweet spot is around 600-700 W, and with the GPU at 350 W the system may hit the 600 W mark, nearly perfect. A very demanding game (Cyberpunk, Witcher 3) may even demand 400 W from the GPU; however, the system will then hit the 650 W mark, the perfect spot, so I don't sacrifice much.
Post edited October 28, 2023 by Xeshra
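
Wall draw and DC load are easy to mix up in this back-and-forth; here is a minimal sketch of the relationship, noting that a loss compounds as division by efficiency rather than multiplication of the load (the efficiency figures are rough assumptions from the two posts above, not vendor data):

# Wall-socket draw implied by a DC component load and a PSU efficiency.

def wall_draw_watts(dc_load_w: float, efficiency: float) -> float:
    """Power drawn at the wall for a given DC load (efficiency in 0..1)."""
    return dc_load_w / efficiency

# The thread's ~520 W component estimate at ~7% vs. ~5% loss:
for eff in (0.93, 0.95):
    print(f"520 W load at {eff:.0%} efficiency -> "
          f"{wall_draw_watts(520, eff):.0f} W at the wall")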
Abishia: I have to pay €0.41 per kWh
Is that normal there?

I pay maybe 12c/kWh including taxes etc., and in a couple of places I can charge my electric car for a little over 6c/kWh (that is extraordinarily cheap for a public charger, but hey, the charging fee is 2 cents/minute, and since my car can AC charge at 22kW (most EVs reach only 11kW AC charging speed), the price per kWh goes down to that level...), and for free at work, of course (for now; this will change at some point, when the tax man starts considering free charging at work as extra salary on which one has to pay some extra taxes...).
Post edited October 28, 2023 by timppu
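
The per-minute-to-per-kWh conversion in that parenthesis works out like this; a minimal sketch using the figures quoted in the post (a 2 c/min fee, at 22 kW vs. the more common 11 kW AC charging speed):

# Converting a per-minute charging fee into an effective price per kWh.

FEE_CENTS_PER_MIN = 2.0  # the charging fee quoted in the post

def cents_per_kwh(charge_power_kw: float) -> float:
    """Effective c/kWh when paying per minute at a given charging power."""
    return FEE_CENTS_PER_MIN * 60 / charge_power_kw

for kw in (22.0, 11.0):  # 22 kW AC (this car) vs. the usual 11 kW
    print(f"{kw:.0f} kW -> {cents_per_kwh(kw):.1f} c/kWh")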
Abishia: I have to pay €0.41 per kWh
timppu: Is that normal there?

I pay maybe 12c/kWh including taxes etc., and in a couple of places I can charge my electric car for a little over 6c/kWh (that is extraordinarily cheap for a public charger, but hey, the charging fee is 2 cents/minute, and since my car can AC charge at 22kW (most EVs reach only 11kW AC charging speed), the price per kWh goes down to that level...), and for free at work, of course (for now; this will change at some point, when the tax man starts considering free charging at work as extra salary on which one has to pay some extra taxes...).
It's considered cheap; prices are around €0.39 to €0.75 per kWh, depending on contract duration and when you sign the contract.
Post edited October 28, 2023 by Abishia
Abishia: Your graphics card alone already uses 406W to 450W:
https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/29.html
It can even go a bit higher than Furmark with a "game" with full RT enabled... to a bit above 450 W. So far only Metro Exodus and Cyberpunk have been able to "pass" the 450 W mark.

However, as I already said, this is not a typical load; it is rare and special. Typical load is more like 300 W in most fairly modern (above average in demand) games I play, for example Horizon Zero Dawn.

Furmark is not realistic anyway; however... yes, there are two games I own that can match the Furmark load. Sure, the system can handle it, as long as cooling and power are alright, and the system will hit its best efficiency... so this is still not an issue.

Review sites usually test games "unlocked" at the highest FPS possible... no matter "the cost"... this is done in order to measure the real differences between GPUs. In "normal" gaming conditions, though, this is most likely an unnecessary waste. It is way better to sync the FPS to your screen (for big TVs that is either 60 or 120 Hz = FPS), which will also prevent any tearing issues and lower the GPU load. Generally, any game can be played at 60 FPS with the exception of shooters... those may need 120 FPS or more, or your gameplay can suffer.

Of course you can always go higher if there are sufficient resources, but in general new games very quickly match the performance of new GPUs, and thus... 120 FPS is always challenging, while 60 FPS is usually perfectly doable with high-end GPUs. In shooters... the details usually have to be lowered a little... otherwise there is as good as no GPU able to provide 120 FPS or more, with the exception of classic shooters of course.
Post edited October 28, 2023 by Xeshra
Xeshra: Review sites usually test games "unlocked" at the highest FPS possible... no matter "the cost"... this is done in order to measure the real differences between GPUs. In "normal" gaming conditions, though, this is most likely an unnecessary waste. It is way better to sync the FPS to your screen (for big TVs that is either 60 or 120 Hz = FPS), which will also prevent any tearing issues and lower the GPU load. Generally, any game can be played at 60 FPS with the exception of shooters... those may need 120 FPS or more, or your gameplay can suffer.
Some games, particularly those that don't decouple graphics and physics frame rate, may not function properly if the framerate is uncapped. (For example, I hear that Skyrim breaks down if you try this.)

This can also happen at low frame rates, where you're more likely to see things like collision tunneling (an entity moves so fast that it can cross a barrier before the game's collision detection is able to catch it).
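
Decoupling usually means running the physics on a fixed timestep while rendering as often as the display allows; here is a minimal sketch of the classic accumulator pattern (step_physics and render are hypothetical placeholders, not any particular engine's API):

import time

# Fixed-timestep loop: physics always advances in FIXED_DT slices, so an
# uncapped or very low render rate cannot change simulation behaviour.

FIXED_DT = 1.0 / 60.0   # physics tick: 60 updates per second
MAX_FRAME_TIME = 0.25   # clamp long stalls to avoid a spiral of death

def run(step_physics, render):
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += min(now - previous, MAX_FRAME_TIME)
        previous = now
        # Consume elapsed time in fixed slices; a fast entity moves at
        # most (speed * FIXED_DT) per step, which limits tunneling.
        while accumulator >= FIXED_DT:
            step_physics(FIXED_DT)
            accumulator -= FIXED_DT
        render()  # draw as often as the display/vsync allows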
AMD custom APU 0405 with integrated graphics
16GB RAM

10 to 25W at the wall including SSD, screen, speakers and gamepad.
timppu: I've had an electric car for two years now so I've done my part. If my math is correct, I've already saved 456 pandas and 5604 hedgehogs by driving an EV, except those couple of hedgehogs I ran over. I haven't saved any Sumatran gorillas because I hate the f*ckers, obnoxious hairy beasts.

(If you are thinking "Wait, but there are no gorillas in Sumatra?"... now you know why.)
Your post is hilarious, thanks for the laugh. +1
Post edited October 29, 2023 by Dark_art_