It will most likely become the "new behemoth" of GPUs, and this is not even its full potential, because the "full chip" would most likely leech a nuclear reactor dry if used in a "server farm". No matter what, the performance is probably so big that AMD is probably already looking "pale", not red anymore. Nvidia itself probably has a new color... not pale or dark green anymore, rather nuclear-poison-green.

https://videocardz.com/newz/nvidia-geforce-rtx-5090-and-rtx-5080-specs-leaked
https://www.youtube.com/watch?v=EMgVlQQ6SCo

So, well, apparently 32 GB of GDDR7, half of my current system RAM (which is way over the top in most, but not all, cases), over a 512-bit bus, yielding a devastating ~1800 GB/s. In comparison: DDR5 system RAM manages about 100 GB/s (dual channel) and a PS5 about 400-600 GB/s.
21760 cores, beating the 16384 cores of a 4090 by a good margin. The PSU will need to provide up to 600 W... currently 450 W is the maximum for either the 4090 or the 3090 Ti. It now uses the PCIe 5.0 interface, which means that, in theory, boards providing it may have an advantage.
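
The bandwidth figure follows directly from the bus width and the rumored pin speed (28 Gbps GDDR7 is an assumption taken from the leaks, not a confirmed spec):

```python
bus_width_bits = 512
pin_speed_gbps = 28     # rumored GDDR7 speed per pin (assumption)

bandwidth_gb_s = bus_width_bits / 8 * pin_speed_gbps   # bytes per transfer x rate
print(f"{bandwidth_gb_s:.0f} GB/s")                    # 1792 GB/s, i.e. ~1800
```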

Overall I would say the performance is +70% vs. the 4090 and +120% vs. the 3090 Ti, yet at the cost of about 40% more power I assume, so the efficiency is... hard to guess, but from those numbers roughly 20% higher. The huge performance increase is surely mainly due to a much bigger TDP and only partially due to better efficiency. No matter how it is done, the fact is... it will demolish any GPU of the past when it comes to raw performance. I guess the performance is so unruly that any CPU will have issues keeping up, so I hope AMD's next-gen CPU is a proper upgrade and not focused on efficiency only.
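
Making that efficiency guess explicit (both multipliers are the speculative figures above, not measurements):

```python
perf_gain = 1.70    # guessed +70% performance vs. 4090
power_gain = 1.40   # guessed +40% power draw

print(f"~{(perf_gain / power_gain - 1) * 100:.0f}% more performance per watt")  # ~21%
```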

Price is probably 2500 USD, but more realistically you will get it at 3500 USD as soon as the scalpers get their hands on it. I do not expect it to be "realistically" available in 2025, rather 2026, with the exception of Porsche drivers. I could be wrong; it is just my speculation here, based on past... crazy... experiences.

I would not really bother with the 5080, because it will be the same situation we had with the 4080, and after many years... there will be an improved 5080 Ti/Super or whatever which is finally "worth it", but this will take an eternity.

The full chip, at 24576 cores, is probably reserved for servers only, on a "Titan GPU", and surely for the personal mining PC of the "big chief", which keeps working for his coins even while he has to clean some more toilets. I suspect Valve may get it too... with so much cash, the big chief is already having a sweet dream.
Post edited September 28, 2024 by Xeshra
My next upgrade is going to be around the 70 series at the soonest. I expect to stay on a 4080 for at least 5 more years, if not longer. Not planning on moving from 1440p and I find 240 Hz to be enough.

Not to mention that eating 600 W of power is frankly quite ridiculous.
Well, in general I use Vsync at 60, so a 3090 Ti generally sits at around 200-300 W in usual gaming scenarios (demanding games only, of course), not the full 450 W. 450 W is a peak value, so the FPS might never drop... kind of a performance buffer which is, if I use my settings properly, 25-50% above the "average value".

With a 5090 it would be the same: the average draw is more like 300-400 W (about a third higher) while offering around 120% more performance. Vsync would be set to 120 or 144 (depending on the native Hz of the TV; 144 is currently the maximum for big TVs) as a new standard, which was previously unrealistic for many modern games at full settings.

It is generally not a good idea to run a game totally "uncapped", because the CPU loses reaction time when it is constantly overloaded with demand. This is not good for responsiveness; ideally a CPU should never be at full load... preferably less than 80%, so it stays responsive.
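
To illustrate the idea of a frame cap, here is a minimal sketch (a toy loop, not how any specific engine or driver implements it): the loop does its frame's work, then sleeps off the rest of the frame budget instead of immediately racing into the next frame.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # 16.67 ms per frame at 60 FPS

def run_capped(simulate_frame, frames=300):
    """Run a loop capped at TARGET_FPS by sleeping off the leftover budget."""
    for _ in range(frames):
        start = time.perf_counter()
        simulate_frame()                        # game update + render work
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                # CPU idles instead of racing ahead

# E.g. a frame that takes 5 ms of work still paces out to 60 FPS:
run_capped(lambda: time.sleep(0.005))
```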

It would have its uses, but yes, the price will be crazy.

Most important is that you can get your games to 60 FPS; 120 FPS is nice to have but not critical, and above 120 FPS is a luxury, with the exception of competitive gaming. Some games may still run fine at 30 FPS (depends on the design and genre), but the difference compared to 60 FPS is surely way too huge. Above 60 there is still a clear difference, but for most gamers it is not critically useful anymore, and yes... 120+ is simply for competition: 99% of gamers get no benefit other than feeling good or being able to brag. There are heavily diminishing returns above 120 (a lot of power for rather little use), not that a hardcore gamer may want to agree, but it is simply a fact proven by science.
Post edited September 28, 2024 by Xeshra
That power consumption is insane.
The 90 sounds great but yeah, prices will probably be crazy. The 80 on the other hand is kinda disappointing imo. Measly 16GB... could at least have upped it to 24
Xeshra: There are heavily diminishing returns above 120 (a lot of power for rather little use), not that a hardcore gamer may want to agree, but it is simply a fact proven by science.
Ridiculous numbers like 360 or 480 - yes, the diminishing returns are so big at those levels that it is absolutely not worth it. But I still noticed the smoothness increase going from 165 to 240 in fast-paced games like Unreal Tournament. Although that is going to be subjective. For me personally, the "pointless" boundary is above 240, while it may be above 120 for somebody else.

Simply put
60 FPS = 16.6 ms extra ping
120 FPS = 8.33 ms extra ping
240 FPS = 4.16 ms extra ping
480 FPS = 2.08 ms extra ping
in all the games you play, even single player ones.
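
A quick sketch of where those numbers come from: the frame time is simply 1000 ms divided by the frame rate.

```python
# Frame time = the worst-case extra latency one frame adds.
for fps in (60, 120, 240, 480):
    print(f"{fps} FPS = {1000 / fps:.2f} ms per frame")
```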

Depends on where everyone wants to make the cutoff point. But if you combine that with input delays, processing times etc., it can make for a difference that can be felt (with regard to inputs, especially mouse movements), if not necessarily seen. But I both saw and felt it going up to 240. Although nowhere near as much as when I first went above 60. And judging from that, I can tell that going higher would be just a waste of money. At that point, I'd rather go for a higher resolution, which is why I have a 1440p monitor.

Studies mostly pertain to how many "frames" your brain can process, but what matters is the perception of motion smoothness. The faster the motion, as in fast-paced FPS games, the more you are going to see the difference in practice. Do a movement fast enough, and you are going to perceive it as smoother at 800 FPS than at 400 FPS. But again, you are of course going to reach a boundary where such fast motion is not realistically achievable in a game.

Look no further than this site:
https://www.testufo.com/
Science can claim whatever it wants (and it mostly concerns itself with the mechanical limitations of our brains, not the perception of smoothness). Load it up on a 240 Hz display and tell me you don't see a difference. I recommend picking "Compare frame rates: Video game motion" in the top right. You can set up two frame rates to compare. Once you start feeling like you no longer see a difference (say between 120 and 240), increase the image scroll speed and see what happens. Ever since I swapped to 240, I don't think I have seen motion blur appear in a game. Ever.
Post edited September 28, 2024 by idbeholdME
No relation to the Blackwell games series?
My interest dies right here.
Xeshra: It will most likely become the "new behemoth" of GPUs, and this is not even its full potential, because the "full chip" would most likely leech a nuclear reactor dry if used in a "server farm". No matter what, the performance is probably so big that AMD is probably already looking "pale", not red anymore. Nvidia itself probably has a new color... not pale or dark green anymore, rather nuclear-poison-green.
To be fair, AMD themselves have said that they won't even try to compete with high-end GPUs this time around. Looks like it was the right decision, because they were never going to look good here.

Can't wait to see the 5090 in use. The fact that there is already a significant generational upgrade over the 4090 is wild.
The fact that AMD will not even try to counter Nvidia in high-end GPUs makes it interesting to see how it will affect Intel's currently small market share.
Can I have a video card that uhh, takes 60W, single rail, single slot?
The most powerful PSUs are usually single rail; multi-rail is for the weak. The multi-rail approach was born from the idea that a PSU is more failure-proof if a single rail cannot exceed a certain value. However, it decreases efficiency, as there are simply more parts. Nowadays it is pretty much outdated, because most PSUs have very good protection no matter "how big" a single rail may become, and if some power is bad, even a small current can already cause damage.

60 W in a single slot? Sure, just ask Nintendo if they have some spare GPUs left, and ask a GPU technician if they could solder this stuff onto a small "paper board"... I bet the Switch GPU only uses about 15 W.

If you still prefer a less exotic approach: simply get a Galax RTX 4060 Ti https://videocardz.com/newz/galax-releases-noisy-rtx-4060-ti-max-a-single-slot-gpu-with-16gb-memory and manually set the power limit to about -62%, giving 60 W instead of 160 W. There you get your 60 W single-slot card, and it can somewhat play modern games; but gamers using such a card are usually happy with 60 FPS and 720p, because more than that may only work with pretty old games or indies.
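
A quick back-of-the-envelope for that offset (the 160 W stock limit is from the linked card; the exact knob depends on your tool, e.g. a percentage slider in MSI Afterburner or an absolute wattage via `nvidia-smi -pl`):

```python
stock_limit_w = 160   # stock power limit of the linked RTX 4060 Ti Max
target_w = 60         # desired single-slot-style power budget

offset_pct = (target_w / stock_limit_w - 1) * 100
print(f"Required power-limit offset: {offset_pct:.1f}%")   # -62.5%, not -70%
```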

Just because a GPU can run at 100 or even more percent does not mean a stingy power-saving person who wants to tackle heat, or simply become greener, cannot simply clock it down... it is not the usual gamer demand, but why not.

Yet it just makes less sense, even for freaks, because in that case you could simply get a laptop... that is where 60 W GPUs run at full speed.
Post edited September 29, 2024 by Xeshra
ppavee: The fact that AMD will not even try to counter Nvidia in high-end GPUs makes it interesting to see how it will affect Intel's currently small market share.
Currently it seems Intel is not entirely at the level of AMD on low- to mid-range GPUs, because AMD made some serious advancements on the RT and AI chips which are already powering the PS5 Pro and will soon power the RDNA4 desktop cards... probably with true RDNA4 cores for better rasterization as well. But apart from a bit better rasterization, I guess the technology is identical to the PS5 Pro, so PS gamers may get it sooner than PC gamers.

In the end I guess AMD may add at least another 20% efficiency on rasterization, but this is simply not enough to take on Nvidia, because Nvidia may get +30% efficiency, as it seems. Nvidia is already about 15% more efficient with the current gen, so AMD could end up around 25% less efficient at rasterization. Ultimately, it seems the gap may increase even more. AMD simply cannot challenge Nvidia anymore.
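
Compounding those guesses shows how the gap math works out (every multiplier here is speculation from the paragraph above, not a measurement):

```python
nvidia_lead_now = 1.15                # Nvidia ~15% more efficient today (guess)
amd_gain, nvidia_gain = 1.20, 1.30    # assumed generational efficiency gains

nvidia_lead_next = nvidia_lead_now * nvidia_gain / amd_gain
print(f"Nvidia ~{(nvidia_lead_next - 1) * 100:.0f}% more efficient next gen")  # ~25%
```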

However... they made huge advancements on RT (the reason the PS5 Pro made a strong leap there), and AI is now coming to AMD as well, so there is some light at the end of a rather dark tunnel. I am not sure Nvidia improved their RT by the same percentage AMD did, so in the end AMD is probably not far behind Nvidia anymore with RDNA4, at least on RT: this traditional weakness is most likely nearly gone, so there is room for hope. I bet AMD is able to hand out some pretty well-priced cards with suitable performance and solid RT for low- to mid-range gamers... but the high-end market is clearly Nvidia only.

Ah yes, Intel Battlemage, as far as I know, is already inside an integrated Intel SoC, but so far there is no clue whether Intel is going to release a dedicated card. If so... it may be "on the lower end"... most likely Intel will not be able to beat AMD, but they could still offer a very competitively priced product.

In the end, the gap between "casual hardware" and "high-end hardware" is increasing even more, and we will have high-end GPUs priced like finest madness... but also with even more "performance headroom", and the "casual market" with very fairly priced GPUs, but surely with a noticeable performance gap.

The time of "I pay half the price for a 30% slower card" is pretty much over, I feel... it is more like "I pay 60/30% less for a 60/30% slower card". That is why the 4090 was so popular despite its crazy price... it was just so far ahead.

Example:

+60% 5090: 2500 Nvidia only
+40% 5080: 1200 (10% faster vs. 4090) Nvidia only

Above this line: zero competition.

Below it is basically the big "kiddie pool", where everyone is trying to beat each other with a lot of different products.

+20% 5070 Ti: 700 (30% slower vs. 4090, on par with 4080+)
+20% 5070: 550 (45% slower vs. 4090, on par with 3090 Ti, 4070 Ti Super or 7900 XT/X or PS5 Pro (RT only))
+25% 5060: 400 (equal to 4070 Ti)

+50% RTX 4070 (equal to PS5 Pro (rasterization))
6700 XT (equal to PS5)

5060 vs. 5090: the 5090 offers +140% performance, but the 5060 costs less than 20% of the price. AMD/Intel will be competing against the 5070 Ti and below. AMD will have to add more VRAM and whatnot, and perhaps offer better RT power, just to appeal to customers, because their efficiency will be lower and Nvidia may offer some competitively priced products as well.
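
Running those speculative prices and performance figures through a perf-per-dollar comparison shows how hard the value flips at the bottom of the stack (all numbers are the guesses from this post, nothing official):

```python
# (relative performance, speculative price in USD)
cards = {"5090": (2.4, 2500), "5060": (1.0, 400)}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf points per $1000")
# The 5060 ends up with roughly 2.6x the performance per dollar of the 5090.
```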
Post edited September 29, 2024 by Xeshra
Xeshra: Price is probably 2500 USD, but more realistically you will get it at 3500 USD as soon as the scalpers get their hands on it.
*falls off chair laughing*
And that sort of thing is why I NEVER buy a gpu when it just comes out.
(I usually buy a few gens behind cutting edge and spend $200 max and use it for 3-5 years)

-
paladin181: That power consumption is insane.
Doesn't everyone have a nuclear reactor or hydro dam at their beck and call? ;D
Post edited September 29, 2024 by GamezRanker
You are indeed correct; everyone has a nuclear reactor and a hydro dam inside their body, it is just not strong enough to power a GPU.

For 200 you may get a capable card, but it is most likely less powerful than a PS5. Okay... this console is probably already out of your price range, fair enough. How you buy your games I do not know, yet... I guess some deals are almost free... just do not pump up your collection too much, or else even 10 coins can become 1000 at some point.
Xeshra: For 200 you may get a capable card, but it is most likely less powerful than a PS5.
I usually wait for deals and discounts and shop around, so the card I end up getting is good enough to run a large chunk of games at decent (mid-high) settings for several years. My Current Card

Xeshra: How you buy your games I do not know, yet...
Same as above, deals and discounts and lots of patience :)
Post edited September 29, 2024 by GamezRanker