Radiance1979: Insane, insane. If my PC, with new-bought pricing in mind, comes to around 2400 euros with screen and everything calculated, and I'm a happy man with it for about 4 years or so, then 600 euros a year ain't that bad.
That's not too bad, actually.

Still, personally a bit too rich for my blood... I often spent as little as possible on hardware (when I bought any).

Radiance1979: My most expensive GPU cost 650, I believe, and for my next system I've already decided to go a tad more premium, with a better understanding of my, uhm, hobby PC needs ;)
I would often spend 175 max on a video card, or thereabouts... even if it meant playing games on medium settings for a while. But for those that like it and have the money: eh, it's their money, so they should do with it what they want. :)
Post edited September 04, 2020 by GamesRater
Radiance1979: ...AMD in the last line having a lot of cards that push out more TFLOPS than Nvidia, but somehow Nvidia has its own deal going on that turns out quite well for them
AMD used to basically make dual-use cards with GCN/Vega, as they didn't really have the money to maintain separate gaming and pro/server/compute architectural branches. That was why Vega 56/64 and other AMD cards were the cards to get for mining: Vega had more raw computational power than a 1080 Ti while being 1080-tier in cost, and the 570 had way, way more computational power than a 1050 Ti.

Now, with more money coming in, AMD has both RDNA (gaming) and CDNA (serious business) branch architectures, and a 5700 XT has about the same TFLOPS as its nVidia equivalent.

Radiance1979: As our favorite troll states, it is indeed AMD who is the supplier of the console chips. I believe the consoles are rated at 14 teraflops of data output. I believe the 3090 sits at 24 teraflops of output.
TFLOPS for the 3090 is weird due to it having tensor- and RT-specific hardware, so it ends up with three FLOPS figures instead of one. nVidia have also fudged raster performance significantly by 'doubling' the number of CUDA cores. They aren't really doubled in practice (the extra units share a datapath with the INT32 units, so they're limited in what they can do), but the theoretical TFLOPS they generate go on the spec sheet as real.
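A rough sketch of where those spec-sheet TFLOPS figures come from, assuming the usual two-operations-per-shader-per-clock (fused multiply-add) convention; the core count and boost clock below are approximate published RTX 3090 numbers, and halving the core count is only a crude way to visualize the "doubled CUDA cores" caveat:

```python
# Minimal sketch: peak FP32 TFLOPS as it appears on a spec sheet.
# Convention: each shader performs one fused multiply-add (2 ops) per clock.

def peak_fp32_tflops(shader_count: int, boost_clock_ghz: float) -> float:
    """Theoretical peak: shaders * 2 ops * clock (GHz) / 1000 = TFLOPS."""
    return shader_count * 2 * boost_clock_ghz / 1000.0

# RTX 3090 paper specs (approximate): 10496 CUDA cores, ~1.70 GHz boost.
print(round(peak_fp32_tflops(10496, 1.70), 1))       # ~35.7 TFLOPS on paper
# Counting only the half of the cores that don't share a datapath with the
# INT32 units gives a figure closer to older-style core counting.
print(round(peak_fp32_tflops(10496 // 2, 1.70), 1))  # ~17.8 TFLOPS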
Radiance1979: ...AMD in the last line having a lot of cards that push out more TFLOPS than Nvidia, but somehow Nvidia has its own deal going on that turns out quite well for them
Phasmid: AMD used to basically make dual-use cards with GCN/Vega, as they didn't really have the money to maintain separate gaming and pro/server/compute architectural branches. That was why Vega 56/64 and other AMD cards were the cards to get for mining: Vega had more raw computational power than a 1080 Ti while being 1080-tier in cost, and the 570 had way, way more computational power than a 1050 Ti.

Now, with more money coming in, AMD has both RDNA (gaming) and CDNA (serious business) branch architectures, and a 5700 XT has about the same TFLOPS as its nVidia equivalent.

Radiance1979: As our favorite troll states, it is indeed AMD who is the supplier of the console chips. I believe the consoles are rated at 14 teraflops of data output. I believe the 3090 sits at 24 teraflops of output.
Phasmid: TFLOPS for the 3090 is weird due to it having tensor- and RT-specific hardware, so it ends up with three FLOPS figures instead of one. nVidia have also fudged raster performance significantly by 'doubling' the number of CUDA cores. They aren't really doubled in practice (the extra units share a datapath with the INT32 units, so they're limited in what they can do), but the theoretical TFLOPS they generate go on the spec sheet as real.
I noticed the Vega cards do indeed rank very high on a TFLOPS table. That notion gave rise to the simple thought, "might as well have purchased that Vega card when I had the chance instead of turning to Nvidia again".
Post edited September 04, 2020 by Radiance1979
Offtopic aside: Just wanted to say that I am sorry that some of you got inadvertently caught in the low-rating sweep by my "fans" (tr0ll st@lkers)... it seems they went with the "scorched earth" response since I changed names.
Radiance1979: Here in the Netherlands old stock does have a tendency of disappearing from stores pretty fast, with only some online sellers offering older equipment. This does not apply to prebuilds and laptops, of course.

I wouldn't be surprised if there isn't another consistent drop in pricing, especially not for prebuilds and laptops, but the occasional odd sale can be expected, so I would definitely use a price-watching app from now on.
Yeah, this may be "complicated" on the laptop side anyway, as the 3000 series on desktop is so power hungry that they will need to be undervolted more on laptops than the 2000 and especially the 1000 series needed to be. Meaning, the performance disparity between the desktop and laptop versions of the 3000 GPUs will probably be considerably greater than with the earlier generations.

https://www.youtube.com/watch?v=HiQLh_vKM5k
GamesRater: Offtopic aside: Just wanted to say that I am sorry that some of you got inadvertently caught in the low-rating sweep by my "fans" (tr0ll st@lkers)... it seems they went with the "scorched earth" response since I changed names.
There is a lot going on for the readers, apparently; almost everyone gets a full load except for the 'devil's advocate'.
NVIDIA Admit RTX 2000 FAILURE! RTX 3080, 3070 a Return to Form

https://www.youtube.com/watch?v=1l1rGhR9eJ0
fr33kSh0w2012: NVIDIA Admit RTX 2000 FAILURE! RTX 3080, 3070 a Return to Form

https://www.youtube.com/watch?v=1l1rGhR9eJ0
That is a bit harsh though, although looking at the RTX capabilities it might be fair. Luckily there are some extra software upgrades on the way for 1000 and 2000 owners, which may or may not improve life in bits and pieces.
fr33kSh0w2012: NVIDIA Admit RTX 2000 FAILURE! RTX 3080, 3070 a Return to Form

https://www.youtube.com/watch?v=1l1rGhR9eJ0
Radiance1979: That is a bit harsh though, although looking at the RTX capabilities it might be fair. Luckily there are some extra software upgrades on the way for 1000 and 2000 owners, which may or may not improve life in bits and pieces.
Harsh? Not at all. Turing was a garbage generation for gaming consumers; it was used to push RTX and DLSS technology out to developers so they could start to experiment with it, and was never meant to popularize it, because that only comes when it's available to the masses.

In terms of rasterization performance and price, where it actually mattered for games in 2018, it was simply a bad buy. A smaller performance gain, and the price was shifted one tier up: for example the 2080 got 1080 Ti pricing, and the 2060 got the price of a GTX 1070, with considerably less of a jump than Maxwell to Pascal offered.

The 2080 Ti was the big offender: anything from 25-35% gains over the 1080 Ti in 4K, and even less at lower resolutions, coming in at an MSRP of $999 but practically impossible to find under $1199, and even more than that in countries outside the US.

The RTX 30 series so far seems a LOT better in performance, so that's great, but it still carries Turing prices. I don't doubt some of the price hikes were due to Turing's R&D costs, but they were most certainly inflated even more because there was no competition in sight.
Post edited September 04, 2020 by ChrisGamer300
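To put the 2080 Ti complaint above in numbers: a minimal sketch of relative performance per dollar, using the figures from that post (roughly +30% over a 1080 Ti, $999 MSRP, ~$1199 street) together with the 1080 Ti's $699 launch MSRP. The performance index is illustrative, not a benchmark result.

```python
# Hedged sketch: relative performance per dollar, 1080 Ti vs 2080 Ti.
# Performance is indexed to the 1080 Ti = 100; prices in USD.

def perf_per_dollar(perf_index: float, price_usd: float) -> float:
    return perf_index / price_usd

gtx_1080_ti = perf_per_dollar(100, 699)    # $699 launch MSRP
rtx_2080_ti = perf_per_dollar(130, 1199)   # ~+30% perf, typical street price

print(round(gtx_1080_ti, 3), round(rtx_2080_ti, 3))
# Roughly how much worse the newer card's value is in this framing (~24%):
print(f"{(1 - rtx_2080_ti / gtx_1080_ti) * 100:.0f}% less performance per dollar")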
Radiance1979: That is a bit harsh though, although looking at the RTX capabilities it might be fair. Luckily there are some extra software upgrades on the way for 1000 and 2000 owners, which may or may not improve life in bits and pieces.
ChrisGamer300: Harsh? Not at all. Turing was a garbage generation for gaming consumers; it was used to push RTX and DLSS technology out to developers so they could start to experiment with it, and was never meant to popularize it, because that only comes when it's available to the masses.

In terms of rasterization performance and price, where it actually mattered for games in 2018, it was simply a bad buy. A smaller performance gain, and the price was shifted one tier up: for example the 2080 got 1080 Ti pricing, and the 2060 got the price of a GTX 1070, with considerably less of a jump than Maxwell to Pascal offered.

The 2080 Ti was the big offender: anything from 25-35% gains over the 1080 Ti in 4K, and even less at lower resolutions, coming in at an MSRP of $999 but practically impossible to find under $1199, and even more than that in countries outside the US.

The RTX 30 series so far seems a LOT better in performance, so that's great, but it still carries Turing prices.
That really depends on the way you are looking at it; the 2060 Super improved almost 100% over the 1060 3 GB, and maybe 80% over the 6 GB version (for an increase in price and almost doubled power draw; rough math on that kind of uplift is sketched after the links below).

And if you look at these comparisons:

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-780-Ti/3439vs2165

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-GTX-1080-Ti/4027vs3918
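For context on the percentage claims in this sub-thread, here is a small sketch of how a generational uplift figure is computed from per-game average FPS; the game labels and frame rates are hypothetical placeholders, not benchmark results.

```python
# Hypothetical per-game average FPS for an old and a new card (placeholders).
old_card = {"game_a": 42, "game_b": 55, "game_c": 38}
new_card = {"game_a": 78, "game_b": 101, "game_c": 74}

def uplift_percent(old_fps: float, new_fps: float) -> float:
    """Relative improvement of new over old, in percent."""
    return (new_fps - old_fps) / old_fps * 100

per_game = {g: uplift_percent(old_card[g], new_card[g]) for g in old_card}
mean_uplift = sum(per_game.values()) / len(per_game)

for game, pct in per_game.items():
    print(f"{game}: +{pct:.0f}%")
print(f"average uplift: +{mean_uplift:.0f}%")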
ChrisGamer300: Harsh? Not at all. Turing was a garbage generation for gaming consumers; it was used to push RTX and DLSS technology out to developers so they could start to experiment with it, and was never meant to popularize it, because that only comes when it's available to the masses.

In terms of rasterization performance and price, where it actually mattered for games in 2018, it was simply a bad buy. A smaller performance gain, and the price was shifted one tier up: for example the 2080 got 1080 Ti pricing, and the 2060 got the price of a GTX 1070, with considerably less of a jump than Maxwell to Pascal offered.

The 2080 Ti was the big offender: anything from 25-35% gains over the 1080 Ti in 4K, and even less at lower resolutions, coming in at an MSRP of $999 but practically impossible to find under $1199, and even more than that in countries outside the US.

The RTX 30 series so far seems a LOT better in performance, so that's great, but it still carries Turing prices.
Radiance1979: That really depends on the way you are looking at it; the 2060 Super improved almost 100% over the 1060 3 GB, and maybe 80% over the 6 GB version (for an increase in price and almost doubled power draw).

And if you look at these comparisons:

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-780-Ti/3439vs2165

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-GTX-1080-Ti/4027vs3918
The 2060 was okay, though Nvidia skimped on the VRAM as they usually do, but it's fine for the resolution it was meant to be played at. And yes, the 2060 is the best-case scenario for Turing, but it still had the price hike. Tbh, there weren't really too many options available at that point until the RX 5600/5700/XT, except for used stuff that I personally don't like.

This still doesn't take away from the fact that the generation as a whole was a crapshoot. UserBenchmark is not good; you should look at real in-game benchmarks across a variety of games.
Post edited September 04, 2020 by ChrisGamer300
Radiance1979: That really depends on the way you are looking at it; the 2060 Super improved almost 100% over the 1060 3 GB, and maybe 80% over the 6 GB version (for an increase in price and almost doubled power draw).

And if you look at these comparisons:

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-780-Ti/3439vs2165

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-GTX-1080-Ti/4027vs3918
ChrisGamer300: The 2060 was okay, though Nvidia skimped on the VRAM as they usually do, but it's fine for the resolution it was meant to be played at. And yes, the 2060 is the best-case scenario for Turing, but it still had the price hike. Tbh, there weren't really too many options available at that point until the RX 5600/5700/XT, except for used stuff that I personally don't like.

This still doesn't take away from the fact that the generation as a whole was a crapshoot. UserBenchmark is not good; you should look at real in-game benchmarks across a variety of games.
UserBenchmark serves my needs pretty well. I'm not a 4K gamer; I'm even one who sets games to run at a max of 50% of the GPU's power load most of the time, I did want to try RTX, and I came from a 1060 3 GB.

I believe the difference between a 1660 and a 2060 Super in many situations is around 20 fps.

Not to mention all the variables that work differently on different grades of card. For example, a lighting mode that the 2060 Super is not equipped to deal with, so it only uses a fraction of its power to do what it can, as opposed to a high-tier model such as the 2080 Ti, which will calculate it perfectly and use a stupendous amount of its processing power for an effect you barely notice on screen. This also applies to old vs. new architectural changes, of course.

I do agree that a 30% increase is not what you would want to see for your higher-end card, but on the other hand, with ever-increasing base values, a 30% increase gets larger in absolute terms with every new tier.
Post edited September 04, 2020 by Radiance1979
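A quick numeric illustration of the point above that the same 30% relative increase means a bigger absolute gain as the baseline rises; the baseline frame rates are made-up round numbers.

```python
# Same relative uplift, different absolute gains (hypothetical baselines).
uplift = 0.30
for base_fps in (40, 60, 90, 120):
    gain = base_fps * uplift
    print(f"{base_fps} fps -> {base_fps + gain:.0f} fps (+{gain:.0f} fps)")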
ChrisGamer300: The 2060 was okay, though Nvidia skimped on the VRAM as they usually do, but it's fine for the resolution it was meant to be played at. And yes, the 2060 is the best-case scenario for Turing, but it still had the price hike. Tbh, there weren't really too many options available at that point until the RX 5600/5700/XT, except for used stuff that I personally don't like.

This still doesn't take away from the fact that the generation as a whole was a crapshoot. UserBenchmark is not good; you should look at real in-game benchmarks across a variety of games.
Radiance1979: UserBenchmark serves my needs pretty well. I'm not a 4K gamer; I'm even one who sets games to run at a max of 50% of the GPU's power load most of the time, I did want to try RTX, and I came from a 1060 3 GB.

I believe the difference between a 1660 and a 2060 Super in many situations is around 20 fps.

Not to mention all the variables that work differently on different grades of card. For example, a lighting mode that the 2060 Super is not equipped to deal with, so it only uses a fraction of its power to do what it can, as opposed to a high-tier model such as the 2080 Ti, which will calculate it perfectly and use a stupendous amount of its processing power for an effect you barely notice on screen. This also applies to old vs. new architectural changes, of course.

I do agree that a 30% increase is not what you would want to see for your higher-end card, but on the other hand, with ever-increasing base values, a 30% increase gets larger in absolute terms with every new tier.
I get what you are saying. I'm not a 4K person either; I prefer 1440p 144 Hz, so the RTX 3080 looks great in that regard.

As I said though, my main gripe isn't a 30% performance increase per se, because how attractive that is depends entirely on multiple factors like price, SKU, power consumption, etc. The GTX 1080 was around 25% faster than the GTX 980 Ti, give or take, and that was a very solid card because it offered a solid increase over the past gen's flagship (ignoring Titan) at a reasonable price.

Turing's problem was that the 2070 and 2080 were only incremental upgrades over the GTX 10 series, with an inflated price, and while the 2080 Ti is an upgrade, its price was massively inflated, even more so outside the US. The 2060 falls into the same pitfall as the 2080 Ti for those looking for midrange GPUs.

There is nothing wrong with raising prices, and consumers decide prices with their spending; in the end, companies exist to make money. But this is the reason the RTX 20 series has the reputation it has, especially when RTX and DLSS weren't mature enough to make proper use of.
Long-time fan of AMD as a company here, and of their CPUs (with the exception of the laptop I bought 2 years ago, all CPUs I've ever bought were AMD), and their GPUs. I started using ATI GPUs in 2000 and they have been my primary GPUs ever since. In that timeframe I've used the Radeon (original/7200), 7500, 7000, 8500, FireGL 8800 x3, 9000, 9200, 9500 Pro, 9600 Pro, 9700 Pro, 9800 Pro x2, X700 x2, 2600, FireGL X1 256P, and HD 7850 x2. I also owned one each of the Rage 128 and Mach 64 models previously. I've been using the 7850 for 7 years now and just got a second one earlier this year; it was a spare a friend had who thought I might benefit from CrossFire. (I didn't.)

So I've pretty much experienced almost all of AMD's GPU hardware over time, both directly and through many models of cards on friends' systems as well. Overall I've always had pretty good results with AMD's GPUs and felt the money was well spent when I actually had to pay for them. (Most of the above hardware was free in some way, shape or form; I only bought 2 of them.)

Having said all of that... and being a self-admitted AMD fan... I'm totally buying an nVidia RTX 3090 or 3080 for my new PC build some time in the next 3-5 months or so, depending on the timing of some things. That's also despite the fact that I had major problems with nVidia during the 2000s that I won't go into here.

What happened? Well, for starters, my needs have changed. What I use the computer and the GPU for has changed over time, and it is not all just video games. I got heavily into video production and visual effects development over the last few years, and the capabilities of nVidia's hardware for video encoding and visual effects work, plus stability and other factors, are just vastly superior for my current needs, to the point where I'd have to stick my head in the sand and sing "LA LA LA, I'M NOT LISTENING" if I were to continue to ignore it these days.
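As an aside on the hardware video encoding point: a minimal sketch of what using nVidia's encoder (NVENC) through ffmpeg could look like, assuming an ffmpeg build compiled with NVENC support and an nVidia GPU; the file names and bitrate are placeholders.

```python
# Hedged sketch: hardware H.264 encode via NVENC using ffmpeg's h264_nvenc
# encoder. Requires an ffmpeg build with NVENC enabled and an nVidia GPU.
import subprocess

def nvenc_encode(src: str, dst: str, bitrate: str = "10M") -> None:
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,               # input file (placeholder path)
            "-c:v", "h264_nvenc",    # offload video encoding to the GPU
            "-b:v", bitrate,         # target video bitrate
            "-c:a", "copy",          # pass audio through untouched
            dst,
        ],
        check=True,
    )

# nvenc_encode("input.mov", "output.mp4")  # example call; paths are placeholders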

nVidia just dropped a massive nuke with the 3000 series GPUs. I'm also looking forward to seeing AMD's Navi refresh drop in the coming weeks/months, not in anticipation of getting one, but in anticipation of it increasing the competition and helping to drive prices down on all GPUs, making what I actually want to get even better priced. :)

Sorry AMD, I still love ya as a company, and my new system will be getting a nice shiny Zen 3 4950X CPU with lotsa love, but it's going to be an nVidia Ampere GPU I'm afraid.

I hope to see both companies duke it out in the coming months and years, while all of us consumers benefit from the competition, regardless of which company's products we decide are best for our individual needs.

Despite the global tire fire that 2020 has turned out to be, it's a great time to be alive for gaming and computing.

In closing... GIMME MOAR PIXELZ HELL YEAH! :P
ChrisGamer300: Turing's problem was that the 2070 and 2080 were only incremental upgrades over the GTX 10 series, with an inflated price, and while the 2080 Ti is an upgrade, its price was massively inflated, even more so outside the US. The 2060 falls into the same pitfall as the 2080 Ti for those looking for midrange GPUs.
Yeah. To be fair, when the cards came out pretty much every review said it was a small upgrade and ray tracing wasn't ready yet, so if people upgraded anyway, that's on them really. It was an obvious "skip it" generation for most people. I have a 2070 because I upgraded my monitor to 1440p and needed a new GPU, but I wasn't happy about it.