Posted December 04, 2022
![amok](https://images.gog.com/5a4ddd5e52655d11e5baf782f13c2013cca6de225d9418db4da0e3576fdc8b07_forum_avatar.jpg)
amok
FREEEEDOOOM!!!!
Registered: Sep 2008
From United Kingdom
![StingingVelvet](https://images.gog.com/b37d5c7bb28d5cee442267f7d9baeef3a34dbb99a77f067e8f45eea3a8c9369d_forum_avatar.jpg)
StingingVelvet
Devil's Advocate
Registered: Nov 2008
From United States
Posted December 04, 2022
Depends on the generation. The 2080 over the 1080 was not a worthwhile improvement; maybe not even over the 980. The 3080, however, was something like an 80% jump. Just plan your upgrades accordingly.
Also, evil or not, developers stop optimizing for older hardware pretty quickly. So does nvidia with their drivers.
![tritone](https://images.gog.com/ed507a2870bd387415828d4cafd785980cecc87b9a4e977fd45644e6455d21d7_forum_avatar.jpg)
tritone
Salmon Max
Registered: Dec 2009
From United States
Posted December 04, 2022
It's not always up to you. I got two games for Christmas last year, and when I got around to installing them... they refused to play! Sometimes newer games demand newer hardware. Even if the gain is 0 FPS, if the software refuses to work with your old GPU, you're stuck.
In my case, Deathloop (Arkane) wouldn't work on my old GTX 660 because it couldn't create a DirectX 12 device. Eventually you'll upgrade or choose not to play new games.
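For context, "couldn't create a DirectX 12 device" usually means the game failed to get a Direct3D 12 device at the feature level it demands. Below is a minimal sketch of that kind of startup probe, not Arkane's actual code; the requested feature level (12_0) is an assumption for illustration.

```cpp
// Minimal sketch (not Deathloop's real code) of a D3D12-only game's startup probe.
// The requested feature level 12_0 is an assumption; a GTX 660 (Kepler) tops out
// at feature level 11_0 under D3D12, so a call like this fails and the game exits.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // nullptr = default adapter; ask for a device capable of feature level 12_0.
    HRESULT hr = D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                   IID_PPV_ARGS(&device));
    if (FAILED(hr)) {
        std::printf("Could not create a D3D12 device at feature level 12_0 "
                    "(hr=0x%08lx) - this is roughly where such a game gives up.\n",
                    static_cast<unsigned long>(hr));
        return 1;
    }
    std::printf("Feature level 12_0 device created successfully.\n");
    return 0;
}
```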
![Dark_art_](https://images.gog.com/046457effa63ade2be668fd985b6d36b3e4113af8a7292da89711dce9f1b1599_forum_avatar.jpg)
Dark_art_
🔴I'm just glad that cows don't fly YO
Registered: Dec 2017
From Portugal
Posted December 04, 2022
![avatar](http://images.gog.com/77665aa6affc77960e6b537ee348222af4d3fecc684f14d10088eae491b41e33_avm.jpg)
Anti-aliasing and Ambient Occlusion usually carry a very high speed penalty; it's not unrealistic to expect a game to run 50 to 100% faster and still look good after tweaking a few settings.
![WinterSnowfall](https://images.gog.com/012b5223b56b6a969af8ff9c131cdb39a6cd6dd1bc99741cfd9b2f9094b5a492_forum_avatar.jpg)
WinterSnowfall
Bastard Lunatic
Registered: Apr 2012
From Romania
Posted December 04, 2022
I still have a GTX 1080 and I'm quite happy with it for 1080p gaming. Of course I'm not into the RTX stuff and think regular non-RT global illumination systems still look darn good enough (see RDR2).
Second generation Maxwell (900 series, released in 2014) introduced support for Direct3D12 Feature Level 12_1, which is all that you need to play any modern game with "RTX off" these days. We're still a ways off before that changes and cards are more likely to hit performance obsolescence before that happens IMHO.
Post edited December 04, 2022 by WinterSnowfall
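For anyone curious what their own card reports, here is a small sketch using the standard D3D12 feature-level query; the particular set of levels probed is just an illustrative choice.

```cpp
// Hedged sketch: ask D3D12 for the highest feature level the default GPU supports.
// Second-generation Maxwell (GTX 900 series) and newer should report 12_1 here.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Create the device at the lowest D3D12 level, then ask what the hardware can do.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable GPU found.\n");
        return 1;
    }

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = _countof(requested);
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels)))) {
        // 0xc100 = feature level 12_1, 0xc000 = 12_0, 0xb000 = 11_0.
        std::printf("Highest supported feature level: 0x%x\n",
                    static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
    }
    return 0;
}
```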
![neumi5694](https://images.gog.com/47268602e916fb9478cb71a1a902614d9f0303396e0ef1bfc5804f491f8c5a59_forum_avatar.jpg)
neumi5694
Survived the human apocalypse
Registered: May 2011
From Italy
Posted December 04, 2022
doh, completely misread
Post edited December 04, 2022 by neumi5694
![pds41](https://images.gog.com/e2c951de580d9fa526ef7d9d444dd50b5cab5a8b594ddf486bf88f8e7f6badd6_forum_avatar.jpg)
pds41
New User
Registered: May 2009
From United Kingdom
![Namur](https://images.gog.com/301343b16e0efa69f2fbeb70703c53940f92981fb47a6fd8300f9823b4f421c0_forum_avatar.jpg)
Namur
Malkavian
Registered: Oct 2008
From Portugal
Posted December 04, 2022
![avatar](http://images.gog.com/e2c951de580d9fa526ef7d9d444dd50b5cab5a8b594ddf486bf88f8e7f6badd6_avm.jpg)
Prices are still crazy across the board, it seems. The 40 series cards are available in stores around here, but prices for the 30 series and below either didn't budge or actually went up in some cases.
![P. Zimerickus](https://images.gog.com/f85979e63b852ae06109a1b07ee9febf373765fd5c9144b8581baaf43da7611c_forum_avatar.jpg)
P. Zimerickus
Coffee -He/Him-
Registered: Jul 2013
From Netherlands
Posted December 04, 2022
![avatar](http://images.gog.com/ed507a2870bd387415828d4cafd785980cecc87b9a4e977fd45644e6455d21d7_avm.jpg)
In my case, Deathloop (Arkane) wouldn't work on my old GTX 660 because it couldn't create a DirectX 12 device. Eventually you'll upgrade or choose not to play new games.
In any case, this game is only a forerunner of what gaming will look like in the coming two or three years.
So, if you are set on playing modern, current-day games, you will have to upgrade sooner or later.
For myself, I've been checking out 3080 prices as well. There are deals at Megekko, one of Nvidia's official Dutch supply channels, starting at 800 euros.
https://www.megekko.nl/product/0/1015342/PNY-Geforce-RTX-3080-XLR8-Gaming-UPRISING-EPIC-X?r=googleshopping&utm_source=googleshopping&utm_medium=cpc&gclid=EAIaIQobChMIze_t4c7g-wIVw9myCh2iHARyEAQYCCABEgLinPD_BwE
Maybe Santa will be nice this year
![rtcvb32](https://images.gog.com/77665aa6affc77960e6b537ee348222af4d3fecc684f14d10088eae491b41e33_forum_avatar.jpg)
rtcvb32
echo e.lolfiu_fefiipieue|tr valueof_pi [0-9]
Registered: Aug 2013
From United States
Posted December 05, 2022
Well, that's stupid. Comparing ultra and max, I see very little visual difference; between medium and max there's a fair difference, but it's actually easier to identify things on screen without the extra lighting effects.
I enjoy starting at the lowest quality and working my way up until I find what I can be happy with (although I do set models/textures to high, since that is generally a memory issue more than a speed issue). Even on low settings, games should run and look decent.
AA has its uses, but for games I don't find it that useful. As for Ambient Occlusion, I've tried it in one game and the penalty was too high versus baked shadows/lighting. (Granted, I run that game at 720p to hit 60 FPS.)
Post edited December 05, 2022 by rtcvb32
![Lord_Kane](https://images.gog.com/9b9f68133ec12a89389169eadec7420dd5527d8e17a2cd3b99b0e1b22276e296_forum_avatar.jpg)
Lord_Kane
Leaf Kaigai Nikki
Registered: Mar 2009
From Canada
Posted December 05, 2022
I only recently upgraded, from my old AMD (formerly ATI) Radeon HD 5770 to an Nvidia RTX 3080.
![LesTyebe](https://images.gog.com/bc132eaa2744c3a77194d1ccc6ca991fa6e6034b119e385d0d67bfa54d44eb40_forum_avatar.jpg)
LesTyebe
obscure reference
Registered: Jun 2019
From United States
Posted December 05, 2022
Without getting into the topics of relative worth/value, or appointing oneself judge of what other people do with their money, here is what I got by replacing a 2060 6GB ($350 in 2019) with a 3080 12GB ($800 in 2022):
In very modded (heavy ENB, lots of textures) 4k Skyrim (LE and SE) and Fallout 4, went from 24-30 FPS to 75-90 FPS.
Interpret as you will.
![Magnitus](https://images.gog.com/89adb24ddf85c197a41b6f7708dc4570e755dcb00bb7261dae2b7d774722d10e_forum_avatar.jpg)
Magnitus
Born Idealist
Registered: Mar 2011
From Canada
Posted December 05, 2022
Might not be a popular opinion, but I try to leverage my hardware until it breaks (or I feel it is likely to break if I can't afford to be without it for close to a month).
There was a big hardware boom until about 10 years ago, when people had to upgrade their computers continuously. As thrilling as it was for the advances in computing it allowed, I'm glad it's over, because it was also an environmental disaster (and back then a smaller percentage of the world could afford computers... I don't even want to think about what it would have been like if it happened nowadays).
It would be much better for our future if people could minimize their hardware usage and keep the hardware they've got for a decade or more.
The only time I "upgraded" my GPU was when I got a desktop GPU in 2018 to try to make my laptop work with an external GPU (spoiler alert: it was easy on Windows, hard on Linux). My goal was to get the external GPU working via GPU passthrough to a Windows VM on a Linux host (in order to play Windows games on a Linux laptop without a discrete GPU... it was before the pandemic, I really needed a laptop, and I always buy an all-purpose, powerful workstation laptop that I want to use for 7+ years and leverage as much as possible).
Ultimately, I caved in and built a Windows machine around the GPU I got (owing to the non-triviality of the endeavor and time constraints from work). I consider that an excess on my part. Short of another hardware revolution (which I hope doesn't happen, or at least not at the pace of the last one, when most people felt compelled to replace computers less than 5 years old), I intend to keep using that Windows machine for gaming until at least 2028.
Post edited December 05, 2022 by Magnitus
![idbeholdME](https://images.gog.com/3760f2bc32c61c8c4d449030faf766b681f4bfb92bcd3fe377a20eaa1b470688_forum_avatar.jpg)
idbeholdME
Doomed Space Marine
Registered: Jun 2016
From Czech Republic
Posted December 05, 2022
Where is there?
Anyway...
I went from a 780 to a 1080 Ti and I'm still riding that one out. I probably could upgrade at this point, but with so few new demanding games that I'd be interested in, I can still postpone it. I opted instead for a higher refresh rate display, so I'm currently sitting at 1440p 240 Hz. For most games, the 1080 Ti is still perfectly capable. The framerate can drop to double digits in some more demanding titles, but it usually sits in 100+ FPS territory. The best-value card I've owned so far.
I was planning to upgrade to the 40 series, but at this point, I'll probably put it off as long as possible and build an entirely new PC in a couple of years. Or when something eventually stops working.
Post edited December 06, 2022 by idbeholdME
![ignisferroque](https://images.gog.com/11bb48aa4f788e05edf9a04dd920d8ace51b4fdbd6eb5618d7009e731a4dec28_forum_avatar.jpg)
ignisferroque
Lurker
Registered: Feb 2014
From Germany
Posted December 05, 2022
There are more use cases for GPUs than games, just saying. Speed gains are great, of course, but going from 8 to 16/24/48 GB of VRAM can make a huge difference for productivity work.
I'd agree that the differences between two consecutive generations often aren't really worth the upgrade price, though... I bought a new GPU recently, but if I only needed mine for games I'd still be (mostly) fine with my RX 480.