
avatar
Radiance1979: UserBenchmark serves my needs pretty well. I'm not a 4K gamer; I'm even one who sets games to run at max 50% of the GPU's power load most of the time. I did want to try RTX, and I came from a 1060 3 GB.

I believe the difference between a 1660 and a 2060 Super is around 20 fps in many situations.

Not to mention all the variables that behave differently on different grades of card. For example, take a lighting mode that the 2060 Super isn't equipped to deal with, so it only uses a fraction of its power to do what it can, as opposed to a higher-tier model such as the 2080 Ti, which will calculate it perfectly and spend a stupendous amount of its processing power on an effect you barely notice on screen. This also applies to old and new architectural changes, of course.

I do agree that a 30% increase is not what you would want to see for your higher-end card, but on the other hand, with ever-increasing base values, a 30% increase becomes larger in absolute terms with every new tier.
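The arithmetic behind that is easy to sketch; the baseline FPS numbers below are hypothetical, purely for illustration:

```python
# A fixed 30% uplift per generation: the percentage stays constant,
# but the absolute gain grows as the baseline rises.
def uplift(base_fps, pct=0.30, gens=3):
    """Return (fps, absolute_gain) pairs for each generation."""
    results = []
    fps = base_fps
    for _ in range(gens):
        gain = fps * pct
        fps += gain
        results.append((round(fps, 1), round(gain, 1)))
    return results

print(uplift(100))  # [(130.0, 30.0), (169.0, 39.0), (219.7, 50.7)]
```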
avatar
ChrisGamer300: I get what you are saying. I'm not a 4K person either; I prefer 1440p 144 Hz, so the RTX 3080 looks great in that regard.

As I said though, my main gripe isn't a 30% performance increase, because how attractive that is depends entirely on multiple factors like price, SKU, power consumption, etc. The GTX 1080 was around 25% faster than the GTX 980 Ti, give or take, and that was a very solid card because it offered a solid increase over the previous gen's flagship (ignoring the Titan) at a reasonable price.

Turing's problem was that the 2070 and 2080 were only incremental upgrades over the GTX 10 series at an inflated price, and while the 2080 Ti is an upgrade, its price was massively inflated, especially outside of the US. The 2060 falls into the same pitfall as the 2080 Ti for those looking for midrange GPUs.

There is nothing wrong with raising prices, and consumers decide prices with their spending; in the end, companies exist to make money. But this is the reason the RTX 20 series has the reputation it has, especially when RTX and DLSS weren't mature enough to be made proper use of.
oops


Thank you for your explanation. In my experience, when it comes to general opinion, people usually have a rather short memory.

Back in 2014, just as the 900 series was released, I made the awkward decision of buying a GTX 770 for 700 euros. It was the MSI Gamer X model, and it actually did quite well. If I had wanted the 970, I would have paid as much as 800 euros,
so really, I struggle to understand why the whole price/performance argument even exists.

People expect, next to the so-called incremental gains (which, if looked at closely, are huge differences between two different architectures plus the introduction of a new technology), a psychological model where prices remain fixed and only go down, no matter what happens.
Post edited September 04, 2020 by Radiance1979
low rated
avatar
GamesRater: Off-topic aside: just wanted to say that I am sorry for some getting inadvertently caught in the low-rating sweep by my "fans" (tr0ll st@lkers)... it seems they went with the "scorched earth" response since I changed names.
avatar
Radiance1979: There is apparently a lot going on for the readers; almost everyone gets a full load except for the 'devil's advocate'.
Note, though, that the replies that got low rated were the ones that quoted me.

I am guessing this is probably like prior times, when some who low-rated me would hit all posts that replied to me as well... either to save time or to discourage people from replying to me.

avatar
Radiance1979: That really depends on the way you are looking at it; the 2060 Super improved almost 100% over the 1060 3 GB, and maybe 80% over the 6 GB version (for a cost in price and almost doubled power draw).
Stuff like this is why I often waited for smaller form factor versions of new cards to drop, which used less power. :)

avatar
Radiance1979: People expect, next to the so-called incremental gains
Compared to the HW some such companies could release (the stuff they have ready for mass production, or close to it, if they wanted to, assuming the leaks are accurate), some HW releases actually are incremental.
Post edited September 04, 2020 by GamesRater
avatar
Radiance1979: There is apparently a lot going on for the readers; almost everyone gets a full load except for the 'devil's advocate'.
avatar
GamesRater: Note, though, that the replies that got low rated were the ones that quoted me.

I am guessing this is probably like prior times, when some who low-rated me would hit all posts that replied to me as well... either to save time or to discourage people from replying to me.

avatar
Radiance1979: That really depends on the way you are looking at it; the 2060 Super improved almost 100% over the 1060 3 GB, and maybe 80% over the 6 GB version (for a cost in price and almost doubled power draw).
avatar
GamesRater: Stuff like this is why I often waited for smaller form factor versions of new cards to drop, which used less power. :)

avatar
Radiance1979: People expect, next to the so-called incremental gains
avatar
GamesRater: Compared to the HW some such companies could release (the stuff they have ready for mass production, or close to it, if they wanted to, assuming the leaks are accurate), some HW releases actually are incremental.
Still, this is about expectations. You can't expect the prototype to be readily available, nor is it desirable to create yet another unbeatable product that widens an already ever-growing gap between hardware and software. Look at DX12: Control is the first title I've seen use it in a proper sense, and it's been out for what, 6 or 7 years already... though maybe this is an example of where hardware needed to get into range of the software... ;)
low rated
avatar
Radiance1979: Still, this is about expectations. You can't expect the prototype to be readily available, nor is it desirable to create yet another unbeatable product that widens an already ever-growing gap between hardware and software.
True, but some of those models were leaked long before they came out, IIRC... i.e., they most likely held them back to squeeze more money out of people with other models.

It's also sadly the same thing in the GPU and CPU markets as it is in markets like mobile phones/tablets and other electronics.

(That and other things like planned obsolescence are just two things about bigger companies that irk me)

avatar
Radiance1979: Look at DX12: Control is the first title I've seen use it in a proper sense, and it's been out for what, 6 or 7 years already... though maybe this is an example of where hardware needed to get into range of the software... ;)
Speaking of Control: Gotta get back to that soonish.

(I keep putting some games on hold when I get some new ones, ya see)
Also, this just in: MSI are using spare Turing and AMD RDNA stock and retooling them as mining cards.

Kitguru article Here!
Post edited September 05, 2020 by fr33kSh0w2012
avatar
Darvond: So, anyone have a suggestion for a replacement for the R7 240? Preferably single slot and able to be crammed into an Optiplex 9010 mini-tower?
avatar
Trooper1270: I have an Optiplex 7010 SFF (which is smaller than the mini-tower), and the newest/fastest low-powered card that it supports (and will accommodate) is the Nvidia GeForce 1050 Ti low-profile card. That card is sadly dual-slot, but it fits like a glove and works like a dream.
Have you looked at the 1660 minis?
avatar
GamesRater: Tbh I wish game makers would worry more about story (length and also quality) and less about the most bleeding-edge graphics.
avatar
StingingVelvet: It depends for me. I think the realism chase is a silly one in most genres and settings. I'd usually take something like Dishonored over a hyper-realistic game. If you use a bit of style, a little exaggeration, then you don't need to chase realism all the time and it probably looks cooler. nVidia showed this marble demo and people are freaking out about how real it looks, and it's neat as a tech demo, but for a real game I'd take a stylized and well designed look over stuff like that in a heartbeat.

However I do care about things like draw distance, the size of the world, the number of NPCs on screen at once, etc. etc. I think that stuff is important for all games, and hopefully the consoles having better processors and memory this time allows it to happen on PC more.
Fully agree. Graphical fidelity is less important than the ability to render an actual city, or a world containing multiple cities, without it being disappointing from an immersion perspective. I don't particularly care about realism or photorealism if there isn't a distinct art direction.
Post edited September 05, 2020 by LiquidOxygen80
avatar
LiquidOxygen80: Fully agree. Graphical fidelity is less important than the ability to render an actual city, or a world containing multiple cities, without it being disappointing from an immersion perspective. I don't particularly care about realism or photorealism if there isn't a distinct art direction.
Reminds me of one of the comments on the PS5 tech demo:

"if we had the time to spend on implementing that level of detail"

https://www.youtube.com/watch?v=d8B1LNrBpqc&pbjreload=101
avatar
StingingVelvet: It depends for me. I think the realism chase is a silly one in most genres and settings. I'd usually take something like Dishonored over a hyper-realistic game. If you use a bit of style, a little exaggeration, then you don't need to chase realism all the time and it probably looks cooler. nVidia showed this marble demo and people are freaking out about how real it looks, and it's neat as a tech demo, but for a real game I'd take a stylized and well designed look over stuff like that in a heartbeat.

However I do care about things like draw distance, the size of the world, the number of NPCs on screen at once, etc. etc. I think that stuff is important for all games, and hopefully the consoles having better processors and memory this time allows it to happen on PC more.
avatar
LiquidOxygen80: Fully agree. Graphical fidelity is less important than the ability to render an actual city, or a world containing multiple cities, without it being disappointing from an immersion perspective. I don't particularly care about realism or photorealism if there isn't a distinct art direction.
I agree with this as well. It seems like we're well into a period of diminishing returns for video game graphics. I mean, looking back at Skyrim, the graphics for that were pretty darn good and still hold up well. I would argue graphics haven't advanced all that much since then. It has mostly been tweaking and polishing, but no really huge advances. Most games today don't look that much better, and Skyrim was 9 years ago!

For me, Skyrim-level graphics are good enough. I don't really need my video games to be any better than that graphically. I'm aware this is probably a minority viewpoint (since I am perfectly happy playing ancient games from the 80s/90s), but for me graphics have never really been a high priority. Sure, incrementally better graphics are nice to have, but IMO they're not $500+ nice to have.

I am a bit concerned about the 'graphical chase' and the amount of time and resources that seem to be thrown at these ever-diminishing graphical returns. If it distracts from things like gameplay, content, writing, AI, player freedom and choice, playtesting, and bugfixing, then I'm not sure it's really worth it.
avatar
Time4Tea: For me, Skyrim-level graphics are good enough. I don't really need my video games to be any better than that graphically. I'm aware this is probably a minority viewpoint (since I am perfectly happy playing ancient games from the 80s/90s), but for me graphics have never really been a high priority.
Ironically, with that kind of thinking Skyrim would've looked like Morrowind (or Daggerfall, or... depending on somebody else's 'good enough') ;-)
avatar
Time4Tea: For me, Skyrim-level graphics are good enough. I don't really need my video games to be any better than that graphically. I'm aware this is probably a minority viewpoint (since I am perfectly happy playing ancient games from the 80s/90s), but for me graphics have never really been a high priority. Sure, incrementally better graphics are nice to have, but IMO they're not $500+ nice to have.
Skyrim is a good example of what I was saying, because the detail level and such is about as good as I'd ever need it to be, but the draw distance and the number of characters typically on screen at once are not. In other words, I'd rather have Skyrim's level of graphical fidelity with super-far draw distances and a hundred NPCs on screen than "better looking" games with the same old problems.
avatar
Time4Tea: For me, Skyrim-level graphics are good enough. I don't really need my video games to be any better than that graphically. I'm aware this is probably a minority viewpoint (since I am perfectly happy playing ancient games from the 80s/90s), but for me graphics have never really been a high priority. Sure, incrementally better graphics are nice to have, but IMO they're not $500+ nice to have.
avatar
StingingVelvet: Skyrim is a good example of what I was saying, because the detail level and such is about as good as I'd ever need it to be, but the draw distance and the number of characters typically on screen at once are not. In other words, I'd rather have Skyrim's level of graphical fidelity with super-far draw distances and a hundred NPCs on screen than "better looking" games with the same old problems.
Yeah, I'm a big fan of higher view distance as well. And many of those issues you mention aren't things that can really be solved by throwing more GPU power at them anyway, because they are more CPU-limited. Putting more NPCs on the screen at once creates more issues for the engine than just rendering (e.g. physics, AI). View distance is also something that can't really be solved by GPU 'brute-forcing'. Improving it needs different solutions, like level of detail (LOD), more intelligent batching of objects, and engine support for CPU parallelism.
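To illustrate the LOD idea: engines typically swap in coarser meshes as an object gets farther from the camera, so distant objects cost far less to draw. A minimal sketch, with made-up distance thresholds purely for illustration:

```python
# Distance-based LOD selection: nearer objects get finer meshes,
# and anything beyond the last threshold is culled entirely.
def pick_lod(distance, thresholds=(50.0, 150.0, 400.0)):
    """Return a LOD index (0 = full detail, higher = coarser mesh),
    or None if the object is too far away to draw at all."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return None  # beyond the farthest threshold: skip drawing

print(pick_lod(30.0))   # 0 (full-detail mesh)
print(pick_lod(200.0))  # 2 (coarse mesh)
print(pick_lod(900.0))  # None (culled)
```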
Post edited September 06, 2020 by Time4Tea
This just in:

https://www.youtube.com/watch?v=EpDtqflQHgA
low rated
avatar
Time4Tea: I'm aware this is probably a minority viewpoint (since I am perfectly happy playing ancient games from the 80s/90s), but for me graphics have never really been a high-priority. Sure, incrementally better graphics are nice to have, but imo they're not $500+ nice to have.
Well said.

Also i'm of the same opinion.....I like good graphics somewhat, but other things matter more to me when it comes to games.
==================

The one guy: "Uhhh...even if the loan doesn't get approved (for the card) I have two kids!! Who needs two anyway"

o.0

Second guy: I am already uhhh.. surviving on 1 Kidney. That's good too, I pee less, so more game(s)!

Lol

(I am guessing it's a parody with fake subs, but still pretty funny. Also IntelHD Guy was the best of the bunch :))
Post edited September 06, 2020 by GamesRater
More seriously this time:

https://www.youtube.com/watch?v=0TcgSc9yQy0

https://www.youtube.com/watch?v=xMvrT3vrEbs

Basically, do you need to buy a new PC (motherboard and power supply at least) for the 3xxx cards?
avatar
timppu: More seriously this time:

https://www.youtube.com/watch?v=0TcgSc9yQy0

https://www.youtube.com/watch?v=xMvrT3vrEbs

Basically, do you need to buy a new PC (motherboard and power supply at least) for the 3xxx cards?
The GPU is the new motherboard; you just buy accessories for it.