xixas:
Well, RT for working with in development, or for producing advertisements, or even just tinkering: sure, why not. For games, though, I've seen it as a gimmick since the Titan RTX, I still do now with the 3090, and even with the 4xxx-series cards. I can also see it still being a gimmick by the time Nvidia rolls out their 6xxx-series cards.

I recall saying on some other forums, back when I had finally run through a barrage of testing with one and two Titan RTX cards, to "come back around the 5xxx series and take another look at it, but right now, buying for RT is a silly waste of your money". I got absolutely torn to shreds by Nvidia fanboys and called all the usual silly names, with the main theme of course being "rabid AMD fanboy" LOL

It was no different when I decided to be brave by telling people who'd bought a 2080 that it wasn't a top tier card at all. Again, they all rounded on me with cries of AMD fanboyism and jealousy!

One moron actually thought to claim that the only reason I have all the Titan GPUs was so I could lie about Nvidia LOL
I thought that by uploading a picture of my collection, which is two of each Titan GPU (except the Titan V, which I only have one of), they'd simmer down with their hysterical claims that I'm an AMD fanboy, but instead it led to that sort of comment. People were even quoting him and saying "This!" LOL

It was certainly strange for some of them on that forum to act that way over what I said about the 2080, since they surely follow these things as well, and must've known the 2080 Ti was on the horizon.

When the 2080 Ti did hit the stores, I yet again poked the bear, this time just for fun, and once again told them their 2080 Ti wasn't the top tier card either.

For those not aware, here's a Fun Fact:
The Titan RTX was not even the top tier chip. It was the top tier card... for gaming, but not the top tier chip.
Card vs Chip: once you see it, you can't unsee it. The top tier chip would be the Quadro RTX 8000, and pretty much only because it had an extra 24GB of VRAM on it. I haven't read too much into it, so the only difference may well be 24GB vs 48GB of VRAM.

So if you didn't know that, now you do, and you'll realize you should NEVER buy anything less than, for example, the 3090/4090, or perhaps the 3080 Ti/4080 Ti, though I'd always aim for the 3090/4090.
As I've said for many years now, buy anything less than these tiers and you're wading into the murky waters of defective and even damaged silicon that they've tinkered with to create a "mid-tier", "low-tier" or "budget" card. The fun part is, we'll never know LOL

Anyway, in regard to ray tracing, I saw this the other day and thought I'd post it here since we're all talking about it.
https://www.youtube.com/watch?v=SrF4k6wJ-do

By the way, that's a good channel to follow. I only discovered it while looking into the claims made by RA Tech, whom I only discovered some time last year through questioning the old FX processor benchmarks done by the big tech YouTubers. It made me want to dust off all my old motherboards and chips again just to put the issue to rest.

If that means making myself the bad man of YouTube by calling those big YouTubers out for lying, I will absolutely wear that badge with honour. I despise dishonesty, and even more so dishonest reviewers, because all they're doing is getting paid under the table to sell everyone crappy parts at heavily marked-up prices. It costs us end users both in being unhappy with our purchases and in winding up with much lighter wallets, and in the end we're all roped into paying more for less in return.

Here's one of the RA Tech videos for you all: https://www.youtube.com/watch?v=tl_Y4HXqBFQ

I'd love to hear (well, read) all your opinions on this subject. I'm happy to go dust off all my old gear and set it all up for a big round of new testing. I have pretty much all the ASUS Crosshair motherboards going back to the Crosshair IV Extreme, as well as the ASUS TUF Sabertooth motherboards, including the Sabertooth R3, which was released just before Ryzen and AM4 came out. So I've got plenty to work with.

Oh, as to your NVMes, in case you didn't know, there's a big sale on Crucial's P3 4TB drives right now.
https://www.amazon.co.uk/dp/B0B25M8FXX/ref=emc_b_5_t

I grabbed one of those yesterday, and I'll put games like Star Citizen and Elite on there, along with any multiplayer games. A 4TB NVMe for well under £300?! I'll have one of those, thank you!
PaladinNO: Though with the "RTX 3090" card thrown down, I knew I couldn't really compete. ^^
It's not a competition, just enthusiasts discussing our chosen paths.
I think of it more like people at the park discussing their animals :)

PaladinNO: Agreed. While I cannot speak for RT in a development scenario (I am not a developer or programmer), I can easily see RT eventually becoming the full-fledged replacement for rasterized lighting in games.
Baked lighting will always have its place, both for portability/compatibility and easy performance wins -- and with UE, Unity, and even Godot having built-in mobile export out of the box, it's a strategy that's not going anywhere any time soon.

Just like Dijkstra's or the A* algorithm are never going to be a "replacement" for baked navigation maps -- they're different tech that might visually approximate each other, but they serve different development goals.
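For anyone in the thread who hasn't touched pathfinding: here's a minimal sketch (in Python, with a made-up toy room graph -- all names are mine, not from any engine) of the runtime-search half of that comparison. Dijkstra's algorithm computes a shortest path on the fly, which is exactly the work a baked navigation map pays for once, offline.

```python
import heapq

def dijkstra(graph, start, goal):
    """Runtime shortest-path search over a weighted graph.

    graph: dict mapping node -> list of (neighbor, edge_cost) pairs.
    Returns (total_cost, path), or (float('inf'), []) if unreachable.
    """
    frontier = [(0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + edge_cost, neighbor, path + [neighbor]))
    return float('inf'), []

# Hypothetical level layout: rooms A..D with doorway traversal costs.
rooms = {
    'A': [('B', 1), ('C', 4)],
    'B': [('C', 1), ('D', 5)],
    'C': [('D', 1)],
}
cost, path = dijkstra(rooms, 'A', 'D')  # cost 3, path A -> B -> C -> D
```

A baked navmesh would instead store precomputed answers like this one, trading memory and an offline build step for near-zero cost at runtime -- different goals, as said above.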

PaladinNO: I remember LAN-parties. Good times. <3
...
I have never been one for online games - despite my nickname which, no, is NOT from WoW, but from Age of Empires 2 (with "NO" being the country tag) - though I really, REALLY enjoy a good co-op game, or LAN gaming.
Last really decent LAN play I think I had was Grim Dawn, and that's starting to get a little long in the tooth now.

Proud to say I've never spent a single minute with WoW --
though I dropped over 10k hours into the original Guild Wars back in the day, so I have no high-horse here.

Though I can't do the subscription model.
Let me buy the DLCs I want and I'm satisfied.
I'll get around to actually playing as time permits, but paying for time I don't have just "to be able to play" when I'm free isn't a model I'm willing to support. Kept me off EVE for a long time too, and now that they finally went freemium, I just don't care enough to jump into that one.

PaladinNO: Rest are solely PvE single-player games. Like Cyberpunk 2077, which I've spent like 600 hours on at this point, on my 4 playthroughs, and with so many mods now I fear I have lost track of them all.
I'm still on my first CP77 playthrough, as I rarely find more than a handful of hours a week to play anything -- when I do, my 3rd run through X4 eats up most of that time these days. And that's what actually prompted me to finally pick up Cyberpunk -- X3/X4 are great travel and trade games, but I can't help but miss the Mass Effect "space opera" feel. Cyberpunk scratches that itch great so far.

Simultaneously, on my previous note, X4 also plays like trash on a controller :P
Been spending a lot more game time on kb+m lately than I'd like.

Edit: Aside -- Google "x4 xixas" (or just follow this link to their forums) to see my month-long tear down of why X4's save system is broken... in which I [eventually] wrote an entire mini space sim in C++ from scratch that fully re-implements their save system working the way I believe it ought to (after they said it was impossible). Got a little heated... but I still <3 EgoSoft. I understand, it's not always easy being a dev :)

MuhEbola: The Titan RTX was not even the top tier chip. It was the top tier card... for gaming, but not the top tier chip.
Card vs Chip: once you see it, you can't unsee it. The top tier chip would be the Quadro RTX 8000, and pretty much only because it had an extra 24GB of VRAM on it. I haven't read too much into it, so the only difference may well be 24GB vs 48GB of VRAM.
While I do think about such things, I don't think they're worth mentioning to the average consumer.
As I mentioned, I picked the 3090 FTW3 because, for my purposes, it outperforms the A5000 -AND- can play games at a solid clip -- where "my purposes" require real-time rendering in both a high-end game engine and a 3D modeling suite side by side.

Saying any chip outperforms another is like saying a hybrid LP/diesel engine outperforms an electric drivetrain...
Well... what's the performance metric?

The average gamer would be well covered by a 2080 --
95%+ of games are perfectly well suited (by design) to fall within that spec.
Hell, I'd say a 1070 still covers a solid 80%+

The outliers are designed to intentionally push those boundaries, but the average gamer shouldn't be thinking about such things.

An old friend of mine owns a private racetrack in Australia. I've never found my way out there, but he tells me Ferrari and Lamborghini regularly test prototypes on their track. When I asked what he drives, he said Honda -- because it's easy to maintain and easy to replace. Loves to take his old 911 around the track on rainy days, but when he goes to the store, he drives a Honda (though I also know his wife's in a Porsche Cayenne, but that's beside the point ;)

Computers are just tools. You use the right one for the job.
Gamers don't need Titans or A-series cards any more than 16-year-olds need Ferraris or Hydrosails.
They simply wouldn't know what to do with them other than crash them :D

MuhEbola: Anyway, in regard to ray tracing, I saw this the other day and thought I'd post it here since we're all talking about it.
https://www.youtube.com/watch?v=SrF4k6wJ-do
This was actually a pretty great little video.
Over the years I think I've had to write three 3D engines from scratch, and countless more on top of OpenGL -- and to hell with DirectX, I won't touch that steaming pile... though I was a significant contributor to wined3d circa 2005 --

::sigh::
tangent... back on track

It's nice to occasionally see someone explain 3D->2D in fairly plain English... without diving into why arctan2 trumps standard sin/cos theta aggregates over a polynomial space, lol
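To unpack that arctan2 aside a little, here's a tiny Python sketch (the function names are hypothetical) of the classic reason atan2 beats reconstructing an angle from a raw sin/cos ratio: it recovers the full-quadrant heading and survives x = 0.

```python
import math

def heading_naive(x, y):
    # Recover the angle from tan(theta) = y/x.
    # atan only returns values in -pi/2..pi/2, so the quadrant is lost,
    # and the division blows up entirely at x == 0.
    return math.atan(y / x)

def heading_atan2(x, y):
    # atan2 inspects the signs of both arguments, so it returns the
    # true angle over the full -pi..pi range and handles x == 0 cleanly.
    return math.atan2(y, x)

# A point up and to the left: the true heading is 135 degrees (3*pi/4)...
assert math.isclose(heading_atan2(-1.0, 1.0), 3 * math.pi / 4)
# ...but the naive version folds it into the wrong quadrant (-45 degrees).
assert math.isclose(heading_naive(-1.0, 1.0), -math.pi / 4)
```

That quadrant folding is the whole trick, and it's why every camera/projection codebase reaches for atan2 rather than raw arctangent.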

MuhEbola: I'd love to hear (well, read) all your opinions on this subject. I'm happy to go dust off all my old gear and set it all up for a big round of new testing. I have pretty much all the ASUS Crosshair motherboards going back to the Crosshair IV Extreme, as well as the ASUS TUF Sabertooth motherboards, including the Sabertooth R3, which was released just before Ryzen and AM4 came out. So I've got plenty to work with.
Crosshair IV Extreme... now you're bringin' back memories, man.
Though I actually toasted the bridge overclocking on that one.
Had better luck with the Rampage IV Extreme -- ran that hard for 8 years as a build machine.
Coincidentally just gave it away over the weekend with an overclocked Intel 3930K to my friend's kid... a 20 year old college student... to use for Slack streaming and Google docs, lol

When I heard "what's virtualization?" I just shook my head and let it go...
Not enough office space to keep it all around.

I kinda envy your collection.
I can't bring myself to hold onto it much past my use if I can find it a new home.
But I do enjoy watching that kinda stuff from someone who has the time :)

I will say it sounds like we have a bit of a disconnect on the TUF series, though.
I've always considered that line bottom-barrel -- you know something I don't?

MuhEbola: Oh, as to your NVMEs...
Good deal!
But I'm going to... like my preferred mouse configuration... say something blasphemous here.
I've nearly stopped using internal SSDs for gaming. It's just not worth the price (+time) point.

I mean, they're great for booting, and I still keep 'em in the machine for base OS and quick startup --
Eight ~100GB bootable partitions at the moment on my primary.

But for the last year I've been doing all my actual gaming from external 1 or 2TB Samsung T7 USB-C drives.
It's not so much that they're cheaper, but I've got a stack of 10 on the desk right now that I can swap in at a moment's notice for games, backup storage, temporary laptop installs (that's what I'm running Windows from!), etc.
I think we've reached a point with bus speed that it's not really worth even putting drives IN THE MACHINE anymore.
Just my 2 cents.

Will say I haven't enjoyed a GOG chat this much in a while ;)
Post edited January 31, 2023 by xixas
PaladinNO: Though with the "RTX 3090" card thrown down, I knew I couldn't really compete. ^^
xixas: It's not a competition, just enthusiasts discussing our chosen paths.
I think of it more like people at the park discussing their animals :)
Huh, I never thought about it like that. That's a really good point, and I like it: not much different from stiffs discussing their collection of stamps.

I'll be sure to remember that comparison. :)
xixas: Last really decent LAN play I think I had was Grim Dawn, and that's starting to get a little long in the tooth now.

Proud to say I've never spent a single minute with WoW --
though I dropped over 10k hours into the original Guild Wars back in the day, so I have no high-horse here.

Though I can't do the subscription model.
Let me buy the DLCs I want and I'm satisfied.
I've never touched WoW either, or indeed, as you say, any game with a subscription.
I know the "LEEEEEROOOOY JEEEENKINS!!!!" meme, and I have watched some gameplay (just to confirm it looked as boring as WoW has sounded since day 1) and some of the cinematics, but that's where my WoW experience ends.

Really like the books, though. Blizzard (RIP; Activision can go F off for all I care) really knew how to make a good game story, and I have every StarCraft and Warcraft novel I could get my hands on, plus some Diablo novels.

I have bought Grim Dawn, as it seemed like a decent and more graphically recent Diablo-ish clone. But I got so confused with all the DLCs and extra downloads and expansions and add-ons!

So I think I settled on the "buy everything to make sure"-mentality on that game...and I haven't even installed the game yet. :|

xixas: Edit: Aside -- Google "x4 xixas" (or just follow this link to their forums) to see my month-long tear down of why X4's save system is broken... in which I [eventually] wrote an entire mini space sim in C++ from scratch that fully re-implements their save system working the way I believe it ought to (after they said it was impossible). Got a little heated... but I still <3 EgoSoft. I understand, it's not always easy being a dev :)
I tried reading the thread you posted, but it quickly got too technical for me. I had a quick stint with C++ many years ago, but the interest never got enough traction, in terms of usefulness, for me to pursue it further.

Also, with almost all posts from that moderator (and I know this is due to the formatting of the forums) I had to strain my neck, as every line he wrote filled the whole width of my screen...
I just wanted to yell at him, "press Enter and type with more space!" Your posts were much better, with some actual space between sentences. But I read enough to see your point about it getting a bit heated over there.

I've never tried the X series, I must admit. It looked interesting, but only recently (with Cyberpunk, actually) have I been looking into sim games. As for X4, there was briefly a ~120 USD X4 collector's edition available here on GOG, but it quickly disappeared for some reason.

It's not this one:
https://www.gog.com/en/game/x4_2022_bundle

The one I saw was much more expensive, and according to the site it was something like 75% discounted (at ~120 USD).

xixas: The average gamer would be well covered by a 2080 --
95%+ of games are perfectly well suited (by design) to fall within that spec.
Hell, I'd say a 1070 still covers a solid 80%+

The outliers are designed to intentionally push those boundaries, but the average gamer shouldn't be thinking about such things.

Computers are just tools. You use the right one for the job.
Gamers don't need Titans or A-series cards any more than 16-year-olds need Ferraris or Hydrosails.
They simply wouldn't know what to do with them other than crash them :D
And that is arguably accurate for any profession. The right tool for the job.
In terms of computers, I don't like the term "future-proofing", because there is no such thing (especially not with computers), but when I upgrade, I personally prefer buying something that will last me at least 4 years, 5-6 if I can manage.

MuhEbola: I'd love to hear (well, read) all your opinions on this subject. I'm happy to go dust off all my old gear and set it all up for a big round of new testing. I have pretty much all the ASUS Crosshair motherboards going back to the Crosshair IV Extreme, as well as the ASUS TUF Sabertooth motherboards, including the Sabertooth R3, which was released just before Ryzen and AM4 came out. So I've got plenty to work with.
xixas: Crosshair IV Extreme... now you're bringin' back memories, man.
Though I actually toasted the bridge overclocking on that one.
Had better luck with the Rampage IV Extreme -- ran that hard for 8 years as a build machine.
Coincidentally just gave it away over the weekend with an overclocked Intel 3930K to my friend's kid... a 20 year old college student... to use for Slack streaming and Google docs, lol

I will say it sounds like we have a bit of a disconnect on the TUF series, though.
I've always considered that line bottom-barrel -- you know something I don't?
MuhEbola:
A 3930K for Google docs? Well, using the right tools for the job aside, at least one cannot go wrong with overkill. xD

I had an ASUS TUF Sabertooth Z77, paired with an Intel i7 3770K. I didn't luck out in the silicon lottery with that CPU, so I only managed a mediocre overclock. I built it in 2013, and it worked just as well when I regrettably sold it in 2019.

I keep mixing up the OC numbers on the 3770K and the earlier QX9650, but I think I managed a 4.1 GHz all-core max on the 3770K. Anything above that and the voltage had to be way too high.

I consider the TUF series to be in the middle: ROG at the top, then TUF, then Prime at the bottom.

xixas: I've nearly stopped using internal SSDs for gaming. It's just not worth the price (+time) point.
I mean, they're great for booting, and I still keep 'em in the machine for base OS and quick startup --
Eight ~100GB bootable partitions at the moment on my primary.

Will say I haven't enjoyed a GOG chat this much in a while ;)
I agree that using a PCIe NVMe SSD for games is wasteful. If your motherboard is from the last 2 generations and comes with 5+ NVMe slots, sure. NVMe pricing is starting to come down to the point where you might just buy an NVMe drive over a SATA drive anyway.

But the differences in loading speeds are still mostly negligible. It will be interesting, though, to see what DirectStorage can do for game loading speeds in the next couple of years. But given the listed SSD speed requirements to benefit from it, if I understand correctly, I know my NVMes won't cut it.

And you're killing me! Twisting the knife while pouring in salt! 8 partitions, and boot partitions at that, on a single drive!? Maybe the Windows Boot Loader is smart enough now; I don't know, and I cannot speak for other operating systems.

Is it even stable? I would imagine that it is, else you would probably not have done it like that, but I cannot imagine how.

I tried some triple-booting off a single drive before the whole NVMe thing was even a thing, and it was a stability nightmare! Windows XP, 7 and 98 on separate partitions on a single drive, as I figured the Boot Loader would handle 3 Windows OSes just fine - but no, it seemed random which OS I got every time I restarted the system.

But I've learned, and now prefer to use separate physical drives in a hot-swap cage. A bit jank, aesthetically, but it works like a charm.

Having been a hardware and Windows guy for the last 20 years, I feel a little out of my league here when Linux gets mentioned, but yeah, this is a refreshing thread compared to the majority on here - such as "why no game work QQQ", with no mention of specs or anything :D
Post edited January 31, 2023 by PaladinNO
MuhEbola: So I noticed that the reviews have been bombed for this game; obviously a dedicated team is at work on GOG doing this. I tested it myself, and sure enough, I'd already got a downvote some time between last night, when I posted it, and this morning.
[...]
Right now the game is at 3.7/5, or 74/100, which is a pretty fair rating, if even a bit overrated, in my book. As I cannot find any good arguments besides "Wäh, me old gamer legend, newb gamers wimps" (note: I'm from the same era, maybe a couple of years later; I started with the NES/Master System, basically), I'll try to explain via some examples why I think Cyberpunk is far from a masterpiece and barely above average.

1. Linear, railroaded story.
Speaks for itself. Decisions have extremely little impact on anything, and no matter what you choose, you always end up almost precisely where you're supposed to end up. Or to make a comparison, the endings feel like the Red, Green, Blue of another big RPG franchise. Their predecessor work (TW3) did it a lot better, so of course expectations were high and were underdelivered on. No matter how good a product might be on its own, if it falls short of what came before, people will always be angry to some degree - rightfully so, in my opinion.

And the single most contentious point(s) in the game: remember your first meeting with Jackie after the Scav dump? You know all about fixers, and how they find the cheapest gonk for the job and then drop 'em in the trash pile? Leading to "The Heist" mission. I would NEVER have: a. slotted the Relic; b. gone into the same room as Dexter after that bust of a job; c. even IF I were brain-amputated enough to knock on that door, the very first thing I would have done is put a bullet through Oleg's cranium, then send the rest of the mag right after to make sure, keep Dexter in my crosshairs, and ask him whether or not he has an escape plan. Shoot him if he utters ANYthing other than a resounding and firm "yes".
But, 'aight, you have to get the player into the main character's story somehow.

2. Not being the protagonist.
Don't kid yourself: as the player you are nothing but a glorified taxi driver for the hero of this crapshoot of a story, Johnny Silverhand. On the one hand, I get it, star power and all that; on the other hand, I much prefer an RPG to put me, the player, in the role of the protagonist. It doesn't have to be a power fantasy (Mass Effect did that well, Skyrim did it badly); I'm fine with being the scrub who has to earn their name by deeds alone (Morrowind). What I don't like is not mattering at all. If Jackie hadn't pulled the Relic shortly before shuffling off his mortal coil, Silverhand would have overwritten his noggin'.

3. Oh my god, the bugs: https://www.youtube.com/watch?v=8FbCyh9ivBk
I admit I was mostly spared here, being on PC and all, but I managed to fall through the map a couple of times; I think I stumbled on a soda can or something. And if you look up the respective videos, it was atrocious for a lot of people. Yes, I know TW3 was a pretty bad shitshow as well when it launched, though I remember it as not being as bad; my memory might be pulling a Silverhand here.

The only saving grace, and ultimately the thing that pushes it above the midway point, are the gigs and other side missions, which are in part exceptionally well written. So 60/100 from me, which is why I gave it 3 stars out of 5. And this love/hate relationship has not changed after 100+ hours in the game. The main story is absolutely horrible crap, so much so that it deducts points; if Silverhand weren't in the game, I'd easily give it an 80 after v1.6. If all the side missions weren't of the quality they are, then purely for the main story this game would get a 30-40 tops, even after the cleanup of v1.6.

Note: I can't really wrap my head around why this escalated into a spec wank-off, but why not:
i7-4790K & Radeon RX 6700 XT. But hey, back when it dropped I was still on my R9 290X, and it was playable on medium :D
MuhEbola: So I noticed that the reviews have been bombed for this game; obviously a dedicated team is at work on GOG doing this. I tested it myself, and sure enough, I'd already got a downvote some time between last night, when I posted it, and this morning.
[...]
Amokhunter: Right now the game is at 3.7/5, or 74/100, which is a pretty fair rating, if even a bit overrated, in my book. As I cannot find any good arguments besides "Wäh, me old gamer legend, newb gamers wimps" (note: I'm from the same era, maybe a couple of years later; I started with the NES/Master System, basically), I'll try to explain via some examples why I think Cyberpunk is far from a masterpiece and barely above average.

Note: I can't really wrap my head around why this escalated into a spec wank-off, but why not:
i7-4790K & Radeon RX 6700 XT. But hey, back when it dropped I was still on my R9 290X, and it was playable on medium :D
I know you didn't say that I said the game is a masterpiece, but I don't think the game is a masterpiece at all.

My whole thing was the review bombing that is verifiably happening. Just look at the reviews yourself and you'll see, just as I pointed out with my own test of it.

As to the "spec wank-off" LOL, let me just say, this is the difference between gamers and enthusiasts.
That's not a knock on you at all; it's just that there are people who play games on the hardware they buy, and then there are people who see their hardware as a game in itself.

Invite me round to your house some time and I will attempt to undervolt and overclock your CPU, GPU, memory... monitor, RGB lighting, the main lights around your home... the toilet too, because once I undervolt and OC your cooker, you're going to be making tendies at rates you never imagined possible before, and you'll be using the toilet a lot more as a result LOL
xixas:
I think it'd be better all round if more people understood the whole Card vs Chip way of seeing things.

From my first Titan, it was clear to me that bolting on more VRAM placed a chip higher up the stack.

780ti - GK110
Titan - GK110

Same GK110, but vastly different end results.
Back then, anyone with a 780 Ti would be putting it up for sale at, at best, half price, but mostly less than half price.
My Titan, however, held a good resale value for many years, long after the 780 Ti had landed itself in the bargain bucket.

That follows suit right up to the last Titan, the Titan RTX. The only two stand-out cards, from memory, would be the 1080 Ti and the 2080 Ti. I got a 1080 Ti just to see if what was being reviewed was true, and sure enough, it was close enough that you could happily have bought that instead of the Titan. The power of the chip could meet the VRAM. It was just sooooo good.

It's why I've said what I say since that first Titan of mine. Stop buying those non-Ti cards. No matter how they are pushed by reviewers (who are often just flat-out lying to you because they get paid to) or Nvidia's own marketing (also known for lying to you), they're not in any way "good" cards in the end, and they will see you needing to upgrade more often.

My argument there would be for those buying the low/mid-tier cards: even if I were to stick with buying just one Titan instead of two, buying the low/mid-tier cards very likely means you'll end up spending more money than I would have.

The other effect of buying nothing less than, for example, the 3080 Ti/4080 Ti should be to push companies, and in turn wafer manufacturers, to increase the output of "perfect" chips. By 4nm, we absolutely should have mastered the process to the point of hitting a 99% "perfect" chip production rate, but as we can all see from all these lower-tier CPUs and GPUs, that hasn't happened yet.

Now, I know you'll always have those anomaly chips that, while "perfect", just won't perform as they should, yet are flawless in every other way. Those anomaly chips would then become your lower-tier parts.
For now, we're still in the murky waters of defective chips being tinkered with to produce low-tier parts.

When the vast majority of people out there are buying those lower-tier parts, because of the 'budget argument', you get this side effect of developers being held back a lot, because they have to, again for money reasons, program their games to accommodate those lower-tier parts as well.

I thought Crysis was the start of developers just blowing people out of the water, forcing us to buy high-end all the time, but then it went back to a slow crawl of graphical progression.

At least that's my way of thinking.
Post edited February 01, 2023 by MuhEbola
xixas: Crosshair IV Extreme... now you're bringin' back memories, man.
Though I actually toasted the bridge overclocking on that one.
Had better luck with the Rampage IV Extreme -- ran that hard for 8 years as a build machine.
Coincidentally just gave it away over the weekend with an overclocked Intel 3930K to my friend's kid... a 20 year old college student... to use for Slack streaming and Google docs, lol

When I heard "what's virtualization?" I just shook my head and let it go...
Not enough office space to keep it all around.

I kinda envy your collection.
I can't bring myself to hold onto it much past my use if I can find it a new home.
But I do enjoy watching that kinda stuff from someone who has the time :)

I will say it sounds like we have a bit of a disconnect on the TUF series, though.
I've always considered that line bottom-barrel -- you know something I don't?
At least back on AM3+, the gist of it was that the differences were merely slightly fewer USB ports and fewer ASUS trinkets in the box. Because they all have to meet the AM3+ standard anyway, you can rest assured you'll get the same out of your FX 8000-series CPUs, and even the 9590.

A noteworthy mention would be the differences in memory speeds out of the box. On the Formula V Z you have "up to 2400MHz", compared to the Sabertooth R2's "up to 1866MHz".
But we all know that if you're buying those boards, you're going to be overclocking, and it was doable to get your memory overclocked to 2400MHz. For a more plug-and-play experience, you'd have got the Formula Z boards, since it would be easier to hit that rated 2400MHz.

To be honest, this was something I didn't really think about back then; I'm only thinking about it now LOL

I know people who ran those FX 9000-series chips (myself included) on motherboards that say they can only accommodate parts up to a certain wattage, but they stuck the 220W chips in anyway, and of course they worked fine.
Even on their respective CPU support pages, they merely warn that you need liquid cooling.

I have all the Sabertooth boards from that era, including the Sabertooth R3, which was released just before AM4 and Ryzen. I had to have it! LOL
MuhEbola: My whole thing was the review bombing that is verifiably happening. Just look at the reviews yourself and you'll see, just as I pointed out with my own test of it.

As to the "spec wank-off" LOL, let me just say, this is the difference between gamers and enthusiasts.
That's not a knock on you at all; it's just that there are people who play games on the hardware they buy, and then there are people who see their hardware as a game in itself.

Invite me round to your house some time and I will attempt to undervolt and overclock your CPU, GPU, memory... monitor, RGB lighting, the main lights around your home... the toilet too, because once I undervolt and OC your cooker, you're going to be making tendies at rates you never imagined possible before, and you'll be using the toilet a lot more as a result LOL
Hm... I checked just now, actually. Sorted the reviews by "most recent" and looked at the newest 30:

5 Stars - 17 (yours among them)
4 Stars - 4
3 Stars - 1
2 Stars - 3
1 Star - 5

Nope, can't see any review bombing goin' on. Oh, and most of the 1-stars were rage-ratings because something didn't work as expected; the one with the FPS remark stinks of "My PC is 10 years old, Y u no work?", but you have those everywhere in some form or another.
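For what it's worth, here's my own back-of-the-envelope sanity check (just a quick script over the counts tallied above, not anything from GOG's tooling) showing what that 30-review sample averages out to:

```python
# Star distribution from the 30 most recent reviews tallied above.
counts = {5: 17, 4: 4, 3: 1, 2: 3, 1: 5}

total_reviews = sum(counts.values())                       # 30
total_stars = sum(star * n for star, n in counts.items())  # 115
average = total_stars / total_reviews

print(f"{total_reviews} reviews, average {average:.2f}/5")  # 30 reviews, average 3.83/5
```

A ~3.8/5 sample average sits right in line with the game's overall score, which is why I don't see a coordinated bombing of the review scores in the data.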

And yeah, I get the "Enthusiast" thing, was one myself, though my Scrooge McDuck conscience bonks me on the head every time I think about going for the 3090/6900 option and slaps me hard back to reality, so my inner voice sounds something like this:
"Yo, dumb MFer, ya think getting 5% more performance while paying 200% of the money is efficient, dumba**?"
Aaaand I end up in the (upper) middle class of cards, which usually delivers the best price/performance options.

As for your OC offer, nah, been there, done that. It has become too much hassle, and for proper OC nowadays you need a compressor system, which is fairly expensive to set up and frackin' loud. So hard pass ;)
avatar
MuhEbola: My whole thing was the review bombing that is verifiably happening. Just look at the reviews yourself and you'll see, just as i pointed out with my own test of it.

As to the "spec wank-off" LOL, let me just say, this is the difference between gamers and enthusiasts.
That's not a knock on you at all; it's just that there are people who play games on the hardware they buy, and then there are people who see their hardware as a game itself.

Invite me round to your house one time and i will attempt to undervolt and overclock your CPU, GPU, memory... monitor, RGB lighting, the main lights around your home... the toilet too, because once i undervolt and OC your cooker, you'll be making tendies at rates you never imagined possible, and you'll be using the toilet a lot more as a result LOL
avatar
Amokhunter: Hm... I checked just now, actually. Sorted the reviews by "most recent" and looked at the newest 30:

5 Stars - 17 (yours among them)
4 Stars - 4
3 Stars - 1
2 Stars - 3
1 Star - 5

Nope, can't see any review bombing goin' on. Oh, and most of the 1-stars were rage-ratings because something didn't work as expected; the one with the FPS remark stinks of "My PC is 10 years old, Y u no work?", but you have those everywhere in some form or another.

And yeah, I get the "Enthusiast" thing, was one myself, though my Scrooge McDuck conscience bonks me on the head every time I think about going for the 3090/6900 option and slaps me hard back to reality, so my inner voice sounds something like this:
"Yo, dumb MFer, ya think getting 5% more performance while paying 200% of the money is efficient, dumba**?"
Aaaand I end up in the (upper) middle class of cards, which usually delivers the best price/performance options.

As for your OC offer, nah, been there, done that. It has become too much hassle, and for proper OC nowadays you need a compressor system, which is fairly expensive to set up and frackin' loud. So hard pass ;)
No, i mean the downvotes on reviews, not the score the reviewer leaves.

There's review bombing where people leave negative reviews, and then there's also... review bombing where people bomb your review by downvoting you.

As to the "5% more performance": at 1080p/1440p/4K, high/very high/ultra?
I can assure you that playing at 4K Ultra AND maxed sliders on the 3090, compared to the 6700 XT you said you're using, is far beyond "5% more performance", even if you were using "5%" as just a general number thrown out for the sake of throwing one out.

Going your route of buying low-tier cards, you'll find that you will very likely be spending more than i do long term on cards. The same goes for CPUs too.

This is the REAL "price/performance" argument that i make, which of course isn't the one the big tech youtubers make - which is likely why you bought a low-tier card: they said "price/performance" and you thought 'i'm sold'.

Sorry my dude, but they lied to you. That's why they're the big tech youtubers: they get paid to shift the trash and leftovers from Intel/Nvidia.

Notice that the youtubers that tell you the truth about Intel/Nvidia get barely any traction on youtube? Again, once you see it, you can't unsee it.

My 3090 and that other guy's will see us doing fine at 4K Ultra with maxed sliders, and we can skip a couple or a few gens of cards, while you'll probably be on Amazon or Newegg looking at upgrades by summertime this year.
avatar
Amokhunter: Right now the game is at a 3.7/5, or 74/100, which is a pretty fair rating, if a bit overrated even, in my book.

1. Linear, railroaded story.
Speaks for itself. Decisions have extremely little impact on anything, and no matter what you choose, you always end up almost precisely where you're supposed to end up.

And the single most contentious point(s) in the game: remember your first meeting with Jackie after the Scav dump? You learn all about fixers and how they find the cheapest gonk for the job, then drop 'em in the trash pile - leading to "The Heist" mission. I would have NEVER:
A) Slotted the Relic.
B) Went into the same room as Dexter after that bust of a job.
C) Even IF I were brain-amputated enough to knock on that door, the very first thing I would have done is put a bullet through Oleg's cranium, then send the rest of the mag right after to make sure, keep Dexter in my crosshairs and ask him whether or not he has an escape plan.
Yeah, the linear parts are what utterly infuriate me at times.
On my last playthrough, I, uh... left Watson before the lockdown lifted (there are methods) and grinded my way up to 20 Body, carrying several Legendary weapons, before I even did The Heist mission with Jackie - and with stats like that, what the actual F is Oleg supposed to be able to do to knock me out with a single punch!?

As for the overall rating, I'd give CP77 a 4/5. It's not a 5/5; it's WAY too limited to be perfect:
- Pedestrian and driving AI is universally worse than braindead.
- No flying cars (mods don't count). Not even self-driving ones. You can't use the Metro either (it was cut from the game, and again, mods don't count).
- No one in Night City can apparently upgrade a weapon for money. That's apparently globally exclusive to the Engineering stat (hello, what is Wilson actually doing for a living...?).
- As you say, Lifepaths are essentially worthless, other than giving you a couple of "best outcome for free" moments throughout the game.
Nomad gets you a car (one of the best, I have to admit), a supposedly faster version of Jackie's bike - not that I could tell any difference - and a purchasable car in a different colour.
- Weapon scaling: if you grab a weapon too early, it may only be Rare or Epic at best, and unless you go Engineering, you have to come back later for the Legendary version, simply because you suddenly gained levels. What, did someone randomly remove the previous gun and put a Legendary one in its place?

Where is, for example, "I am Corpo - I can buy anything"? Or Nomad's increased weapon base stats due to inherent improved tinkering skills? Or Street Kid's improved melee / fighting stats from growing up in a tough neighbourhood?

avatar
Amokhunter: Note: Can't really wrap my head around why this escalated into a spec wank-off, but why not:
To enthusiasts, a proverbial gauntlet is thrown when the numbers "30" and "90" are uttered without any grammatical separation in a computer setting.

But I can see how that is a ridiculous game to non-players, no offense. I am only a part-time player myself, as my priority is first and foremost system stability over performance - power supply first, then motherboard, and the rest as needed. Which is also why I don't do watercooling and instead spend 180 EUR on case fans (6x Corsair ML140 - set up a proper fan curve in the BIOS and you get quiet when you want it and performance when you need it).

You should be able to OC the snot out of that 4790K though, with sufficient cooling. It's a good chip.

avatar
MuhEbola: My 3090 and that other guy's will see us doing fine at 4K Ultra with maxed sliders, and we can skip a couple or a few gens of cards, while you'll probably be on Amazon or Newegg looking at upgrades by summertime this year.
Both you and the other guy, unless I missed something, use the extra VRAM for actual work - rendering and the like. That puts it on another level, and changes the "want" into more of a "need". The fact that both of you can also max all the sliders in games is arguably just a by-product.

Pricewise, my RTX 3080 was expensive as hell for what it is. Personally, I could probably have made do with my previous GTX 1080 for another year or 2, but I "wanted" a new GPU and decided I didn't "need" the 3090.

Ironically, had I waited those 1-2 years, I could have had the RTX 3090 Ti for almost the same price as what I paid for the 3080. Hindsight 101. :|
Post edited January 31, 2023 by PaladinNO
I gotta say, I had one cool moment with ray tracing during my playthrough. I was in some gang hideout, going up an escalator. The escalator had a glass side, and when I was at the top I thought I saw an enemy through the glass and shot at him, but it was in fact a reflection. The guy was behind me!
Post edited January 31, 2023 by Mika1
avatar
MuhEbola: So i noticed that the reviews have been bombed for this game - obviously a dedicated team is at work on GOG doing this - and i tested it myself and sure enough, i got a downvote some time between last night, when i posted it, and this morning.

This is just petty. Very petty.

If possible, the votes on reviews should be reset.

If gamers can't even be trusted to provide HONEST feedback to help their fellow gamers, then perhaps the voting system itself should simply be abandoned.

We already have enough dishonesty with the big reviewers out there, without having it from gamers themselves.

The complaints i've seen about this game at launch were things i had seen endless times before, across the whole board and on various systems, so it's nothing new. Yet for some reason, this game gets trashed as if it were No Man's Sky at launch.

I just don't get it. Maybe i'm missing something. However, as someone who has played games since the Atari 2600, and been into computers and consoles ever since, i have basically seen it all. This game's launch problems and issues were nowhere near as bad as the hatred made them out to be.

Perhaps my problem is that i have an attention span and can manage my emotions - or better yet, i don't get emotional when a game doesn't work as i expected it to. Among all the older gamers i know, i remember the common talking point being "Yeah, i'll just come back to this in a few months and finish off some of the other games on my list".

For the TL;DR generation though, they want it all NOW and they want it their way RIGHT NOW, otherwise the whole world should burn down because they couldn't get their way.
I thought it was only toddlers that stamped their feet and screamed like that, and that we're all supposed to grow out of it, but i can see this sadly hasn't happened for many.

/rant
Why exactly is it "review bombing" just because a bunch of people were upset about something and left negative reviews? Just because there are a lot of them and they have a different opinion than you?

I would agree that Cyberpunk 2077 is a "good" game overall (not classic or even great), but I don't agree with the idea that player reviews should be disregarded, edited, or dismissed just because you don't think they're reasonable.

It's funny to see people act like they're the more intellectually mature person in a dispute when they try to dismiss other people's opinions as entirely invalid just because they don't like them.

And tbh, I think player reviews tend to be much more accurate than game journalist reviews.

By the way, a lot of people out there consider it "review bombing" if a lot of game buyers leave reviews complaining about a controversial DRM method used in the game because the review is not strictly related to the game's content. Sound good to you?

The people pushing this "review bombing" idea are probably corporate game publishers who want companies like Steam to edit or limit reviews because they view that as preferable to actually trying to make a product those demanding consumers want. It's an anti-consumer way of thinking, much like DRM. I think people should be able to write literally whatever they want in reviews for games they pay for, and anything less than that diminishes my trust in the review system.

I think the current player review score on GOG is pretty close to reasonable. If absolutely all of the issues with the game were fixed, I might rate it 85/100 personally. With a few of the small issues factored in, I would maybe give it an 80/100... which isn't a bad score at all. And indeed, the GOG review score is close to that at 3.7/5, which is equivalent to 74/100 and thus pretty close to my imagined review.
Post edited February 01, 2023 by temps
avatar
temps: Why exactly is it "review bombing" just because a bunch of people were upset about something and left negative reviews? Just because there are a lot of them and they have a different opinion than you?

I would agree that Cyberpunk 2077 is a "good" game overall (not classic or even great), but I don't agree with the idea that player reviews should be disregarded, edited, or dismissed just because you don't think they're reasonable.

It's funny to see people act like they're the more intellectually mature person in a dispute when they try to dismiss other people's opinions as entirely invalid just because they don't like them.

And tbh, I think player reviews tend to be much more accurate than game journalist reviews.
I think you missed the point here. The "review bombing" referred to is people just pressing dislike on any positive review of the game, because they themselves don't like the game and want to "hate" on everyone who does like it.

How is that for intellectual maturity.

Personally, I read the negative reviews first if I'm considering a game and for some reason don't want to go in "cold" (i.e. unbiased) like I usually do. I sometimes want to see what people think is wrong with the game - though I instantly skip a review if it's obvious it's just someone hating because they don't have the PC horsepower to properly play the game with the available graphics.

I also think Cyberpunk 2077 is a "good" game. Subjectively, I might even call it a great one for the graphics and all that it offers in terms of gameplay, even if that objectively maybe isn't so, given all its flaws and bugs. That's the downside of modding: if the game crashes because I installed mods to get features that I think should already be in the game but aren't, I can't blame the game anymore when it starts crashing as a result of those mods.
avatar
temps: Why exactly is it "review bombing" just because a bunch of people were upset about something and left negative reviews? Just because there are a lot of them and they have a different opinion than you?

I would agree that Cyberpunk 2077 is a "good" game overall (not classic or even great), but I don't agree with the idea that player reviews should be disregarded, edited, or dismissed just because you don't think they're reasonable.

It's funny to see people act like they're the more intellectually mature person in a dispute when they try to dismiss other people's opinions as entirely invalid just because they don't like them.

And tbh, I think player reviews tend to be much more accurate than game journalist reviews.
avatar
PaladinNO: I think you missed the point here. The "review bombing" referred to is people just pressing dislike on any positive review of the game, because they themselves don't like the game and want to "hate" on everyone who does like it.

How is that for intellectual maturity.
If we go by what's in the OP's original comment, I think it's open to interpretation whether he meant review bombing of the actual game in its reviews vs. review bombing of positive reviews within that game's review section.

And considering "review bombing" is usually used to refer to large numbers of people leaving negative reviews on a product, rather than on positive reviews of the product, I don't think my interpretation is unreasonable, even if someone perhaps moved the goalposts later on in the thread.

I don't see how it could be called "review bombing" if people who disagree with a player's positive review mark it down. That's actually what they are supposed to do... mark down reviews that they consider unhelpful.
avatar
PaladinNO: I think you missed the point here. The "review bombing" referred to is people just pressing dislike on any positive review of the game, because they themselves don't like the game and want to "hate" on everyone who does like it.

How is that for intellectual maturity.
avatar
temps: If we go by what's in the OP's original comment, I think it's open to interpretation whether he meant review bombing of the actual game in its reviews vs. review bombing of positive reviews within that game's review section.

And considering "review bombing" is usually used to refer to large numbers of people leaving negative reviews on a product, rather than on positive reviews of the product, I don't think my interpretation is unreasonable, even if someone perhaps moved the goalposts later on in the thread.

I don't see how it could be called "review bombing" if people who disagree with a player's positive review mark it down. That's actually what they are supposed to do... mark down reviews that they consider unhelpful.
First sentence of my OP -
"So i noticed that the reviews have been bombed for this game - obviously a dedicated team is at work on GOG doing this - and i tested it myself and sure enough, i got a downvote some time between last night, when i posted it, and this morning."

As i said to another commenter here -
"There's review bombing where people leave negative reviews, and then there's also... review bombing where people bomb your review by downvoting you."

When you look at the reviews and see a concentrated effort to go around disliking people's reviews, that's review bombing.

Another name i have seen for this act in recent times would be getting "Ratioed". But that's new-ish to me.