Fun thread -- spec shares.
Every "back in my day" gaming progression conversation ends up here eventually :)
I went the AMD route for this build as well -- the performance per dollar spent is just too good to pass up.
Nearly 20% lower TDP than comparable Intel chips, and at the time I was able to pick it up for nearly half the price of Intel's headliners.
--
PC
• AMD Ryzen 9 5950X
• EVGA GeForce RTX 3090 FTW3 Ultra Gaming, 1800 MHz, 24 GB GDDR6X
• 128 GB (4x32) Corsair Vengeance DDR4 3200 MHz, CL16
• ROG Crosshair VIII Dark Hero motherboard
• 2x 2 TB Samsung 980 Pro PCIe 4.0 NVMe M.2
• 1000 W Seasonic PRIME Ultra Titanium PSU
• 3x Noctua NF-A14 PWM 140mm front intake
• EK-AIO Elite 360 liquid cooler (6x 120mm fans, top exhaust)
• 1x Fractal Silent Series 140mm rear exhaust
--
Display
• 3x LG UltraGear 32" QHD (2560x1440) 165 Hz
--
Input -- included just because gamers often consider my setup blasphemous, lol
• Elecom HUGE Trackball
• Microsoft Trackball Explorer 1.0 (when I'm feelin' the old school ergo itch)
• Wacom Intuos Pro M pen/tablet
• DualShock 4 controller
• Xbox 360 controller
... no "normal" mice (I like my wrists, thanks)
I do keep one on the edge of the desk, though, to be kind to others
--
Liquid cooling's fully dedicated to the CPU, both because my compile jobs regularly push it up to ~98°C even with all that cooling and because I don't want the hassle of dealing with the water block when I swap video cards.
Edit / Addition: While I know everyone raves about the 5800X3D for gaming, when it comes to -building- games, the extra 16 compile threads make a world of difference in the 5950X's favor.
Daily driver OS is Manjaro, but I also keep Pop!_OS and Linux Mint installed for testing.
There's a separate laptop (ROG Zephyrus, RTX 3070) that runs Windows half the time for when I absolutely must test on that horrid operating system, heh, and then an M1 MacBook Pro and a couple of Mac Minis for macOS/iOS builds and testing.
PaladinNO: If you're doing render work, the 3090 was a no-brainer at the time! That's actually something where the MSRP of the 3090 could be defended.
Indeed :)
Mainly, though, while I very much do use all that onboard RAM, it's about the over 10K CUDA cores --
It outperforms a Quadro A5000 for my render needs for $2000 less.
...really wanting a 4090 now, though, to play with some of those new AI features!
MuhEbola: But, ray tracing still is a gimmick, and I haven't bothered with it much at all, except just to see the differences now and again in some games.
Can't agree with you on that one. Ray tracing's fantastic.
I think most devs just haven't gotten a handle on how to best use it in their games yet.
But the newer Cycles OptiX renderer in Blender with CUDA+RT enabled is amazing -- it's improved my real-time lighting workflow times literally tenfold.
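For anyone who wants to flip that on from a script instead of clicking through Preferences, this is roughly the shape of it -- just a sketch (Blender 3.x Python; your detected device list will obviously differ from mine):

# Rough sketch: enable the OptiX backend for Cycles from Blender's Python
# console (Edit > Preferences > System does the same thing via the UI).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # CUDA is the fallback on pre-RTX cards
prefs.get_devices()                   # refresh the detected device list

# Turn on every detected GPU and leave the CPU off for pure-GPU renders.
for dev in prefs.devices:
    dev.use = (dev.type != "CPU")

# Tell the current scene to actually render on the GPU.
bpy.context.scene.cycles.device = "GPU"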
But RT purely for games?
I'll agree, that's another one on the "not yet worth the money" pile --
though it's not like you have much choice with Nvidia going full RTX.
MuhEbola: When everyone was scrambling to grab the 2xxx-series RTX cards, I grabbed two Titans as I always have, and even with both, ray tracing was just a gimmick. That's TWO Titan RTX cards, mind you.
...
DLSS is nothing more than a life saver for those still using potatoes for computers.
I believe you commented in another thread that, unfortunately, SLI is super dead at this point -- and since it never worked right on Linux to begin with, I just ignore it. I have set up rigs in the past with multiple unlinked cards (no SLI/CrossFire), though, either to batch render jobs or to host single-system LAN parties by splitting the X sessions and locking each keyboard/controller to its own session/screen.
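The batch-render half of that is pretty simple these days. Roughly something like this -- the .blend name, frame split, and GPU indices are made up, and pinning each job with CUDA_VISIBLE_DEVICES is just one way to do it:

# Rough sketch: one headless Blender render per GPU, each pinned to its own
# card via CUDA_VISIBLE_DEVICES so the jobs don't fight over the same device.
import os
import subprocess

jobs = [
    {"gpu": "0", "start": 1,   "end": 120},   # card 0 renders frames 1-120
    {"gpu": "1", "start": 121, "end": 240},   # card 1 renders frames 121-240
]

procs = []
for job in jobs:
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=job["gpu"])
    procs.append(subprocess.Popen(
        ["blender", "-b", "scene.blend",
         "-s", str(job["start"]), "-e", str(job["end"]), "-a"],
        env=env,
    ))

for p in procs:
    p.wait()   # block until both renders finish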
Remember LAN parties?
Remember when people actually wanted to game in the same room together?
I'm never going to understand the always-online gaming trend.
If it isn't couch co-op, I'm not playing with you.
I'm old enough that co-workers cringe when I say things like "I prefer phone calls to text messages"
Meh, is what it is, lol
DLSS is an interesting one. I think it's a good bit of tech to have on hand as your card ages, so you can keep playing the newest titles.
CP77's actually the first game I've ever -had to- enable it on to get the experience I want, though.
Sure, the upscaling technically muddies the textures a bit, but with the vivid, mostly flat color palette and the game's quick pace, I barely notice, and I'm happy to have the extra ~20 fps on average.
Of course, I'm limiting it a bit and maintaining an average card temp just under 70°C.
I've got real work for the card, I'm not trying to toast it (or heat the office) in my free time.
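If anyone's curious, I keep an eye on it with NVML rather than eyeballing an overlay. Something along these lines -- the 70°C line is just my own comfort number, and the actual cap gets set separately (e.g. nvidia-smi's power-limit option), so treat this as a sketch:

# Rough sketch: poll GPU temperature and power draw via NVML
# (pip install nvidia-ml-py). The hard limit itself is set elsewhere,
# e.g. `sudo nvidia-smi -pl <watts>`; this just tells me when I'm drifting
# past my own ~70 C comfort line. Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)        # first (only) GPU

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
        flag = "  <-- over my comfort line" if temp >= 70 else ""
        print(f"{temp} C  {watts:.0f} W{flag}")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()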
MuhEbola: The big one for me though, and as you'll already know by now as well, very few games are 'strolling simulators'. That for me is why ray tracing is pure gimmick.
This is a good point.
I've never really been into first person shooters, and that includes CP77. I'll throw the occasional knife, but the rest is all hacking. All those shiny guns go straight to the wider retail market :P
But while walking simulators make up a very small portion of the gaming market, I really enjoy them, and with companies like Cyan getting back in the game, there might be some very good RT uses in that space over the next few years.
Real-time "god rays" in rural RPGs and fog-dense space sims also add a really nice touch for immersion's sake, and that goes double for VR environments.
But then there are also the "mathy bits" -- RT tech isn't limited to visible light.
It can be used for all sorts of wide-spectrum ray casts -- gauging object distance across millions of nodes, multi-point navigation, complex collision prediction, and so on.
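To make that concrete, the query itself is just geometry. Here's a toy CPU version of a "distance to millions of nodes along a ray" test -- plain numpy, nothing RT-core-specific, the point of the dedicated hardware is running this exact kind of query through OptiX/DXR/Vulkan ray queries instead:

# Toy illustration of the "mathy bits": the same ray math the RT cores
# accelerate, used for plain distance queries rather than lighting.
import numpy as np

def ray_sphere_distance(origin, direction, centers, radii):
    """Distance along a ray to each of many spheres (inf where it misses)."""
    direction = direction / np.linalg.norm(direction)
    oc = centers - origin                            # ray origin -> each node center
    t_mid = oc @ direction                           # projection onto the ray
    d2 = np.einsum("ij,ij->i", oc, oc) - t_mid**2    # squared distance from center to ray
    hit = (d2 <= radii**2) & (t_mid > 0)             # in front of origin, within radius
    t = np.full(len(centers), np.inf)
    t[hit] = t_mid[hit] - np.sqrt(radii[hit]**2 - d2[hit])
    return t

# a couple million "nodes" (nav points, collision proxies, whatever)
rng = np.random.default_rng(0)
centers = rng.uniform(-100, 100, size=(2_000_000, 3))
radii = rng.uniform(0.5, 2.0, size=2_000_000)

dist = ray_sphere_distance(np.zeros(3), np.array([1.0, 0.0, 0.0]), centers, radii)
print("nearest hit at t =", dist.min())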
Developers have really just started scratching the surface of dedicated hardware ray-tracing's potential now that it's become widely available. Give it 5-10 years for everyone to catch up :)
PaladinNO: I've never cared about the graphics in a game - I value gameplay and story and lack of bugs way higher than how a game looks. Which is why I still greatly enjoy games like Half-Life 2 and Portal 2.
Very much this.
While I enjoy a talented artist, if the mechanics aren't there, the game's going in the trash.
I'm still playing HoMM3. Don't care if it's going on 25 years old, it's still one of the best in the genre.
Age of Wonders III is great too, but not just because "ooo 3D".
Regarding that kind of quality, and given the forum we're in and the OP's topic: as engaging as I've found Cyberpunk so far, stated or not, it's an obvious Sleeping Dogs clone that tried to address the common player complaint of "why so few guns?" -- and in that light, I gotta say, for all the things they got right, some things are significantly lacking.
Notably, given the legacy, I'd have appreciated a little more (any?) effort put into hand-to-hand combat, and the driving is holy-omg-hell terrible.
Also, I don't know what's up with the controller support, but it just feels off. I play most first-person titles with either the DS4 or the Xbox controller, but CP77 doesn't play well with either -- the input feels really "jerky". Not sure if it's a smoothing issue or maybe a calibration problem going through the SDL/Wine layer, but if it's config, it isn't affecting any other titles, and KB+M plays fine. Is it just me, or does anyone else notice controller play feeling wonky?
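In case anyone wants to rule out the hardware/driver side like I did, a quick axis dump outside the game shows whether the stick data itself is smooth (pygame here just because it was handy -- evtest works too; this is just a sketch, not anything CP77-specific):

# Rough sketch: print raw left-stick values from the first detected pad, to
# see whether the jerkiness exists outside the game. Ctrl+C to stop.
import time
import pygame

pygame.init()
pygame.joystick.init()

pad = pygame.joystick.Joystick(0)   # first connected controller
pad.init()
print("Reading from:", pad.get_name())

while True:
    pygame.event.pump()             # let pygame refresh device state
    x = pad.get_axis(0)             # left stick X, roughly -1.0 .. 1.0
    y = pad.get_axis(1)             # left stick Y
    print(f"x={x:+.3f}  y={y:+.3f}")
    time.sleep(0.05)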