So I noticed that the reviews for this game have been bombed - obviously a dedicated team is at work on GOG doing this - and I tested it myself: sure enough, my review got a downvote some time between last night, when I posted it, and this morning.

This is just petty. Very petty.

If possible, the votes on reviews should be reset.

If gamers can't even be trusted to provide HONEST feedback to help their fellow gamers, then perhaps the voting system itself should simply be abandoned.

We already have enough dishonesty with the big reviewers out there, without having it from gamers themselves.

The complaints I saw about this game at launch were ones I had seen endless times before, across the board and on various systems, so it's nothing new. Yet for some reason, this game seems to get a trashing as if it were No Man's Sky at launch.

I just don't get it. Maybe I'm missing something. However, as someone who has played games since the Atari 2600, and has been into computers and consoles ever since, I have basically seen it all. This game's launch problems and issues were nowhere near as bad as the hatred the game received made them out to be.

Perhaps my problem is that I have an attention span and can manage my emotions - or better yet, don't get emotional when a game doesn't work as I expected it to. Among all the older gamers I know, the common talking point was "Yeah, I'll just come back to this in a few months and finish off some of the other games on my list".

The TL;DR generation, though, wants it all NOW and wants it their way RIGHT NOW; otherwise the whole world should burn down because they couldn't get their way.
I thought it was only toddlers that stamped their feet and screamed like that, and that we're all supposed to grow out of it, but I can see this sadly hasn't happened for many.

/rant
Post edited January 25, 2023 by MuhEbola
Amen.

Good Old Gamer here too -- grew up on the Atari, C64, and NES.

Due in large part to the full mocap and voice acting, I've found CP77 to be the most presentationally engaging game since the Mass Effect series.

It's trendy to hate things today.
Just ignore them ;)
MuhEbola: ...Yet for some reason, this game seems to get a trashing job as if it's No Man's Land at launch.
Think you meant No Man's Sky here :P
xixas: Amen. Good Old Gamer here too -- grew up on the Atari, C64, and NES. ...
MuhEbola: ...Yet for some reason, this game seems to get a trashing job as if it's No Man's Land at launch.
xixas: Think you meant No Man's Sky here :P
I surely did. Thanks for the correction. Never played that one though.
To add to your initial points, CP77's also a new IP.
Speed bumps are to be expected with new titles.

Not to jump into the pool, but if anyone's going to give any developer hell at launch, pretty sure we should be saving that for Bloodlines 2 -- if it ever reaches launch, that is XD
@xixas and @MuhEbola
If I may bump in here for a moment, you both have RTX 3090 cards, is that correct?

If so, and assuming you're playing with raytracing ON (and probably DLSS disabled), I am curious how CP77 looks with RT enabled in dark areas. Specifically, in the abandoned building where you can save Takemura during the Search and Destroy mission.

I got a Gigabyte RTX 3080 Aorus Xtreme 10 GB (didn't see the point in the 3090 and 24 GB VRAM at the time of purchase for my use) and I have tried playing around with the raytracing settings in a basically pitch black environment.

And honestly - while I cannot say with certainty it wasn't my game settings or a limitation of the RTX 3080 - as somewhat of a flashlight enthusiast, I know how various lights reflect off surfaces and actually look to the human eye, and I was SO DISAPPOINTED by how raytracing works in dark areas of CP77. It doesn't look natural at all!

In this building there are multiple flood lights that can be turned on. Assuming those are typical work lights at around 10,000 lumens each: for one, the light spread of real work lights would be much greater, and secondly, in real life the light would bounce back off the walls and, to the human eye, illuminate almost the whole room.

As far as I can tell, this doesn't happen at all in the game - in CP77, with RT enabled, those work lights only give a thin beam with almost zero bounce back. Ironically, with the narrow beam the game portrays, the light concentrated on the wall should make the bounce even more visible.
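To put rough numbers on that intuition - every figure below is an assumption for illustration, not a measurement of the game or of any real light - here's a quick back-of-envelope sketch of direct versus one-bounce illuminance:

```python
# Back-of-envelope: direct vs. one-bounce illuminance from a work light.
# All values are assumptions for illustration only.
import math

flux_lm = 10_000          # assumed work light output, in lumens
beam_angle_deg = 120      # assumed wide flood spread (a narrow beam would be ~20)
wall_dist_m = 4.0         # assumed distance from light to the lit wall
wall_albedo = 0.5         # assumed fraction of light a matte wall reflects

# Solid angle of the beam cone, then average direct illuminance on the wall
solid_angle_sr = 2 * math.pi * (1 - math.cos(math.radians(beam_angle_deg / 2)))
spot_area_m2 = solid_angle_sr * wall_dist_m ** 2
direct_lux = flux_lm / spot_area_m2

# The wall re-emits roughly albedo * flux diffusely, spread around the room
room_surface_m2 = 150     # assumed total interior surface area of the room
bounce_lux = wall_albedo * flux_lm / room_surface_m2

print(f"direct on wall: {direct_lux:.0f} lux, one-bounce fill: {bounce_lux:.0f} lux")
# -> ~200 lux direct, ~33 lux of bounce fill on these numbers
```

Even a few dozen lux of bounce fill is a dimly lit room, nowhere near pitch black - which is exactly why the in-game result feels so wrong to me.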

With RT disabled, and using only the built-in rasterization, the game actually looks much more realistic as the light spreads more evenly around a larger area (and the lack of light bounce can be forgiven). Not to mention the game performing so much better with RT disabled!

So my question is: with an RTX 3090, RT enabled and DLSS disabled, do the lights in this specific area of the game look different? Do they look realistic, given how light actually behaves to the human eye?

Personally, from what I have seen - though I cannot speak for the 40xx series - I say raytracing still is a pointless gimmick.
Post edited January 26, 2023 by PaladinNO
An 86 on Metacritic is spot on if you ask me. Cyberpunk 2? 90+.
Post edited January 26, 2023 by CyberBobber
PaladinNO: @ xixas and @ MuhEbola
If I may bump in here for a moment, you both have RTX 3090 cards, is that correct?

If so, and assuming you're playing with raytracing ON (and probably DLSS disabled)...
While I am playing on a 3090 FTW3, I didn't even bother enabling RT (CP77's got notoriously terrible performance ray-traced) and I have DLSS set to performance. It's a great card, but I'm running triple-head at 7680x1440, and even with optimizations I can't get much above 60 fps w/ Ultra settings across 3 monitors (with external vsync).

As to the card's vram, the 24 GB onboard is useless to the average gamer -- you made the right choice going with a 3080. There's like a 5% performance gain for an extra $1000 -- not worth it.

But to a dev like myself, that extra vram is worth every penny. I've gotta be able to keep full scenes up in Blender and UE at the same time, and those uncompressed textures hit the card hard. Also handy when I'm testing security suites and attempting cracks against our password stores (entirely legal paid security auditing). So it's really just a matter of what you need out of the hardware.
xixas: While I am playing on a 3090 FTW3, I didn't even bother enabling RT (CP77's got notoriously terrible performance ray-traced) and I have DLSS set to performance. ...
Interesting. Sounds like the 3080 and the 3090 perform about the same in CP77 then. Depending on the settings, obviously. Still, I thought CP77 was better optimized than that by now.

And I also play at 7680x1440, with 3x 27" HP Omen Z4D33AA 165 Hz in Nvidia Surround, though I have G-sync disabled (some older games, like Crysis, did NOT like G-sync!).

My CP77 FPS at 7680x1440 is down to around 45 in heavily crowded areas, but up to ~90 when driving in the desert areas. I'm playing with both RT and DLSS disabled, on custom High settings with reduced shadows (motion blur and all that other trash also very much disabled).

If you're doing render work, the 3090 was a no-brainer at the time! That's actually something where the MSRP of the 3090 could be defended.
Meanwhile, for comparison, my "need" for more VRAM...is because I like to keep YouTube tabs open for later perusal. :/
Which, I've learned, really eats VRAM with Firefox hardware acceleration enabled.

My full specs:
Windows 10 Pro

Be Quiet 802
Be Quiet Dark Rock 4, custom white painted
Asus Maximus XII Hero Z490
i9 10900K, air cooled @ 5.00 GHz / 1.235 Volt (XTU)
Gigabyte Aorus Xtreme RTX 3080
Corsair Vengeance RGB Pro 3200 MHz 64 GB CL16
Adata XPG SX8200 Pro 512 GB NVMe (OS)
Samsung 970 Evo Plus 2 TB NVMe (Games)
Adata XPG SX8200 Pro 2 TB NVMe (OBS)
Samsung 860 Pro 512 GB SSD (LAN upload disk)
Seagate Exos 7E8 4 TB HDD (Data)
WD Black 2 TB HDD (.torrents)
Corsair HX1200
Asus XG-C100C 10 Gbit NIC
Asus Xonar Essence STX soundcard

Steelseries Sensei 310
Steelseries QcK Hard mouse pad
Cooler Master Storm Skorpion mouse bungee
Ducky One 2 Full White MX Brown with 40A dampening rings

Got 4 monitors hooked up to the GPU:
3x HP Omen Z4D33AA for games, work and multitasking.
Sony Bravia X90J 55" 120 Hz for entertainment (got 72 TB of Seagate Exos storage on my 10 Gbit LAN home server).

And a vertically mounted Samsung S32A700 (32", 4K) hooked up to the iGPU.
Only does 30 Hz at 4K (because the Intel HD 630 tops out at HDMI 1.4...), but I'm only using it for my email.

And my phone is a Nokia 3720 (no joke) because I funnel most of my paychecks into my PC. ^^
Post edited January 28, 2023 by PaladinNO
PaladinNO: @xixas and @MuhEbola
If I may bump in here for a moment, you both have RTX 3090 cards, is that correct? ...
So my question is, with an RTX 3090, RT enabled and DLSS disabled, do the lights in this specific area of the game look different? ...
Personally, from what I have seen - though I cannot speak for the 40xx series - I say raytracing still is a pointless gimmick.
Well, my system has a 5800X3D + ASUS TUF 3090 OC + 32 GB (2x16) 3600 MHz CL14,
and I prefer to just play my games at 3840x2160 at Ultra with the sliders maxed out, which gives me at least 60 fps in all my games, with lovely smooth frametimes thanks to the big L3 cache.

But ray tracing is still a gimmick, and I haven't bothered with it much at all, except to see the differences now and again in some games.

When everyone was scrambling to grab the 2xxx-series RTX cards, I grabbed two Titans, as I always have, and even with both, ray tracing was just a gimmick. That's TWO Titan RTX cards, mind you.

With the 3090, I can honestly say I really don't notice any improvement, except perhaps in FPS.

As I said from the start, if you have to switch on DLSS just to have a playable framerate in order to have RTX on, then you're conning yourself.
As to DLSS, in my opinion, anyone buying modern cards should NOT consider it when trying to make their minds up about which tier card to get. DLSS is nothing more than a life saver for those still using potatoes for computers.
Anyone on high or enthusiast cards using it is conning themselves.

Even if the 4090 can do ray tracing maxed out at 4K (well, not 4K but rather UHD 3840x2160), Ultra quality settings, with all the other fancy sliders maxed out, at a solid 60 fps minimum, it's still a gimmick. Ask yourself: after paying well over 1200 for such a GPU, then paying extra on your electricity bill to run it, is it all really worth it in the end? It's just reflections and shadows, right?
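And just to put a rough number on that electricity point (every figure here is an assumption, and the currency is whatever yours happens to be):

```python
# Ballpark yearly running cost of a big GPU; all values are assumptions.
gpu_watts = 450          # assumed full-load draw of a 4090-class card
hours_per_day = 3        # assumed daily gaming time
price_per_kwh = 0.30     # assumed local electricity price

yearly_cost = gpu_watts / 1000 * hours_per_day * 365 * price_per_kwh
print(f"~{yearly_cost:.0f} per year at full load")   # ~148 on these assumptions
```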

Same as HairWorks, and all the other little Nvidia carrots at the end of their stick.

The day ray tracing costs nothing, you can easily run your games at UHD with everything maxed out quality-wise, and it doesn't require 1.21 gigawatts to do so - then, and only then, will it become less of a gimmick. But only slightly less.


The big one for me though, as you'll already know by now, is that very few games are 'strolling simulators'. That, for me, is why ray tracing is a pure gimmick.

No one is really going to stop just to 'ooh' and 'ahh' at reflections and shadows. We got that out of our systems playing Resident Evil on the PlayStation 1 back in the day, when you moved your character in front of a mirror in one of the rooms and had real-time reflections in the scene. (Well, real-time for what the PlayStation 1 was capable of, anyway, as there was always some delay built in - unlike the later Move wand, developed alongside the EyeToy, which as I recall was legitimately real-time.)

The novelty of advancements in these sorts of effects, as games improved over the years, also wore off pretty much as soon as you walked away from whatever the effect was.

Ray tracing is fool's gold. Don't buy GPUs for it - but also don't buy anything less than the mid-tier GPUs regardless, as anything lower than that (and even some mid-tier GPUs) is going down the route of defective silicon. I know it can upset people when I inform them that their close-to-1000 GPU might have been called a 3070 Ti/4070 Ti because the silicon either wasn't up to par or had defects. It's why I always bought Titans, enthusiast cards, etc.
MuhEbola: Well, my system has a 5800X3D + ASUS TUF 3090 OC + 32 GB (2x16) 3600 MHz CL14 ...
As to DLSS, in my opinion, anyone buying modern cards should NOT consider it when trying to make their minds up about which tier card to get. DLSS is nothing more than a life saver for those still using potatoes for computers. Anyone on high or enthusiast cards using it is conning themselves.
I don't have any personal experience with AMD since the Athlon XP days, but from what I have seen on the charts, you've got a damn kick-ass gaming PC!

And it sounds like you are very much of the same mindset as me.

I must say I find it rather ironic - Nvidia pushed hardware raytracing to give gamers "better" lighting effects in games. Then, since RT all but kills overall performance, they invented DLSS as well, rendering the game at a lower internal resolution and upscaling it, making everything else look a bit worse to compensate.
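For anyone unfamiliar with how that trade works: DLSS renders each frame at a fraction of the output resolution, then upscales it. As a sketch, using the per-axis scale ratios commonly reported for the DLSS 2.x quality modes:

```python
# DLSS renders internally at a fraction of the output resolution, then upscales.
# The scale factors below are the per-axis ratios commonly reported for DLSS 2.x.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given DLSS mode and output size."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))       # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))   # (1920, 1080)
```

So "4K with DLSS Performance" is really a 1080p render being stretched back up - hence the muddier textures.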

Having played around with both RT and DLSS (in CP77 specifically), I just turn both off and play with regular rasterization effects (which have become rather damn good over time with regard to lighting) and game object quality as close as possible to what the developer intended.

Then I play around with the other game settings to get the performance I deem acceptable, while having it look as good as possible.

So what Nvidia basically is saying is "we don't have graphics cards that are good enough yet", since they cannot handle a game like Cyberpunk 2077 maxed out, with raytracing, without a massive hit to the performance. The RTX 4090 is arguably close, but I for one will not sell my car to buy a GPU just to get my gaming fix.

Though I must admit the Gigabyte RTX 4090 Aorus Master has me drooling a bit. Shame I would have to give up on my vertically mounted Asus Essence STX sound card to fit it in my case.

Personally, I am very picky about the games I play. And I have little to no desire for the latest and greatest titles (as far as I am concerned, games peaked around 2010, before this "always online" DRM nonsense got everywhere), so I typically buy a new GPU every two generations to keep playing "everything" at acceptable framerates, and I usually buy once the next generation has already been established, for the best pricing.

I've never cared about the graphics in a game - I value gameplay and story and lack of bugs way higher than how a game looks. Which is why I still greatly enjoy games like Half-Life 2 and Portal 2.
Post edited January 27, 2023 by PaladinNO
MuhEbola: ...As I said from the start, if you have to switch on DLSS just to have a playable framerate in order to have RTX on, then you're conning yourself. ...
PaladinNO: So what Nvidia basically is saying is "we don't have graphics cards that are good enough yet", since they cannot handle a game like Cyberpunk 2077 maxed out, with raytracing, without a massive hit to the performance. The RTX 4090 is arguably close, but I for one will not sell my car to buy a GPU just to get my gaming fix.

I've never cared about the graphics in a game - I value gameplay and story and lack of bugs way higher than how a game looks. Which is why I still greatly enjoy games like Half-Life 2 and Portal 2.
I never cared about the graphics in a game either, coming from sitting as a child in front of the TV playing my uncle's Atari 2600. I think Centipede was my first game, or at least the first I remember, and Breakout too.

Then one day, I did care about graphics. I might be called a graphics snob, but bear with me here....
The reason I love putting the resolution and quality settings as high as possible and reasonable is purely because all of that helps with immersion. When you have the immersion nailed, the experience you're having in terms of "gameplay" is at its absolute finest. So for me, I don't see graphics and gameplay as different entities, but rather as one and the same, as they complement and better each other. Like a man and a woman.

I remember getting Battlefield 4 with my 7850K (remember the PS4 console killer build challenge? LOL).
At 1920x1080 I think I was able to play that at around 25-30 fps on high settings. I loved it.
Multiplayer was an obvious no-go at anything more than 1280x720 at the lowest possible settings, even with the highest overclock on the iGPU you could manage, but again, I enjoyed it, despite getting totally destroyed.
Once I saw what I needed to, I was straight back on my big AM3 system. Not sure whether I had the 9590 or the 8370 installed at the time, but I could at least compete, and get to the top of the board, in Battlefield with it! LOL

Now ask yourself, have you ever dusted off an old game, just to play it again with graphics overhaul mods?
The answer is probably a yes! And for the same reasons I listed above.

Well, the 4xxx series, like the 3xxx series, is "good enough" already if you're willing to play at 1080p or 1440p.
You don't even have to use DLSS at those resolutions. However, if we're honest with ourselves, sitting on our comfy sofas 6 to 7 ft away from our 60+ inch TV screens with a controller in our hands, we absolutely want our games at 4K (well, 3.8K LOL).
At least I know I do.

Back in the XP days, when those "HD READY" TVs hit the market (720p LOL), I got one with a VGA input on the back - I think it was something like 38" - and I have been sitting back all comfy and cosy on my big soft plush recliner, gaming that way, ever since.

I totally hated sitting at the desk with a monitor, hunched over like an old man. I'm never going back to that sort of setup.

This, for me, was the era when my work machine also started doubling as my game machine. The PlayStation wasn't being left out, though.
That's where I got the idea: the controller with the long cable gave me the ability to sit back in my chair, comfy, and game my weekends away when not working. Why couldn't I also do that with the PC, I thought. Never looked back since.
Fun, spec shares.
Every "back in my day" gaming progression conversation ends up here eventually :)

I went the AMD route for this build as well -- the performance per dollar spent is just too good to pass up.
Nearly 20% lower TDP than comparable Intel chips, and at the time I was able to pick it up for nearly half the price of Intel's headliners.

--
PC
• AMD Ryzen 9 5950X
• EVGA GeForce RTX 3090 FTW3 Ultra Gaming, 1800 MHz, 24 GB GDDR6X
• 128 GB (4x32) Corsair Vengeance DDR4 3200 MHz, CL16
• ROG Crosshair VIII Dark Hero motherboard
• 2x 2 TB Samsung 980 Pro PCIe 4.0 Gen 4 NVMe M.2
• 1000 W Seasonic PRIME Ultra Titanium PSU
• 3x Noctua NF-A14 PWM 140mm front intake
• EK-AIO Elite 360 liquid cooler (6x 120mm fans, top exhaust)
• 1x Fractal Silent Series 140mm rear exhaust
--
Display
• 3x LG Ultragear 32" QHD (2560x1440) 165 Hz
--
Input -- just because gamers oft consider my setup blasphemous, lol
• Elecom HUGE Trackball
• Microsoft Trackball Explorer 1.0 (when I'm feelin' the old school ergo itch)
• Wacom Intuos Pro M pen/tablet
• DualShock 4 controller
• XBox 360 controller

... no "normal" mice (I like my wrists, thanks)
I do keep one on the edge of the desk, though, to be kind to others
--

Liquid cooling's fully dedicated to the CPU, both because my compile jobs regularly push it up to ~98°C even with all that cooling and because I don't want the hassle of dealing with the water block when I swap video cards.

Edit / Addition: While I know everyone raves about the 5800X3D for gaming, when it comes to -building- games, the extra 16 compile threads make a world of difference in the 5950X's favor

Daily driver OS is Manjaro, but I also keep Pop!_OS and Linux Mint installed for testing.

There's a separate laptop (ROG Zephyrus, RTX 3070) on Windows half the time, for when I absolutely must test on that horrid operating system, heh, and then an M1 MacBook Pro and a couple of Mac Minis for macOS/iOS builds and testing.
PaladinNO: If you're doing render work, the 3090 was a no-brainer at the time! That's actually something where the MSRP of the 3090 could be defended.
Indeed :)
Mainly, though, while I very much do use all that onboard RAM, it's about the over 10K CUDA cores --
It outperforms a Quadro A5000 for my render needs for $2000 less.

...really wanting a 4090 now, though, to play with some of those new AI features!
MuhEbola: But, ray tracing still is a gimmick, and i haven't bothered with it much at all, except just to see the differences now and again in some games.
Can't agree with you on that one. Ray tracing's fantastic.
I think most devs just haven't gotten a handle on how to best use it in their games yet.

But the newer Cycles OptiX renderer in Blender with CUDA+RT enabled is amazing -- it improved my real-time lighting workflow times literally tenfold.
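If anyone wants to try it, pointing Cycles at the OptiX backend is only a few lines of Blender's Python API. This sketch matches the API shape in Blender 3.x; exact property names can shift between versions:

```python
# Sketch: enable the OptiX backend for Cycles via Blender's Python API.
# Written against the Blender 3.x API; names may differ in other versions.
import bpy

cprefs = bpy.context.preferences.addons["cycles"].preferences
cprefs.compute_device_type = "OPTIX"   # RT-core path; CUDA works on non-RTX cards
cprefs.get_devices()                   # refresh the detected device list

for device in cprefs.devices:
    device.use = (device.type == "OPTIX")   # enable only the OptiX devices

bpy.context.scene.cycles.device = "GPU"
```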

But RT purely for games?
I'll agree, that's another one on the "not yet worth the money" pile --
though it's not like you have much choice with Nvidia going full RTX.
MuhEbola: When everyone was scrambling to grab the 2xxx-series RTX cards, i grabbed two Titans as i always have, and even with both, ray tracing was just a gimmick. That's TWO Titan RTX cards mind you.
...
DLSS is nothing more than a life saver for those still using potatoes for computers.
I believe you commented in another thread that, unfortunately, SLI is super dead at this point -- and since it never worked right on Linux to begin with, I just ignore it. Though I have set up rigs in the past with multiple non-crossed cards to batch render jobs and/or allow single-system LAN parties, splitting the X sessions and locking multiple keyboards/controllers to their associated sessions/screens.

Remember LAN parties?
Remember when people actually wanted to game in the same room together?
I'm never going to understand the always-online gaming trend.
If it isn't couch co-op, I'm not playing with you.

I'm old enough that co-workers cringe when I say things like "I prefer phone calls to text messages"
Meh, is what it is, lol

DLSS is an interesting one. I think it's a good little bit of tech to have on hand as your card ages to keep playing the newest titles.

CP77's actually the first game I've ever -had to- enable it on to get the experience I want, though.
Sure, that upscaling technically muddies the textures a bit, but with the vivid, mostly flat color palette and the game's quick pace, I barely even notice, and I'm happy to have the extra ~20 fps on average.

Of course, I'm limiting it a bit and maintaining an average card temp just under 70°C.
I've got real work for the card, I'm not trying to toast it (or heat the office) in my free time.
MuhEbola: The big one for me though, and as you'll already know by now as well, very few games are 'strolling simulators'. That for me is why ray tracing is pure gimmick.
This is a good point.

I've never really been into first person shooters, and that includes CP77. I'll throw the occasional knife, but the rest is all hacking. All those shiny guns go straight to the wider retail market :P

But, while walking simulators make up a very small portion of the gaming market, I really enjoy them, and with companies like Cyan getting back in the game, there might be some very good RT uses over the next few years in that space.

Real-time "god rays" in rural RPGs and fog-dense space sims also add a really nice touch for immersion's sake, and that goes double for VR environments.

But then there are also the "mathy bits" -- RT tech isn't limited to visible light.
It can be used for all sorts of wide-spectrum ray casts -- gauge object distance for millions of nodes, multi-point navigation, complex collision prediction, etc.
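As a toy illustration of that kind of non-visual query - this is just the closed-form ray/sphere intersection test, the same math the RT cores accelerate in bulk, with every value made up for demonstration:

```python
# Toy non-visual ray query: distance from an origin to a sphere along a
# direction -- the same intersection math RT hardware accelerates in bulk.
import math

def ray_sphere_distance(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c              # quadratic 'a' is 1 for a unit direction
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None      # ignore hits behind the origin

# Gauge distance to an "object node" 10 units down the +x axis
print(ray_sphere_distance((0, 0, 0), (1, 0, 0), (10, 0, 0), 1.0))   # 9.0
```

Run thousands of those per frame and you've got cheap distance fields, visibility checks, or collision lookahead - no pixels involved.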

Developers have really just started scratching the surface of dedicated hardware ray-tracing's potential now that it's become widely available. Give it 5-10 years for everyone to catch up :)

PaladinNO: I've never cared about the graphics in a game - I value gameplay and story and lack of bugs way higher than how a game looks. Which is why I still greatly enjoy games like Half-Life 2 and Portal 2.
Very much this.
While I enjoy a talented artist, if the mechanics aren't there, the game's going in the trash.

I'm still playing HoMM3. Don't care if it's going on 25 years old, it's still one of the best in the genre.
Age of Wonders III is great too, but not just because "ooo 3D".

Regarding said quality, given the forum we're in and the OP topic, for as engaging as I've found Cyberpunk so far, whether it was stated or not, it's an obvious Sleeping Dogs clone that tried to address the common player complaint "why so few guns?" -- and in that light, I gotta say, for all the things they got right, some things are significantly lacking.

Notably, given the legacy, I'd have appreciated a little more (any?) effort put into hand-to-hand combat, and the driving is holy-omg-hell terrible.

Also, I don't know what's up with controller support, but it just feels off. I play most first-person titles with either the DS4 or Xbox controller, but it just doesn't play well in CP77. It feels really "jerky" -- Not sure if it's a smoothing issue or maybe some calibration problem going through the SDL/Wine layer -- but if it's config, it's not affecting any other titles, and KB+M plays fine. Is it just me, or does anyone else notice controller play feeling wonky?
Post edited January 28, 2023 by xixas
I think it should show which version of the game they reviewed, and the current version should be listed on the store page. That would more accurately reflect what the reviewer actually experienced, and whether that might differ in the current version.
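Something as simple as stamping each review with the version it was written against would cover it. A rough sketch of the idea - all the names here are made up for illustration:

```python
# Sketch of version-stamped reviews; all names are made up for illustration.
from dataclasses import dataclass

@dataclass
class Review:
    author: str
    rating: int                  # e.g. 1-5 stars
    text: str
    game_version_at_review: str  # e.g. "1.06"

def is_outdated(review: Review, current_version: str) -> bool:
    """Compare dotted version strings numerically, part by part."""
    parse = lambda v: [int(p) for p in v.split(".")]
    return parse(review.game_version_at_review) < parse(current_version)

r = Review("SomeUser", 2, "Crashes constantly.", "1.06")
print(is_outdated(r, "1.61"))    # True: the review predates the current build
```

The store page could then show a "written on version X" badge next to each review, with the current version listed up top.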
MuhEbola: I thought it was only toddlers that stamped their feet and screamed like that, and we're all supposed to grow out of that, but i can see this sadly hasn't happened for many.
You should see the complaints for Baldur's Gate 3 :)
MuhEbola: Now ask yourself, have you ever dusted off an old game, just to play it again with graphics overhaul mods? ...
Sounds like I am a bit younger than you, but yes. While not many games, the one that stands out to me would be eDuke -
Duke Nukem 3D with improved graphics and features.

My first console was a NES, with - of course - Super Mario. And I think I went straight to SMB3, because I have no recollection of the first one. With English as my second language, the OG Zelda sadly didn't make much sense to little, young me back then.

Playing games at a young age, with computers ever getting better while I was growing up (I was personally very late to the "personal computer at home" market), is, I think, why I don't really care about graphics today.

I have so many memories of truly fun gaming moments from back when games just didn't look any better, and that attitude of "games don't have to look nice to be fun" has stayed with me all these years. I can still revisit Breath of Fire 2 (a SNES game) on an emulator and giggle like a child, because I remember how much fun I had with that game some 25 years ago.
xixas: Fun, spec shares.
Every "back in my day" gaming progression conversation ends up here eventually :)
It's the classic and, if you don't mind me saying, obligatory e-peen discussion, especially once it became clear that it would be genuinely high-end systems being compared. :)

Though with the "RTX 3090" card thrown down, I knew I couldn't really compete. ^^

xixas: Can't agree with you on that one. Ray tracing's fantastic.
I think most devs just haven't gotten a handle on how to best use it in their games yet.

But RT purely for games?
I'll agree, that's another one on the "not yet worth the money" pile --
though it's not like you have much choice with Nvidia going full RTX.

Developers have really just started scratching the surface of dedicated hardware ray-tracing's potential now that it's become widely available. Give it 5-10 years for everyone to catch up :)
Agreed. While I cannot speak for RT in a development scenario (I am not a developer or programmer), I can easily see RT eventually becoming the full-fledged replacement for rasterized lighting in games.

And the RTX 4090 is already getting very close to making the up-until-now RT gimmick in games not a gimmick anymore.
At the pace Nvidia is pushing the tech, in a few more years I think we will be there. Especially, as you say, developers haven't fully implemented and optimized the feature yet.

xixas: Remember LAN parties? Remember when people actually wanted to game in the same room together? ...
I'm still playing HoMM3. Don't care if it's going on 25 years old, it's still one of the best in the genre.
I remember LAN-parties. Good times. <3

I also remember a specific scenario where two people playing World of Warcraft together, sitting at spots 5 and 7 with a random guy in between, used the "whisper" feature to talk privately to each other instead of moving their asses out of their chairs and taking the two steps needed to communicate in person.

I have never been one for online games - despite my nickname which, no, is NOT from WoW, but from Age of Empires 2 (with "NO" being the country tag) - though I really, REALLY enjoy a good co-op game, or LAN gaming.

I play League of Legends as the only exception. The rest are solely PvE single-player games. Like Cyberpunk 2077, which I've spent some 600 hours on at this point, across my 4 playthroughs, and with so many mods now that I fear I have lost track of them all.
Post edited January 28, 2023 by PaladinNO