There is a video on YouTube by Digital Foundry about RDR2, where they analyze it on a technical level. They mention that they are testing the Xbox One X, as it was the strongest console at the time, and that that version runs at native 4K.

Except in another video, they mention that many of the game's settings on consoles of that generation are locked beneath PC's low settings - meaning that even if you have a potato PC, it's not possible for you to replicate those.

This is a prime example of console gamers being gullible. They hear that a game is running at native 4K, and they jump on that fact, thinking it's some marvel... but don't even stop to think that other parts of the image get downgraded to achieve the "4K resolution".

Another example would be Cyberpunk 2077. The console gamers got outraged that the game wasn't hitting high framerates on the so-called "8th gen". What? Like, I am sorry if you have any nostalgia towards your PS4, but by modern standards, even the Pro version (which in turn is much superior to the base PS4) is kind of trash in terms of hardware. According to Wikipedia, it has 8GB of memory, which sounds pretty decent... until you realize it's GDDR5, not GDDR6. Like, how could anyone 100% honestly expect the game to run at more than 1188p 25FPS? It's almost as if technology HAS moved on from 2013 (when the consoles were not even that high-end, anyway).

A very hilarious thing is that GTA V on PS3/X360 ran at a comparable framerate to Cyberpunk on PS4/XBONE. However, back then, console gamers were used to slideshow gaming (look it up, the PS2 version of MGS3 would sometimes dip to 14-16 FPS), but in the decade since then, devs have gotten better at manipulating the gullible console gamers into believing they are playing a "high frame rate game running 100% optimally" using dirty tricks like the one I talked about with RDR2. So now that Cyberpunk has returned to the slideshow, console gamers are like, "wtf? My 8-year-old hardware, which was pretty mid-range even at the time of its release, isn't hitting 60 FPS??? WHAT???"

Like, I am not that tech savvy myself, but how in *censored* could anyone honestly believe Cyberpunk would run at high frame rates with *censored* GDDR5 memory?

It's time we accepted that console gamers are gullible, and that's why consoles keep selling well. People are underinformed about how slimy and underhanded console marketing is.


Edit: I apologize for any typos or other such mistakes, but I typed this on my phone.
Post edited September 20, 2021 by Jon_Irenicus_PL
I suppose with consoles it's more 'it's the current year' plus the price point, and that's all you have to worry about. Beyond that, choosing games.

Back in 2000 or so I went with consoles, not because I didn't understand the tech, but because I could see that any attempt to keep up would be WAAAAAY too expensive. Though in 2002 I'd also gotten one computer gaming machine to play Morrowind and Diablo 2 (that's all that would fit), and that worked, at least until I went into the military. After my time in the military, when the 360/PS3 were out, I continued to primarily do console gaming, at least with big games, while doing smaller, easier games on my computer, like ADOM and emulation of older systems. It wasn't until about 2014 that I switched back to computer gaming, mainly because the hardware tech had advanced enough that it no longer took a $300 upgrade every year or two to do anything meaningful.

Hmmmm... Maybe we should back up: you have programmers and non-programmers. Programmers are fewer than 3% of the population. Tech savvy but not programmers are probably 10% or fewer. Everyone else is, I hate to say, not really capable. As such, the non-tech people see things like a three-prong plug: match the plug to the hole, put it in, and it works. And that's as far as it goes. (Though some will try to buy PS4 games to put in their XBone and wonder why it doesn't work even though it's clearly labeled, or some will buy an N64 cartridge and cut it down to size to fit in the floppy drive, then wonder why it won't run on the laptop.)
Post edited September 20, 2021 by rtcvb32
The irony is that back when the first 3D-capable consoles came out, they were competitive with PCs. The same goes for the PS3.
Even knowing about different types of RAM makes you more 'tech-savvy' than (at a conservative estimate) 95% of gamers, regardless of PC or console.

Personally, I do most of my gaming on consoles these days. I'm not gullible - I work in IT and do enough techy stuff for my day job. I can't be arsed with dealing with a gaming PC at home.
We all have different priorities. Some value ease of use above graphics, hardware, and specs.
I've certainly been on both sides of the console/PC argument at different points in my life. I'm currently a PC guy, but I totally understand why people choose the ease of consoles. That said, I think there is an even greater ease in just buying the messaging of console makers and not investigating how the graphical output really stacks up against fully stocked PCs. Not everyone has the capability, or even the desire, to do the work necessary to get the most out of their electronics.
This is outdated, but years ago console gaming was also more convenient for sharing among friends, tech savvy or not. If you're going to a sleepover, it's much easier to grab that copy of GoldenEye and toss it in your bag. You could certainly bring a PC game, but even if your friend's specs were compatible, you'd still have to go through the install process. You most likely wouldn't have your saved data either.

As I said, it's not really relevant today, but in the past that was another part of the console vs. computer debate. I just love games. They come however they come, and I try not to destroy my wallet.
Post edited September 22, 2021 by Mplath1