
Cavalary: And here I thought you were linking to the Powerwolf one :))
You get to see stuff live in concert and then forget they even existed. Must be getting old...

Cavalary: (Which thought led me to find out they had a new version of it and a video... but couldn't see it because of something else that was just mentioned around here. Huh.
Why? Is it any "worse" than their "Demons Are A Girl's Best Friend" video? :P

Edit: Thanks to reaction videos, I can confirm there is some light nudity in it (breasts). Not credit-card check worthy, IMHO, but then again I'm not a YouTube censor. Could be the record label just playing it safe.
Post edited June 08, 2021 by WinterSnowfall
Asking for magic to happen, are we? Magic doesn't exist, so you can't make every game run acceptably on, say, Intel iGPUs.
Post edited June 08, 2021 by ChrisGamer300
As someone who struggled for years with inadequate PCs for gaming, I understand where OP is coming from. It's often surprising that one game runs decently on a low-end system while another doesn't.

What OP is ignoring is that there are mods that help games run on inadequate systems. LowSpecGamer did a video specifically about The Witcher 2 that could help https://youtu.be/T67H6WuMtnQ

And there are tools that lock the framerate, like RivaTuner. As absurd as it may sound to someone running games at 120 Hz or higher, locking a game to 30 or even 20 fps can make it feel better to play.
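For illustration, here's a minimal sketch of the pacing logic behind a frame limiter, assuming a hypothetical per-frame function `update_and_render()`; real tools like RivaTuner hook the graphics API instead, but the principle is similar:

```python
import time

TARGET_FPS = 30               # cap chosen for steadier pacing on weak hardware
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(update_and_render):
    """Call update_and_render() at most TARGET_FPS times per second."""
    deadline = time.perf_counter()
    while True:
        update_and_render()   # hypothetical per-frame game work
        deadline += FRAME_TIME
        # Sleep off the rest of this frame's time budget; a consistent
        # 30 fps often feels smoother than a fluctuating 40-50 fps.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()  # running behind; reset, don't spiral
```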
For me it would be enough if the minimum or "potato" settings in games were actually useful, instead of as bad-looking as they usually are; the fact is that nobody uses them for exactly that reason. Developers can keep the high settings for the computers those are intended for.

I have the feeling that the higher settings tend to be the "director's cut": how developers really want the game to be played, aimed at the hardware of the near future. Of course, they should be clearly labeled as such.
Post edited June 08, 2021 by Gudadantza
samuraigaiden: And there are tools that lock the framerate, like RivaTuner. As absurd as it may sound to someone running games at 120 Hz or higher, locking a game to 30 or even 20 fps can make it feel better to play.
Heh, I'm quite used to getting around 20 fps, maybe even less. I've also always had poor PCs for gaming, emphasizing CPU and RAM in the budget while the video card got the leftovers (and in the case of the current one, nothing at all, since I stayed with integrated graphics). I typically play with the graphics settings, especially if more complex options are available, until I get there or thereabouts most of the time, accept potentially significant stuttering when a lot is going on, and consider that fine.
I think I could go one step further:
* Still keep the idea that games should be playable, with some settings, on integrated graphics.
* But further, if feasible, make sure the game is playable with software rendering (llvmpipe or similar); this may not be possible with all games, but for some games (particularly those with retro graphics), that should be the goal (see the sketch after this list).
* The game should be tested in these configurations.
* (Extra credit: Get the game to run on a Raspberry Pi.)
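On Linux with Mesa, forcing software rendering is just an environment variable, so that test is easy to script. A minimal sketch, where `./mygame` is a hypothetical game binary:

```python
import os
import subprocess

# LIBGL_ALWAYS_SOFTWARE=1 asks Mesa to use its software rasterizer
# (llvmpipe on most current setups) instead of the GPU driver.
env = dict(os.environ, LIBGL_ALWAYS_SOFTWARE="1")

# "./mygame" is a placeholder for the game binary under test.
result = subprocess.run(["./mygame"], env=env)
print("software-rendering run exited with code", result.returncode)
```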

Edit: Also, as I have said at some point, the graphics settings should come with help text that indicates what options are best for performance, to make it easier for those on low-end systems to find a playable combination of settings.
Post edited June 08, 2021 by dtgreene
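One lightweight way to implement that help-text suggestion would be to attach a performance hint and a low-end recommendation to each option and show them in the settings UI. A sketch with hypothetical option names:

```python
from dataclasses import dataclass

@dataclass
class GraphicsOption:
    name: str
    values: list[str]
    perf_hint: str     # help text shown next to the option
    low_end_pick: str  # suggested value for iGPUs / weak cards

# Hypothetical options; the point is the help text, not the names.
OPTIONS = [
    GraphicsOption("Shadow Quality", ["Off", "Low", "High"],
                   "Large GPU cost; 'Off' gives the biggest fps gain.", "Off"),
    GraphicsOption("Texture Quality", ["Low", "Medium", "High"],
                   "Mostly limited by VRAM; small fps impact.", "Medium"),
]

for opt in OPTIONS:
    print(f"{opt.name}: {opt.perf_hint} (low-end suggestion: {opt.low_end_pick})")
```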
dtgreene: * But further, if feasible, make sure the game is playable with software rendering (llvmpipe or similar); this may not be possible with all games, but for some games (particularly those with retro graphics), that should be the goal.
From an academic perspective, it's an interesting idea. From a practical perspective, though, when was the last time a graphics card (or integrated graphics solution) was produced that didn't support hardware acceleration? Even the Pi has an OpenGL-compatible VideoCore, and most other similar products use a VideoCore or PowerVR accelerator (i.e. pretty much every commercial product does hardware acceleration).
dtgreene: * But further, if feasible, make sure the game is playable with software rendering (llvmpipe or similar); this may not be possible with all games, but for some games (particularly those with retro graphics), that should be the goal.
pds41: From an academic perspective, it's an interesting idea. From a practical perspective, though, when was the last time a graphics card (or integrated graphics solution) was produced that didn't support hardware acceleration? Even the Pi has an OpenGL-compatible VideoCore, and most other similar products use a VideoCore or PowerVR accelerator (i.e. pretty much every commercial product does hardware acceleration).
The thing is, the Raspberry Pi only supports OpenGL ES at any decent version (up to 3.1 on the Pi 4); full OpenGL appears to be supported only up to 2.1, whereas llvmpipe lets OpenGL 3.3 apps run (albeit slowly). So there's a compatibility issue here.

(Note that this is on the current Raspberry Pi OS. There's apparently a Vulkan 1.0 driver for the Pi 4 (could one run Zink on it?), but it's not yet in the "default" OS, though there are apparently plans to add it later this year.)

(Developing on a Raspberry Pi might be a way to ensure that your work is OpenGL ES compatible.)
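For what it's worth, the version gap is easy to check on any given box. A small sketch that asks glxinfo (from mesa-utils) which OpenGL and OpenGL ES versions the current driver exposes; to probe the llvmpipe or Zink paths, one can set LIBGL_ALWAYS_SOFTWARE=1 or MESA_LOADER_DRIVER_OVERRIDE=zink in the environment first (assuming a Mesa build that ships those drivers):

```python
import subprocess

# glxinfo ships with mesa-utils; -B prints a brief summary without extensions.
out = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True).stdout

for line in out.splitlines():
    line = line.strip()
    # Report the renderer plus the core and ES version strings the driver exposes.
    if line.startswith(("OpenGL renderer string",
                        "OpenGL core profile version string",
                        "OpenGL ES profile version string")):
        print(line)
```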
Good sentiments, though without market research measuring consumer demand through willingness-to-pay (WTP), these are moot points. My suspicion is that people who don't own a $300+ GPU aren't the type to buy a $60 game at launch even if their iGPUs could play it, since there are tons of decent free options out there that are more fun or better optimized. But alas, I can't prove this either without conducting the research myself.

If no research can be done, you'd have to look at case examples of games with low iGPU requirements that still sold extremely well: specifically, a USD 60 game that runs easily on iGPUs, comparing its sales broken down by hardware demographics. The next best case example is Minecraft (USD 27), but it's also not graphically demanding, and I'm not sure a breakdown of its hardware demographics exists.

EDIT: I guess AAA visual novels should also be in this category, but that genre is too niche to extrapolate to the rest of the gaming population.
Post edited June 09, 2021 by Canuck_Cat
Canuck_Cat: Good sentiments, though without market research measuring consumer demand through willingness-to-pay (WTP), these are moot points. My suspicion is that people who don't own a $300+ GPU aren't the type to buy a $60 game at launch even if their iGPUs could play it, since there are tons of decent free options out there that are more fun or better optimized. But alas, I can't prove this either without conducting the research myself.

If no research can be done, you'd have to look at case examples of games with low iGPU requirements that still sold extremely well: specifically, a USD 60 game that runs easily on iGPUs, comparing its sales broken down by hardware demographics. The next best case example is Minecraft (USD 27), but it's also not graphically demanding, and I'm not sure a breakdown of its hardware demographics exists.

EDIT: I guess AAA visual novels should also be in this category, but that genre is too niche to extrapolate to the rest of the gaming population.
If you want to port your game to mobile platforms, which could be a good source of money, it needs to have relatively low system requirements, or it won't work.

Similarly, the Nintendo Switch has relatively weak hardware; the devs of Cathedral, and of many other games, apparently had to optimize their games to get them running acceptably on the Switch, and some still don't run well (I've heard Bloodstained doesn't run well on that platform).
Canuck_Cat: Good sentiments, though without market research measuring consumer demand through willingness-to-pay (WTP), these are moot points. My suspicion is that people who don't own a $300+ GPU aren't the type to buy a $60 game at launch even if their iGPUs could play it, since there are tons of decent free options out there that are more fun or better optimized. But alas, I can't prove this either without conducting the research myself.
No need for research. The number of systems out there with integrated graphics is vastly higher than the number with discrete graphics.

And yes, we might not purchase a USD 60 game at launch, but still. Look at games such as Control or The Witcher 3 and the prices GOG is asking for them right now (summer sale).
As technology progresses, so will the APUs.

I assume APU speeds will improve even faster over time. AMD is about to release Cezanne-type APUs for desktop PCs, and reading up on the specifics, I feel they're already capable of handling most games. AMD is in a good position to make APUs their hallmark processors: they're really cheap to produce, plug into any compatible motherboard, and are really powerful as well. The only thing they need is tons of RAM.

I would bet that in two years' time, you will be able to purchase onboard GPUs powerful enough to play RDR2 at 1080p and 120 fps.
I'd been meaning to resurrect this thread, but thankfully some people already did. Great discussion overall. If anything, my point is even more relevant now.

People need to understand that for many people a simple notebook PC with integrated graphics is the only general-purpose computer they will ever have, be it for work, play, leisure, surfing the Net, etc. Spending hundreds or thousands of USD just to get a PC with a discrete GPU (which, unlike their laptops, might not even be portable) is completely out of the question.

And yet there are now good games at affordable prices (see, for example, the GOG sale going on right now). Joe User can certainly fork over USD 10 for a great game on sale, but his system is completely out of spec.

Now this is even more relevant since, in addition to all these reasons, discrete GPUs are becoming ridiculously expensive due to the crypto-mining craze, which in turn pushes even more people toward PCs with onboard graphics.

There's no way around this: Game studios are shooting themselves in the foot by excluding the millions of PCs with integrated graphics. Please make games with reasonable minimum requirements!
thegreyshadow: People need to understand that for many people a simple notebook PC with integrated graphics is the only general-purpose computer they will ever have, be it for work, play, leisure, surfing the Net, etc. Spending hundreds or thousands of USD just to get a PC with a discrete GPU (which, unlike their laptops, might not even be portable) is completely out of the question.
Especially now, given that:
* GPU prices are massively inflated, to the point where even a semi-decent GPU alone can cost more than an entire GPU-less system (and not even a weak one; I'm talking a decent CPU, 16+ GB RAM, a decent-sized SSD, that sort of thing).
* Being able to afford these inflated prices is not enough if you want a high-end one; there's the problem of actually finding one in stock. (At Newegg, you need to enter a lottery to win even the opportunity to purchase one, still at inflated prices and often bundled with something else you don't want.)
dtgreene: * GPU prices are massively inflated, to the point where even a semi-decent GPU alone can cost more than an entire GPU-less system (and not even a weak one; I'm talking a decent CPU, 16+ GB RAM, a decent-sized SSD, that sort of thing).
"semi-decent" is subjective - it all depends on what you're trying to achieve. My GTX970 can still play A LOT of recent-ish* games on high settings at 1440P. Second hand price (over here): 150-200 euro - Try buying a new, complete system for that.

*Depends on the game. Some games from 2020, for example, can have lower (GPU) requirements than some from 2015.
There are just too many generalizations in this topic.
Post edited June 10, 2021 by teceem